00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-v22.11" build number 1894 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3160 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.059 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/rocky9-vg-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-vg.groovy 00:00:00.059 The recommended git tool is: git 00:00:00.060 using credential 00000000-0000-0000-0000-000000000002 00:00:00.061 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/rocky9-vg-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.104 Fetching changes from the remote Git repository 00:00:00.105 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.156 Using shallow fetch with depth 1 00:00:00.156 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.156 > git --version # timeout=10 00:00:00.204 > git --version # 'git version 2.39.2' 00:00:00.204 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.239 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.239 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:08.433 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:08.449 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:08.464 Checking out Revision 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 (FETCH_HEAD) 00:00:08.464 > git config core.sparsecheckout # timeout=10 00:00:08.478 > git read-tree -mu HEAD # timeout=10 00:00:08.498 > git checkout -f 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=5 00:00:08.519 Commit message: "pool: fixes for VisualBuild class" 00:00:08.519 > git rev-list --no-walk 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=10 00:00:08.638 [Pipeline] Start of Pipeline 00:00:08.655 [Pipeline] library 00:00:08.657 Loading library shm_lib@master 00:00:08.657 Library shm_lib@master is cached. Copying from home. 00:00:08.676 [Pipeline] node 00:00:23.679 Still waiting to schedule task 00:00:23.679 Waiting for next available executor on ‘vagrant-vm-host’ 00:08:46.618 Running on VM-host-SM4 in /var/jenkins/workspace/rocky9-vg-autotest 00:08:46.620 [Pipeline] { 00:08:46.631 [Pipeline] catchError 00:08:46.633 [Pipeline] { 00:08:46.648 [Pipeline] wrap 00:08:46.658 [Pipeline] { 00:08:46.665 [Pipeline] stage 00:08:46.666 [Pipeline] { (Prologue) 00:08:46.686 [Pipeline] echo 00:08:46.687 Node: VM-host-SM4 00:08:46.691 [Pipeline] cleanWs 00:08:46.698 [WS-CLEANUP] Deleting project workspace... 00:08:46.698 [WS-CLEANUP] Deferred wipeout is used... 
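[Annotation] The checkout above pins the shared pipeline repo (jbp) to a single commit while transferring minimal history: a shallow, single-ref fetch followed by a detached checkout of FETCH_HEAD. A minimal sketch of the same sequence run by hand, with the URL taken from the log (proxy, credential, and timeout handling omitted):

# Shallow fetch of one ref, then detached checkout of the fetched commit.
git init jbp && cd jbp
git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
git fetch --tags --force --progress --depth=1 -- \
    https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master
# FETCH_HEAD^{commit} resolves to 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 in this run.
git checkout -f "$(git rev-parse 'FETCH_HEAD^{commit}')"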
00:08:46.704 [WS-CLEANUP] done 00:08:46.885 [Pipeline] setCustomBuildProperty 00:08:46.954 [Pipeline] nodesByLabel 00:08:46.956 Found a total of 2 nodes with the 'sorcerer' label 00:08:46.965 [Pipeline] httpRequest 00:08:46.969 HttpMethod: GET 00:08:46.970 URL: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:08:46.970 Sending request to url: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:08:46.972 Response Code: HTTP/1.1 200 OK 00:08:46.972 Success: Status code 200 is in the accepted range: 200,404 00:08:46.973 Saving response body to /var/jenkins/workspace/rocky9-vg-autotest/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:08:47.115 [Pipeline] sh 00:08:47.393 + tar --no-same-owner -xf jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:08:47.412 [Pipeline] httpRequest 00:08:47.416 HttpMethod: GET 00:08:47.417 URL: http://10.211.164.101/packages/spdk_e55c9a81251968acc91e4d44169353be1987a3e4.tar.gz 00:08:47.417 Sending request to url: http://10.211.164.101/packages/spdk_e55c9a81251968acc91e4d44169353be1987a3e4.tar.gz 00:08:47.418 Response Code: HTTP/1.1 200 OK 00:08:47.418 Success: Status code 200 is in the accepted range: 200,404 00:08:47.419 Saving response body to /var/jenkins/workspace/rocky9-vg-autotest/spdk_e55c9a81251968acc91e4d44169353be1987a3e4.tar.gz 00:08:49.612 [Pipeline] sh 00:08:49.903 + tar --no-same-owner -xf spdk_e55c9a81251968acc91e4d44169353be1987a3e4.tar.gz 00:08:53.199 [Pipeline] sh 00:08:53.480 + git -C spdk log --oneline -n5 00:08:53.480 e55c9a812 vbdev_error: decrement error_num atomically 00:08:53.480 f16e9f4d2 lib/event: framework_get_reactors supports getting pid and tid 00:08:53.480 2d610abe8 lib/env_dpdk: add spdk_get_tid function 00:08:53.480 f470a0dc6 event: do not call reactor events from spdk_thread context 00:08:53.480 8d3fdcaba nvmf: cleanup maximum number of subsystem namespace remanent code 00:08:53.501 [Pipeline] withCredentials 00:08:53.512 > git --version # timeout=10 00:08:53.524 > git --version # 'git version 2.39.2' 00:08:53.539 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:08:53.542 [Pipeline] { 00:08:53.552 [Pipeline] retry 00:08:53.555 [Pipeline] { 00:08:53.576 [Pipeline] sh 00:08:53.856 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:08:54.126 [Pipeline] } 00:08:54.150 [Pipeline] // retry 00:08:54.156 [Pipeline] } 00:08:54.176 [Pipeline] // withCredentials 00:08:54.189 [Pipeline] httpRequest 00:08:54.193 HttpMethod: GET 00:08:54.194 URL: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:08:54.195 Sending request to url: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:08:54.197 Response Code: HTTP/1.1 200 OK 00:08:54.198 Success: Status code 200 is in the accepted range: 200,404 00:08:54.198 Saving response body to /var/jenkins/workspace/rocky9-vg-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:08:55.430 [Pipeline] sh 00:08:55.709 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:08:57.623 [Pipeline] sh 00:08:57.908 + git -C dpdk log --oneline -n5 00:08:57.908 caf0f5d395 version: 22.11.4 00:08:57.908 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:08:57.908 dc9c799c7d vhost: fix missing spinlock unlock 00:08:57.908 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:08:57.908 6ef77f2a5e net/gve: fix RX buffer size alignment 00:08:57.927 [Pipeline] writeFile 00:08:57.945 
[Pipeline] sh 00:08:58.226 + jbp/jenkins/jjb-config/jobs/scripts/autorun_quirks.sh 00:08:58.238 [Pipeline] sh 00:08:58.518 + cat autorun-spdk.conf 00:08:58.518 SPDK_TEST_UNITTEST=1 00:08:58.518 SPDK_RUN_FUNCTIONAL_TEST=1 00:08:58.518 SPDK_TEST_BLOCKDEV=1 00:08:58.518 SPDK_TEST_DAOS=1 00:08:58.518 SPDK_RUN_ASAN=1 00:08:58.518 SPDK_TEST_USE_IGB_UIO=1 00:08:58.518 SPDK_TEST_RELEASE_BUILD=1 00:08:58.518 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:08:58.518 SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:08:58.518 SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:08:58.524 RUN_NIGHTLY=1 00:08:58.526 [Pipeline] } 00:08:58.544 [Pipeline] // stage 00:08:58.559 [Pipeline] stage 00:08:58.561 [Pipeline] { (Run VM) 00:08:58.576 [Pipeline] sh 00:08:58.858 + jbp/jenkins/jjb-config/jobs/scripts/prepare_nvme.sh 00:08:58.858 + echo 'Start stage prepare_nvme.sh' 00:08:58.858 Start stage prepare_nvme.sh 00:08:58.858 + [[ -n 1 ]] 00:08:58.858 + disk_prefix=ex1 00:08:58.858 + [[ -n /var/jenkins/workspace/rocky9-vg-autotest ]] 00:08:58.858 + [[ -e /var/jenkins/workspace/rocky9-vg-autotest/autorun-spdk.conf ]] 00:08:58.858 + source /var/jenkins/workspace/rocky9-vg-autotest/autorun-spdk.conf 00:08:58.858 ++ SPDK_TEST_UNITTEST=1 00:08:58.858 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:08:58.858 ++ SPDK_TEST_BLOCKDEV=1 00:08:58.858 ++ SPDK_TEST_DAOS=1 00:08:58.858 ++ SPDK_RUN_ASAN=1 00:08:58.858 ++ SPDK_TEST_USE_IGB_UIO=1 00:08:58.858 ++ SPDK_TEST_RELEASE_BUILD=1 00:08:58.858 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:08:58.858 ++ SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:08:58.858 ++ SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:08:58.858 ++ RUN_NIGHTLY=1 00:08:58.858 + cd /var/jenkins/workspace/rocky9-vg-autotest 00:08:58.858 + nvme_files=() 00:08:58.858 + declare -A nvme_files 00:08:58.858 + backend_dir=/var/lib/libvirt/images/backends 00:08:58.858 + nvme_files['nvme.img']=5G 00:08:58.858 + nvme_files['nvme-cmb.img']=5G 00:08:58.858 + nvme_files['nvme-multi0.img']=4G 00:08:58.858 + nvme_files['nvme-multi1.img']=4G 00:08:58.858 + nvme_files['nvme-multi2.img']=4G 00:08:58.858 + nvme_files['nvme-openstack.img']=8G 00:08:58.858 + nvme_files['nvme-zns.img']=5G 00:08:58.858 + (( SPDK_TEST_NVME_PMR == 1 )) 00:08:58.858 + (( SPDK_TEST_FTL == 1 )) 00:08:58.858 + (( SPDK_TEST_NVME_FDP == 1 )) 00:08:58.858 + [[ ! 
-d /var/lib/libvirt/images/backends ]] 00:08:58.858 + for nvme in "${!nvme_files[@]}" 00:08:58.858 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex1-nvme-multi2.img -s 4G 00:08:58.858 Formatting '/var/lib/libvirt/images/backends/ex1-nvme-multi2.img', fmt=raw size=4294967296 preallocation=falloc 00:08:58.858 + for nvme in "${!nvme_files[@]}" 00:08:58.858 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex1-nvme-cmb.img -s 5G 00:08:59.117 Formatting '/var/lib/libvirt/images/backends/ex1-nvme-cmb.img', fmt=raw size=5368709120 preallocation=falloc 00:08:59.117 + for nvme in "${!nvme_files[@]}" 00:08:59.117 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex1-nvme-openstack.img -s 8G 00:08:59.117 Formatting '/var/lib/libvirt/images/backends/ex1-nvme-openstack.img', fmt=raw size=8589934592 preallocation=falloc 00:08:59.117 + for nvme in "${!nvme_files[@]}" 00:08:59.117 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex1-nvme-zns.img -s 5G 00:09:00.056 Formatting '/var/lib/libvirt/images/backends/ex1-nvme-zns.img', fmt=raw size=5368709120 preallocation=falloc 00:09:00.056 + for nvme in "${!nvme_files[@]}" 00:09:00.056 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex1-nvme-multi1.img -s 4G 00:09:00.056 Formatting '/var/lib/libvirt/images/backends/ex1-nvme-multi1.img', fmt=raw size=4294967296 preallocation=falloc 00:09:00.056 + for nvme in "${!nvme_files[@]}" 00:09:00.056 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex1-nvme-multi0.img -s 4G 00:09:00.314 Formatting '/var/lib/libvirt/images/backends/ex1-nvme-multi0.img', fmt=raw size=4294967296 preallocation=falloc 00:09:00.314 + for nvme in "${!nvme_files[@]}" 00:09:00.314 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex1-nvme.img -s 5G 00:09:01.345 Formatting '/var/lib/libvirt/images/backends/ex1-nvme.img', fmt=raw size=5368709120 preallocation=falloc 00:09:01.345 ++ sudo grep -rl ex1-nvme.img /etc/libvirt/qemu 00:09:01.345 + echo 'End stage prepare_nvme.sh' 00:09:01.345 End stage prepare_nvme.sh 00:09:01.358 [Pipeline] sh 00:09:01.635 + DISTRO=rocky9 CPUS=10 RAM=12288 jbp/jenkins/jjb-config/jobs/scripts/vagrant_create_vm.sh 00:09:01.635 Setup: -n 10 -s 12288 -x http://proxy-dmz.intel.com:911 -p libvirt --qemu-emulator=/usr/local/qemu/vanilla-v8.0.0/bin/qemu-system-x86_64 --nic-model=e1000 -b /var/lib/libvirt/images/backends/ex1-nvme.img -H -a -v -f rocky9 00:09:01.635 00:09:01.635 DIR=/var/jenkins/workspace/rocky9-vg-autotest/spdk/scripts/vagrant 00:09:01.635 SPDK_DIR=/var/jenkins/workspace/rocky9-vg-autotest/spdk 00:09:01.635 VAGRANT_TARGET=/var/jenkins/workspace/rocky9-vg-autotest 00:09:01.635 HELP=0 00:09:01.635 DRY_RUN=0 00:09:01.635 NVME_FILE=/var/lib/libvirt/images/backends/ex1-nvme.img, 00:09:01.635 NVME_DISKS_TYPE=nvme, 00:09:01.635 NVME_AUTO_CREATE=0 00:09:01.635 NVME_DISKS_NAMESPACES=, 00:09:01.635 NVME_CMB=, 00:09:01.635 NVME_PMR=, 00:09:01.635 NVME_ZNS=, 00:09:01.635 NVME_MS=, 00:09:01.635 NVME_FDP=, 00:09:01.635 SPDK_VAGRANT_DISTRO=rocky9 00:09:01.635 SPDK_VAGRANT_VMCPU=10 00:09:01.635 SPDK_VAGRANT_VMRAM=12288 00:09:01.635 SPDK_VAGRANT_PROVIDER=libvirt 00:09:01.635 SPDK_VAGRANT_HTTP_PROXY=http://proxy-dmz.intel.com:911 00:09:01.635 SPDK_QEMU_EMULATOR=/usr/local/qemu/vanilla-v8.0.0/bin/qemu-system-x86_64 00:09:01.635 SPDK_OPENSTACK_NETWORK=0 00:09:01.635 
VAGRANT_PACKAGE_BOX=0 00:09:01.635 VAGRANTFILE=/var/jenkins/workspace/rocky9-vg-autotest/spdk/scripts/vagrant/Vagrantfile 00:09:01.635 FORCE_DISTRO=true 00:09:01.635 VAGRANT_BOX_VERSION= 00:09:01.635 EXTRA_VAGRANTFILES= 00:09:01.635 NIC_MODEL=e1000 00:09:01.635 00:09:01.635 mkdir: created directory '/var/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt' 00:09:01.635 /var/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt /var/jenkins/workspace/rocky9-vg-autotest 00:09:04.920 Bringing machine 'default' up with 'libvirt' provider... 00:09:05.491 ==> default: Creating image (snapshot of base box volume). 00:09:05.750 ==> default: Creating domain with the following settings... 00:09:05.750 ==> default: -- Name: rocky9-9.0-1711172311-2200_default_1717762288_60b97fdd93de18afcaed 00:09:05.750 ==> default: -- Domain type: kvm 00:09:05.750 ==> default: -- Cpus: 10 00:09:05.750 ==> default: -- Feature: acpi 00:09:05.750 ==> default: -- Feature: apic 00:09:05.750 ==> default: -- Feature: pae 00:09:05.750 ==> default: -- Memory: 12288M 00:09:05.750 ==> default: -- Memory Backing: hugepages: 00:09:05.750 ==> default: -- Management MAC: 00:09:05.750 ==> default: -- Loader: 00:09:05.750 ==> default: -- Nvram: 00:09:05.750 ==> default: -- Base box: spdk/rocky9 00:09:05.750 ==> default: -- Storage pool: default 00:09:05.750 ==> default: -- Image: /var/lib/libvirt/images/rocky9-9.0-1711172311-2200_default_1717762288_60b97fdd93de18afcaed.img (20G) 00:09:05.750 ==> default: -- Volume Cache: default 00:09:05.750 ==> default: -- Kernel: 00:09:05.750 ==> default: -- Initrd: 00:09:05.750 ==> default: -- Graphics Type: vnc 00:09:05.750 ==> default: -- Graphics Port: -1 00:09:05.750 ==> default: -- Graphics IP: 127.0.0.1 00:09:05.750 ==> default: -- Graphics Password: Not defined 00:09:05.750 ==> default: -- Video Type: cirrus 00:09:05.750 ==> default: -- Video VRAM: 9216 00:09:05.750 ==> default: -- Sound Type: 00:09:05.750 ==> default: -- Keymap: en-us 00:09:05.750 ==> default: -- TPM Path: 00:09:05.750 ==> default: -- INPUT: type=mouse, bus=ps2 00:09:05.750 ==> default: -- Command line args: 00:09:05.750 ==> default: -> value=-device, 00:09:05.750 ==> default: -> value=nvme,id=nvme-0,serial=12340,addr=0x10, 00:09:05.750 ==> default: -> value=-drive, 00:09:05.750 ==> default: -> value=format=raw,file=/var/lib/libvirt/images/backends/ex1-nvme.img,if=none,id=nvme-0-drive0, 00:09:05.750 ==> default: -> value=-device, 00:09:05.750 ==> default: -> value=nvme-ns,drive=nvme-0-drive0,bus=nvme-0,nsid=1,zoned=false,logical_block_size=4096,physical_block_size=4096, 00:09:06.011 ==> default: Creating shared folders metadata... 00:09:06.011 ==> default: Starting domain. 00:09:07.929 ==> default: Waiting for domain to get an IP address... 00:09:26.008 ==> default: Waiting for SSH to become available... 00:09:26.008 ==> default: Configuring and enabling network interfaces... 00:09:38.220 default: SSH address: 192.168.121.232:22 00:09:38.220 default: SSH username: vagrant 00:09:38.220 default: SSH auth method: private key 00:09:44.789 ==> default: Rsyncing folder: /mnt/jenkins_nvme/jenkins/workspace/rocky9-vg-autotest/spdk/ => /home/vagrant/spdk_repo/spdk 00:09:52.909 ==> default: Rsyncing folder: /mnt/jenkins_nvme/jenkins/workspace/rocky9-vg-autotest/dpdk/ => /home/vagrant/spdk_repo/dpdk 00:10:01.093 ==> default: Mounting SSHFS shared folder... 
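[Annotation] The domain settings above end in raw QEMU passthrough arguments: the pre-formatted ex1-nvme.img is attached as an emulated NVMe controller carrying one namespace with 4096-byte blocks. A hand-run sketch of the equivalent invocation; the -device/-drive lines are copied from the log, while the machine, CPU, and memory flags are assumptions standing in for what vagrant-libvirt generates:

# Sketch only: the guest boot disk and network devices are omitted.
/usr/local/qemu/vanilla-v8.0.0/bin/qemu-system-x86_64 \
    -machine q35,accel=kvm \
    -smp 10 -m 12288 \
    -device nvme,id=nvme-0,serial=12340,addr=0x10 \
    -drive format=raw,file=/var/lib/libvirt/images/backends/ex1-nvme.img,if=none,id=nvme-0-drive0 \
    -device nvme-ns,drive=nvme-0-drive0,bus=nvme-0,nsid=1,zoned=false,logical_block_size=4096,physical_block_size=4096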
00:10:04.374 ==> default: Mounting folder via SSHFS: /mnt/jenkins_nvme/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt/output => /home/vagrant/spdk_repo/output 00:10:04.374 ==> default: Checking Mount.. 00:10:06.273 ==> default: Folder Successfully Mounted! 00:10:06.273 ==> default: Running provisioner: file... 00:10:07.681 default: ~/.gitconfig => .gitconfig 00:10:08.247 00:10:08.247 SUCCESS! 00:10:08.247 00:10:08.247 cd to /var/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt and type "vagrant ssh" to use. 00:10:08.247 Use vagrant "suspend" and vagrant "resume" to stop and start. 00:10:08.247 Use vagrant "destroy" followed by "rm -rf /var/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt" to destroy all trace of vm. 00:10:08.247 00:10:08.256 [Pipeline] } 00:10:08.275 [Pipeline] // stage 00:10:08.283 [Pipeline] dir 00:10:08.284 Running in /var/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt 00:10:08.285 [Pipeline] { 00:10:08.301 [Pipeline] catchError 00:10:08.303 [Pipeline] { 00:10:08.318 [Pipeline] sh 00:10:08.595 + vagrant ssh-config --host vagrant 00:10:08.595 + sed -ne /^Host/,$p 00:10:08.595 + tee ssh_conf 00:10:11.917 Host vagrant 00:10:11.917 HostName 192.168.121.232 00:10:11.917 User vagrant 00:10:11.917 Port 22 00:10:11.917 UserKnownHostsFile /dev/null 00:10:11.917 StrictHostKeyChecking no 00:10:11.917 PasswordAuthentication no 00:10:11.917 IdentityFile /var/lib/libvirt/images/.vagrant.d/boxes/spdk-VAGRANTSLASH-rocky9/9.0-1711172311-2200/libvirt/rocky9 00:10:11.917 IdentitiesOnly yes 00:10:11.917 LogLevel FATAL 00:10:11.917 ForwardAgent yes 00:10:11.917 ForwardX11 yes 00:10:11.917 00:10:11.960 [Pipeline] withEnv 00:10:11.962 [Pipeline] { 00:10:11.978 [Pipeline] sh 00:10:12.257 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant #!/bin/bash 00:10:12.257 source /etc/os-release 00:10:12.257 [[ -e /image.version ]] && img=$(< /image.version) 00:10:12.257 # Minimal, systemd-like check. 00:10:12.257 if [[ -e /.dockerenv ]]; then 00:10:12.257 # Clear garbage from the node's name: 00:10:12.257 # agt-er_autotest_547-896 -> autotest_547-896 00:10:12.257 # $HOSTNAME is the actual container id 00:10:12.257 agent=$HOSTNAME@${DOCKER_SWARM_PLUGIN_JENKINS_AGENT_NAME#*_} 00:10:12.257 if grep -q "/etc/hostname" /proc/self/mountinfo; then 00:10:12.257 # We can assume this is a mount from a host where container is running, 00:10:12.257 # so fetch its hostname to easily identify the target swarm worker. 
00:10:12.257 container="$(< /etc/hostname) ($agent)" 00:10:12.257 else 00:10:12.257 # Fallback 00:10:12.257 container=$agent 00:10:12.257 fi 00:10:12.257 fi 00:10:12.257 echo "${NAME} ${VERSION_ID}|$(uname -r)|${img:-N/A}|${container:-N/A}" 00:10:12.257 00:10:12.526 [Pipeline] } 00:10:12.546 [Pipeline] // withEnv 00:10:12.556 [Pipeline] setCustomBuildProperty 00:10:12.573 [Pipeline] stage 00:10:12.575 [Pipeline] { (Tests) 00:10:12.594 [Pipeline] sh 00:10:12.872 + scp -F ssh_conf -r /var/jenkins/workspace/rocky9-vg-autotest/jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh vagrant@vagrant:./ 00:10:13.142 [Pipeline] sh 00:10:13.419 + scp -F ssh_conf -r /var/jenkins/workspace/rocky9-vg-autotest/jbp/jenkins/jjb-config/jobs/scripts/pkgdep-autoruner.sh vagrant@vagrant:./ 00:10:13.691 [Pipeline] timeout 00:10:13.692 Timeout set to expire in 1 hr 30 min 00:10:13.694 [Pipeline] { 00:10:13.709 [Pipeline] sh 00:10:13.989 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant git -C spdk_repo/spdk reset --hard 00:10:14.555 HEAD is now at e55c9a812 vbdev_error: decrement error_num atomically 00:10:14.568 [Pipeline] sh 00:10:14.846 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant sudo chown vagrant:vagrant spdk_repo 00:10:15.104 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:10:15.119 [Pipeline] sh 00:10:15.398 + scp -F ssh_conf -r /var/jenkins/workspace/rocky9-vg-autotest/autorun-spdk.conf vagrant@vagrant:spdk_repo 00:10:15.670 [Pipeline] sh 00:10:15.949 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant JOB_BASE_NAME=rocky9-vg-autotest ./autoruner.sh spdk_repo 00:10:16.208 ++ readlink -f spdk_repo 00:10:16.208 + DIR_ROOT=/home/vagrant/spdk_repo 00:10:16.208 + [[ -n /home/vagrant/spdk_repo ]] 00:10:16.208 + DIR_SPDK=/home/vagrant/spdk_repo/spdk 00:10:16.208 + DIR_OUTPUT=/home/vagrant/spdk_repo/output 00:10:16.208 + [[ -d /home/vagrant/spdk_repo/spdk ]] 00:10:16.208 + [[ ! 
-d /home/vagrant/spdk_repo/output ]] 00:10:16.208 + [[ -d /home/vagrant/spdk_repo/output ]] 00:10:16.209 + [[ rocky9-vg-autotest == pkgdep-* ]] 00:10:16.209 + cd /home/vagrant/spdk_repo 00:10:16.209 + source /etc/os-release 00:10:16.209 ++ NAME='Rocky Linux' 00:10:16.209 ++ VERSION='9.3 (Blue Onyx)' 00:10:16.209 ++ ID=rocky 00:10:16.209 ++ ID_LIKE='rhel centos fedora' 00:10:16.209 ++ VERSION_ID=9.3 00:10:16.209 ++ PLATFORM_ID=platform:el9 00:10:16.209 ++ PRETTY_NAME='Rocky Linux 9.3 (Blue Onyx)' 00:10:16.209 ++ ANSI_COLOR='0;32' 00:10:16.209 ++ LOGO=fedora-logo-icon 00:10:16.209 ++ CPE_NAME=cpe:/o:rocky:rocky:9::baseos 00:10:16.209 ++ HOME_URL=https://rockylinux.org/ 00:10:16.209 ++ BUG_REPORT_URL=https://bugs.rockylinux.org/ 00:10:16.209 ++ SUPPORT_END=2032-05-31 00:10:16.209 ++ ROCKY_SUPPORT_PRODUCT=Rocky-Linux-9 00:10:16.209 ++ ROCKY_SUPPORT_PRODUCT_VERSION=9.3 00:10:16.209 ++ REDHAT_SUPPORT_PRODUCT='Rocky Linux' 00:10:16.209 ++ REDHAT_SUPPORT_PRODUCT_VERSION=9.3 00:10:16.209 + uname -a 00:10:16.209 Linux rocky9-cloud-1711172311-2200 5.14.0-362.24.1.el9_3.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Mar 13 17:33:16 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux 00:10:16.209 + sudo /home/vagrant/spdk_repo/spdk/scripts/setup.sh status 00:10:16.209 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:10:16.467 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:10:16.467 Hugepages 00:10:16.467 node hugesize free / total 00:10:16.467 node0 1048576kB 0 / 0 00:10:16.467 node0 2048kB 0 / 0 00:10:16.467 00:10:16.467 Type BDF Vendor Device NUMA Driver Device Block devices 00:10:16.467 virtio 0000:00:03.0 1af4 1001 unknown virtio-pci - vda 00:10:16.467 NVMe 0000:00:10.0 1b36 0010 0 nvme nvme0 nvme0n1 00:10:16.467 + rm -f /tmp/spdk-ld-path 00:10:16.467 + source autorun-spdk.conf 00:10:16.467 ++ SPDK_TEST_UNITTEST=1 00:10:16.467 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:10:16.467 ++ SPDK_TEST_BLOCKDEV=1 00:10:16.467 ++ SPDK_TEST_DAOS=1 00:10:16.467 ++ SPDK_RUN_ASAN=1 00:10:16.467 ++ SPDK_TEST_USE_IGB_UIO=1 00:10:16.467 ++ SPDK_TEST_RELEASE_BUILD=1 00:10:16.467 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:10:16.467 ++ SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:10:16.467 ++ SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:10:16.467 ++ RUN_NIGHTLY=1 00:10:16.467 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:10:16.467 + [[ -n '' ]] 00:10:16.467 + sudo git config --global --add safe.directory /home/vagrant/spdk_repo/spdk 00:10:16.467 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:10:16.467 + for M in /var/spdk/build-*-manifest.txt 00:10:16.467 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:10:16.467 + cp /var/spdk/build-pkg-manifest.txt /home/vagrant/spdk_repo/output/ 00:10:16.467 + for M in /var/spdk/build-*-manifest.txt 00:10:16.467 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:10:16.467 + cp /var/spdk/build-repo-manifest.txt /home/vagrant/spdk_repo/output/ 00:10:16.467 ++ uname 00:10:16.467 + [[ Linux == \L\i\n\u\x ]] 00:10:16.467 + sudo dmesg -T 00:10:16.467 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:10:16.467 + sudo dmesg --clear 00:10:16.467 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:10:16.467 + dmesg_pid=8955 00:10:16.467 + [[ Rocky Linux == FreeBSD ]] 00:10:16.467 + sudo dmesg -Tw 00:10:16.467 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:10:16.467 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:10:16.467 + 
[[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:10:16.467 + [[ -x /usr/src/fio-static/fio ]] 00:10:16.467 + [[ '' == \/\q\e\m\u\_\v\f\i\o\/* ]] 00:10:16.467 + [[ ! -v VFIO_QEMU_BIN ]] 00:10:16.467 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:10:16.467 + vfios=(/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64) 00:10:16.467 + export 'VFIO_QEMU_BIN=/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:10:16.467 + VFIO_QEMU_BIN='/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:10:16.467 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:10:16.467 + spdk/autorun.sh /home/vagrant/spdk_repo/autorun-spdk.conf 00:10:16.467 Test configuration: 00:10:16.467 SPDK_TEST_UNITTEST=1 00:10:16.467 SPDK_RUN_FUNCTIONAL_TEST=1 00:10:16.467 SPDK_TEST_BLOCKDEV=1 00:10:16.467 SPDK_TEST_DAOS=1 00:10:16.467 SPDK_RUN_ASAN=1 00:10:16.467 SPDK_TEST_USE_IGB_UIO=1 00:10:16.467 SPDK_TEST_RELEASE_BUILD=1 00:10:16.467 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:10:16.467 SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:10:16.467 SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:10:16.725 RUN_NIGHTLY=1 12:12:40 -- common/autobuild_common.sh@15 -- $ source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:10:16.725 12:12:40 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:10:16.725 12:12:40 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:16.725 12:12:40 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:16.725 12:12:40 -- paths/export.sh@2 -- $ PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:10:16.725 12:12:40 -- paths/export.sh@3 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:10:16.725 12:12:40 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:10:16.725 12:12:40 -- paths/export.sh@5 -- $ export PATH 00:10:16.725 12:12:40 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:10:16.725 12:12:40 -- common/autobuild_common.sh@436 -- $ out=/home/vagrant/spdk_repo/spdk/../output 00:10:16.725 12:12:40 -- common/autobuild_common.sh@437 -- $ date +%s 00:10:16.725 12:12:40 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1717762360.XXXXXX 00:10:16.725 12:12:40 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1717762360.Qhy7tS 00:10:16.725 12:12:40 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:10:16.725 12:12:40 -- common/autobuild_common.sh@443 -- $ '[' -n v22.11.4 ']' 00:10:16.725 12:12:40 -- 
common/autobuild_common.sh@444 -- $ dirname /home/vagrant/spdk_repo/dpdk/build 00:10:16.725 12:12:40 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude=' --exclude /home/vagrant/spdk_repo/dpdk' 00:10:16.725 12:12:40 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /home/vagrant/spdk_repo/spdk/xnvme --exclude /tmp' 00:10:16.725 12:12:40 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /home/vagrant/spdk_repo/spdk/../output/scan-build-tmp --exclude /home/vagrant/spdk_repo/dpdk --exclude /home/vagrant/spdk_repo/spdk/xnvme --exclude /tmp --status-bugs' 00:10:16.725 12:12:40 -- common/autobuild_common.sh@453 -- $ get_config_params 00:10:16.725 12:12:40 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:10:16.725 12:12:40 -- common/autotest_common.sh@10 -- $ set +x 00:10:16.725 12:12:40 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --enable-asan --enable-coverage --with-dpdk=/home/vagrant/spdk_repo/dpdk/build' 00:10:16.725 12:12:40 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:10:16.725 12:12:40 -- pm/common@17 -- $ local monitor 00:10:16.725 12:12:40 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:16.725 12:12:40 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:16.725 12:12:40 -- pm/common@21 -- $ date +%s 00:10:16.725 12:12:40 -- pm/common@25 -- $ sleep 1 00:10:16.725 12:12:40 -- pm/common@21 -- $ date +%s 00:10:16.725 12:12:40 -- pm/common@21 -- $ /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-cpu-load -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autobuild.sh.1717762360 00:10:16.725 12:12:40 -- pm/common@21 -- $ /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-vmstat -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autobuild.sh.1717762360 00:10:16.725 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autobuild.sh.1717762360_collect-vmstat.pm.log 00:10:16.725 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autobuild.sh.1717762360_collect-cpu-load.pm.log 00:10:17.699 12:12:41 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:10:17.699 12:12:41 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:10:17.699 12:12:41 -- spdk/autobuild.sh@12 -- $ umask 022 00:10:17.699 12:12:41 -- spdk/autobuild.sh@13 -- $ cd /home/vagrant/spdk_repo/spdk 00:10:17.699 12:12:41 -- spdk/autobuild.sh@16 -- $ date -u 00:10:17.699 Fri Jun 7 12:12:41 UTC 2024 00:10:17.699 12:12:41 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:10:17.699 v24.09-pre-53-ge55c9a812 00:10:17.699 12:12:41 -- spdk/autobuild.sh@19 -- $ '[' 1 -eq 1 ']' 00:10:17.699 12:12:41 -- spdk/autobuild.sh@20 -- $ run_test asan echo 'using asan' 00:10:17.699 12:12:41 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']' 00:10:17.699 12:12:41 -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:10:17.699 12:12:41 -- common/autotest_common.sh@10 -- $ set +x 00:10:17.699 ************************************ 00:10:17.699 START TEST asan 00:10:17.699 ************************************ 00:10:17.699 using asan 00:10:17.699 12:12:41 asan -- common/autotest_common.sh@1124 -- $ echo 'using asan' 00:10:17.699 00:10:17.699 real 0m0.000s 00:10:17.699 user 0m0.000s 00:10:17.699 sys 0m0.000s 00:10:17.699 12:12:41 asan -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:10:17.699 12:12:41 asan -- common/autotest_common.sh@10 -- $ set +x 
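[Annotation] The two "Redirecting to ..." lines above come from start_monitor_resources: before the build, collect-cpu-load and collect-vmstat are launched in the background, logging under a shared timestamp-suffixed name, and the `trap stop_monitor_resources EXIT` traced just below tears them down when autobuild exits. A reduced sketch of the pattern (the kill-on-exit used here is an assumption; the real helpers track pidfiles under the output directory):

# Start resource monitors in the background, stop them when the script exits.
ts=$(date +%s)
out=/home/vagrant/spdk_repo/spdk/../output/power
scripts/perf/pm/collect-cpu-load -d "$out" -l -p "monitor.autobuild.sh.$ts" &
scripts/perf/pm/collect-vmstat   -d "$out" -l -p "monitor.autobuild.sh.$ts" &
trap 'kill $(jobs -p) 2>/dev/null' EXIT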
00:10:17.699 ************************************ 00:10:17.699 END TEST asan 00:10:17.699 ************************************ 00:10:17.699 12:12:41 -- spdk/autobuild.sh@23 -- $ '[' 0 -eq 1 ']' 00:10:17.699 12:12:41 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:10:17.699 12:12:41 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:10:17.699 12:12:41 -- common/autobuild_common.sh@429 -- $ run_test build_native_dpdk _build_native_dpdk 00:10:17.699 12:12:41 -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:10:17.699 12:12:41 -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:10:17.699 12:12:41 -- common/autotest_common.sh@10 -- $ set +x 00:10:17.699 ************************************ 00:10:17.699 START TEST build_native_dpdk 00:10:17.699 ************************************ 00:10:17.699 12:12:41 build_native_dpdk -- common/autotest_common.sh@1124 -- $ _build_native_dpdk 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=11 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=11 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/home/vagrant/spdk_repo/dpdk/build 00:10:17.699 12:12:41 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /home/vagrant/spdk_repo/dpdk/build 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/home/vagrant/spdk_repo/dpdk 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /home/vagrant/spdk_repo/dpdk ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/home/vagrant/spdk_repo/spdk 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /home/vagrant/spdk_repo/dpdk log --oneline -n 5 00:10:17.958 caf0f5d395 version: 22.11.4 00:10:17.958 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:10:17.958 dc9c799c7d vhost: fix missing spinlock unlock 00:10:17.958 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:10:17.958 6ef77f2a5e net/gve: fix RX buffer size alignment 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 11 -ge 5 ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 11 -ge 10 ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /home/vagrant/spdk_repo/dpdk 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@370 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=<' 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@342 -- $ : 1 00:10:17.958 12:12:41 
build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 22 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@350 -- $ local d=22 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@352 -- $ echo 22 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=22 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:10:17.958 12:12:41 build_native_dpdk -- scripts/common.sh@364 -- $ return 1 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:10:17.958 patching file config/rte_config.h 00:10:17.958 Hunk #1 succeeded at 60 (offset 1 line). 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@178 -- $ uname -s 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']' 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:10:17.958 12:12:41 build_native_dpdk -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/home/vagrant/spdk_repo/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:10:24.541 The Meson build system 00:10:24.541 Version: 1.4.0 00:10:24.541 Source dir: /home/vagrant/spdk_repo/dpdk 00:10:24.541 Build dir: /home/vagrant/spdk_repo/dpdk/build-tmp 00:10:24.541 Build type: native build 00:10:24.541 Program cat found: YES (/bin/cat) 00:10:24.541 Project name: DPDK 00:10:24.541 Project version: 22.11.4 00:10:24.541 C compiler for the host machine: gcc (gcc 11.4.1 "gcc (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)") 00:10:24.541 C linker for the host machine: gcc ld.bfd 2.35.2-42 00:10:24.541 Host machine cpu family: x86_64 00:10:24.541 Host machine cpu: x86_64 00:10:24.541 Message: ## Building in Developer Mode ## 00:10:24.541 Program pkg-config found: YES (/bin/pkg-config) 00:10:24.541 Program check-symbols.sh found: YES (/home/vagrant/spdk_repo/dpdk/buildtools/check-symbols.sh) 00:10:24.541 Program options-ibverbs-static.sh found: YES (/home/vagrant/spdk_repo/dpdk/buildtools/options-ibverbs-static.sh) 00:10:24.541 Program objdump found: YES (/bin/objdump) 00:10:24.541 Program python3 found: YES (/usr/bin/python3) 00:10:24.541 Program cat found: YES (/bin/cat) 00:10:24.541 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
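[Annotation] In the lt/cmp_versions trace above, version strings are split on '.', '-' and ':' (IFS=.-:) into arrays and compared numerically component by component, returning at the first difference; here ver1[0]=22 is greater than ver2[0]=21, so 22.11.4 is not older than 21.11.0 and the helper returns 1, matching the trace. A standalone sketch of the same comparison (assumes purely numeric components):

# version_lt A B: succeeds (returns 0) when A is strictly older than B.
version_lt() {
    local IFS=.-:
    local -a v1 v2
    read -ra v1 <<< "$1"
    read -ra v2 <<< "$2"
    local i
    for (( i = 0; i < ${#v1[@]} || i < ${#v2[@]}; i++ )); do
        (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0
        (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
    done
    return 1   # equal versions are not "less than"
}
version_lt 22.11.4 21.11.0 && echo older || echo 'not older'   # prints: not older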
00:10:24.541 Checking for size of "void *" : 8 00:10:24.541 Checking for size of "void *" : 8 (cached) 00:10:24.541 Library m found: YES 00:10:24.541 Library numa found: YES 00:10:24.541 Has header "numaif.h" : YES 00:10:24.541 Library fdt found: NO 00:10:24.541 Library execinfo found: NO 00:10:24.541 Has header "execinfo.h" : YES 00:10:24.541 Found pkg-config: YES (/bin/pkg-config) 1.7.3 00:10:24.541 Run-time dependency libarchive found: NO (tried pkgconfig) 00:10:24.541 Run-time dependency libbsd found: NO (tried pkgconfig) 00:10:24.541 Run-time dependency jansson found: NO (tried pkgconfig) 00:10:24.541 Run-time dependency openssl found: YES 3.0.7 00:10:24.541 Run-time dependency libpcap found: NO (tried pkgconfig) 00:10:24.541 Library pcap found: NO 00:10:24.541 Compiler for C supports arguments -Wcast-qual: YES 00:10:24.541 Compiler for C supports arguments -Wdeprecated: YES 00:10:24.541 Compiler for C supports arguments -Wformat: YES 00:10:24.541 Compiler for C supports arguments -Wformat-nonliteral: NO 00:10:24.541 Compiler for C supports arguments -Wformat-security: NO 00:10:24.541 Compiler for C supports arguments -Wmissing-declarations: YES 00:10:24.541 Compiler for C supports arguments -Wmissing-prototypes: YES 00:10:24.541 Compiler for C supports arguments -Wnested-externs: YES 00:10:24.541 Compiler for C supports arguments -Wold-style-definition: YES 00:10:24.541 Compiler for C supports arguments -Wpointer-arith: YES 00:10:24.541 Compiler for C supports arguments -Wsign-compare: YES 00:10:24.541 Compiler for C supports arguments -Wstrict-prototypes: YES 00:10:24.541 Compiler for C supports arguments -Wundef: YES 00:10:24.541 Compiler for C supports arguments -Wwrite-strings: YES 00:10:24.541 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:10:24.541 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:10:24.541 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:10:24.541 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:10:24.541 Compiler for C supports arguments -mavx512f: YES 00:10:24.541 Checking if "AVX512 checking" compiles: YES 00:10:24.541 Fetching value of define "__SSE4_2__" : 1 00:10:24.541 Fetching value of define "__AES__" : 1 00:10:24.541 Fetching value of define "__AVX__" : 1 00:10:24.541 Fetching value of define "__AVX2__" : 1 00:10:24.541 Fetching value of define "__AVX512BW__" : 1 00:10:24.541 Fetching value of define "__AVX512CD__" : 1 00:10:24.541 Fetching value of define "__AVX512DQ__" : 1 00:10:24.541 Fetching value of define "__AVX512F__" : 1 00:10:24.541 Fetching value of define "__AVX512VL__" : 1 00:10:24.541 Fetching value of define "__PCLMUL__" : 1 00:10:24.541 Fetching value of define "__RDRND__" : 1 00:10:24.541 Fetching value of define "__RDSEED__" : 1 00:10:24.541 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:10:24.541 Compiler for C supports arguments -Wno-format-truncation: YES 00:10:24.541 Message: lib/kvargs: Defining dependency "kvargs" 00:10:24.541 Message: lib/telemetry: Defining dependency "telemetry" 00:10:24.541 Checking for function "getentropy" : YES 00:10:24.541 Message: lib/eal: Defining dependency "eal" 00:10:24.541 Message: lib/ring: Defining dependency "ring" 00:10:24.541 Message: lib/rcu: Defining dependency "rcu" 00:10:24.541 Message: lib/mempool: Defining dependency "mempool" 00:10:24.541 Message: lib/mbuf: Defining dependency "mbuf" 00:10:24.541 Fetching value of define "__PCLMUL__" : 1 (cached) 00:10:24.541 Fetching value of define 
"__AVX512F__" : 1 (cached) 00:10:24.541 Fetching value of define "__AVX512BW__" : 1 (cached) 00:10:24.541 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:10:24.541 Fetching value of define "__AVX512VL__" : 1 (cached) 00:10:24.541 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:10:24.541 Compiler for C supports arguments -mpclmul: YES 00:10:24.541 Compiler for C supports arguments -maes: YES 00:10:24.541 Compiler for C supports arguments -mavx512f: YES (cached) 00:10:24.541 Compiler for C supports arguments -mavx512bw: YES 00:10:24.541 Compiler for C supports arguments -mavx512dq: YES 00:10:24.541 Compiler for C supports arguments -mavx512vl: YES 00:10:24.541 Compiler for C supports arguments -mvpclmulqdq: YES 00:10:24.541 Compiler for C supports arguments -mavx2: YES 00:10:24.541 Compiler for C supports arguments -mavx: YES 00:10:24.541 Message: lib/net: Defining dependency "net" 00:10:24.541 Message: lib/meter: Defining dependency "meter" 00:10:24.541 Message: lib/ethdev: Defining dependency "ethdev" 00:10:24.541 Message: lib/pci: Defining dependency "pci" 00:10:24.541 Message: lib/cmdline: Defining dependency "cmdline" 00:10:24.541 Message: lib/metrics: Defining dependency "metrics" 00:10:24.541 Message: lib/hash: Defining dependency "hash" 00:10:24.541 Message: lib/timer: Defining dependency "timer" 00:10:24.541 Fetching value of define "__AVX2__" : 1 (cached) 00:10:24.541 Fetching value of define "__AVX512F__" : 1 (cached) 00:10:24.541 Fetching value of define "__AVX512VL__" : 1 (cached) 00:10:24.542 Fetching value of define "__AVX512CD__" : 1 (cached) 00:10:24.542 Fetching value of define "__AVX512BW__" : 1 (cached) 00:10:24.542 Message: lib/acl: Defining dependency "acl" 00:10:24.542 Message: lib/bbdev: Defining dependency "bbdev" 00:10:24.542 Message: lib/bitratestats: Defining dependency "bitratestats" 00:10:24.542 Run-time dependency libelf found: YES 0.189 00:10:24.542 lib/bpf/meson.build:43: WARNING: libpcap is missing, rte_bpf_convert API will be disabled 00:10:24.542 Message: lib/bpf: Defining dependency "bpf" 00:10:24.542 Message: lib/cfgfile: Defining dependency "cfgfile" 00:10:24.542 Message: lib/compressdev: Defining dependency "compressdev" 00:10:24.542 Message: lib/cryptodev: Defining dependency "cryptodev" 00:10:24.542 Message: lib/distributor: Defining dependency "distributor" 00:10:24.542 Message: lib/efd: Defining dependency "efd" 00:10:24.542 Message: lib/eventdev: Defining dependency "eventdev" 00:10:24.542 Message: lib/gpudev: Defining dependency "gpudev" 00:10:24.542 Message: lib/gro: Defining dependency "gro" 00:10:24.542 Message: lib/gso: Defining dependency "gso" 00:10:24.542 Message: lib/ip_frag: Defining dependency "ip_frag" 00:10:24.542 Message: lib/jobstats: Defining dependency "jobstats" 00:10:24.542 Message: lib/latencystats: Defining dependency "latencystats" 00:10:24.542 Message: lib/lpm: Defining dependency "lpm" 00:10:24.542 Fetching value of define "__AVX512F__" : 1 (cached) 00:10:24.542 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:10:24.542 Fetching value of define "__AVX512IFMA__" : (undefined) 00:10:24.542 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:10:24.542 Message: lib/member: Defining dependency "member" 00:10:24.542 Message: lib/pcapng: Defining dependency "pcapng" 00:10:24.542 Compiler for C supports arguments -Wno-cast-qual: YES 00:10:24.542 Message: lib/power: Defining dependency "power" 00:10:24.542 Message: lib/rawdev: Defining dependency "rawdev" 00:10:24.542 
Message: lib/regexdev: Defining dependency "regexdev" 00:10:24.542 Message: lib/dmadev: Defining dependency "dmadev" 00:10:24.542 Message: lib/rib: Defining dependency "rib" 00:10:24.542 Message: lib/reorder: Defining dependency "reorder" 00:10:24.542 Message: lib/sched: Defining dependency "sched" 00:10:24.542 Message: lib/security: Defining dependency "security" 00:10:24.542 Message: lib/stack: Defining dependency "stack" 00:10:24.542 Has header "linux/userfaultfd.h" : YES 00:10:24.542 Message: lib/vhost: Defining dependency "vhost" 00:10:24.542 Message: lib/ipsec: Defining dependency "ipsec" 00:10:24.542 Fetching value of define "__AVX512F__" : 1 (cached) 00:10:24.542 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:10:24.542 Fetching value of define "__AVX512BW__" : 1 (cached) 00:10:24.542 Message: lib/fib: Defining dependency "fib" 00:10:24.542 Message: lib/port: Defining dependency "port" 00:10:24.542 Message: lib/pdump: Defining dependency "pdump" 00:10:24.542 Message: lib/table: Defining dependency "table" 00:10:24.542 Message: lib/pipeline: Defining dependency "pipeline" 00:10:24.542 Message: lib/graph: Defining dependency "graph" 00:10:24.542 Message: lib/node: Defining dependency "node" 00:10:24.542 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:10:24.542 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:10:24.542 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:10:24.542 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:10:24.542 Compiler for C supports arguments -Wno-sign-compare: YES 00:10:24.542 Compiler for C supports arguments -Wno-unused-value: YES 00:10:24.542 Compiler for C supports arguments -Wno-format: YES 00:10:24.542 Compiler for C supports arguments -Wno-format-security: YES 00:10:24.542 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:10:25.956 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:10:25.956 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:10:25.956 Compiler for C supports arguments -Wno-unused-parameter: YES 00:10:25.956 Fetching value of define "__AVX2__" : 1 (cached) 00:10:25.956 Fetching value of define "__AVX512F__" : 1 (cached) 00:10:25.956 Fetching value of define "__AVX512BW__" : 1 (cached) 00:10:25.956 Compiler for C supports arguments -mavx512f: YES (cached) 00:10:25.956 Compiler for C supports arguments -mavx512bw: YES (cached) 00:10:25.956 Compiler for C supports arguments -march=skylake-avx512: YES 00:10:25.956 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:10:25.956 Program doxygen found: YES (/bin/doxygen) 00:10:25.956 Configuring doxy-api.conf using configuration 00:10:25.956 Program sphinx-build found: NO 00:10:25.956 Configuring rte_build_config.h using configuration 00:10:25.956 Message: 00:10:25.956 ================= 00:10:25.956 Applications Enabled 00:10:25.956 ================= 00:10:25.956 00:10:25.956 apps: 00:10:25.956 pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, test-eventdev, 00:10:25.956 test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, test-security-perf, 00:10:25.956 00:10:25.956 00:10:25.956 Message: 00:10:25.956 ================= 00:10:25.956 Libraries Enabled 00:10:25.956 ================= 00:10:25.956 00:10:25.956 libs: 00:10:25.956 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:10:25.956 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:10:25.956 bbdev, bitratestats, bpf, 
cfgfile, compressdev, cryptodev, distributor, efd,
00:10:25.956 eventdev, gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm,
00:10:25.956 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder,
00:10:25.956 sched, security, stack, vhost, ipsec, fib, port, pdump,
00:10:25.956 table, pipeline, graph, node,
00:10:25.956 
00:10:25.956 Message:
00:10:25.956 ===============
00:10:25.956 Drivers Enabled
00:10:25.956 ===============
00:10:25.956 
00:10:25.956 common:
00:10:25.956 
00:10:25.956 bus:
00:10:25.956 pci, vdev,
00:10:25.956 mempool:
00:10:25.956 ring,
00:10:25.956 dma:
00:10:25.956 
00:10:25.956 net:
00:10:25.956 i40e,
00:10:25.956 raw:
00:10:25.956 
00:10:25.956 crypto:
00:10:25.956 
00:10:25.956 compress:
00:10:25.956 
00:10:25.956 regex:
00:10:25.956 
00:10:25.956 vdpa:
00:10:25.956 
00:10:25.956 event:
00:10:25.956 
00:10:25.956 baseband:
00:10:25.956 
00:10:25.956 gpu:
00:10:25.956 
00:10:25.956 
00:10:25.956 Message:
00:10:25.956 =================
00:10:25.956 Content Skipped
00:10:25.956 =================
00:10:25.956 
00:10:25.956 apps:
00:10:25.957 dumpcap: missing dependency, "libpcap"
00:10:25.957 
00:10:25.957 libs:
00:10:25.957 kni: explicitly disabled via build config (deprecated lib)
00:10:25.957 flow_classify: explicitly disabled via build config (deprecated lib)
00:10:25.957 
00:10:25.957 drivers:
00:10:25.957 common/cpt: not in enabled drivers build config
00:10:25.957 common/dpaax: not in enabled drivers build config
00:10:25.957 common/iavf: not in enabled drivers build config
00:10:25.957 common/idpf: not in enabled drivers build config
00:10:25.957 common/mvep: not in enabled drivers build config
00:10:25.957 common/octeontx: not in enabled drivers build config
00:10:25.957 bus/auxiliary: not in enabled drivers build config
00:10:25.957 bus/dpaa: not in enabled drivers build config
00:10:25.957 bus/fslmc: not in enabled drivers build config
00:10:25.957 bus/ifpga: not in enabled drivers build config
00:10:25.957 bus/vmbus: not in enabled drivers build config
00:10:25.957 common/cnxk: not in enabled drivers build config
00:10:25.957 common/mlx5: not in enabled drivers build config
00:10:25.957 common/qat: not in enabled drivers build config
00:10:25.957 common/sfc_efx: not in enabled drivers build config
00:10:25.957 mempool/bucket: not in enabled drivers build config
00:10:25.957 mempool/cnxk: not in enabled drivers build config
00:10:25.957 mempool/dpaa: not in enabled drivers build config
00:10:25.957 mempool/dpaa2: not in enabled drivers build config
00:10:25.957 mempool/octeontx: not in enabled drivers build config
00:10:25.957 mempool/stack: not in enabled drivers build config
00:10:25.957 dma/cnxk: not in enabled drivers build config
00:10:25.957 dma/dpaa: not in enabled drivers build config
00:10:25.957 dma/dpaa2: not in enabled drivers build config
00:10:25.957 dma/hisilicon: not in enabled drivers build config
00:10:25.957 dma/idxd: not in enabled drivers build config
00:10:25.957 dma/ioat: not in enabled drivers build config
00:10:25.957 dma/skeleton: not in enabled drivers build config
00:10:25.957 net/af_packet: not in enabled drivers build config
00:10:25.957 net/af_xdp: not in enabled drivers build config
00:10:25.957 net/ark: not in enabled drivers build config
00:10:25.957 net/atlantic: not in enabled drivers build config
00:10:25.957 net/avp: not in enabled drivers build config
00:10:25.957 net/axgbe: not in enabled drivers build config
00:10:25.957 net/bnx2x: not in enabled drivers build config
00:10:25.957 net/bnxt: not in enabled drivers build config
00:10:25.957 net/bonding: not in enabled drivers build config
00:10:25.957 net/cnxk: not in enabled drivers build config
00:10:25.957 net/cxgbe: not in enabled drivers build config
00:10:25.957 net/dpaa: not in enabled drivers build config
00:10:25.957 net/dpaa2: not in enabled drivers build config
00:10:25.957 net/e1000: not in enabled drivers build config
00:10:25.957 net/ena: not in enabled drivers build config
00:10:25.957 net/enetc: not in enabled drivers build config
00:10:25.957 net/enetfec: not in enabled drivers build config
00:10:25.957 net/enic: not in enabled drivers build config
00:10:25.957 net/failsafe: not in enabled drivers build config
00:10:25.957 net/fm10k: not in enabled drivers build config
00:10:25.957 net/gve: not in enabled drivers build config
00:10:25.957 net/hinic: not in enabled drivers build config
00:10:25.957 net/hns3: not in enabled drivers build config
00:10:25.957 net/iavf: not in enabled drivers build config
00:10:25.957 net/ice: not in enabled drivers build config
00:10:25.957 net/idpf: not in enabled drivers build config
00:10:25.957 net/igc: not in enabled drivers build config
00:10:25.957 net/ionic: not in enabled drivers build config
00:10:25.957 net/ipn3ke: not in enabled drivers build config
00:10:25.957 net/ixgbe: not in enabled drivers build config
00:10:25.957 net/kni: not in enabled drivers build config
00:10:25.957 net/liquidio: not in enabled drivers build config
00:10:25.957 net/mana: not in enabled drivers build config
00:10:25.957 net/memif: not in enabled drivers build config
00:10:25.957 net/mlx4: not in enabled drivers build config
00:10:25.957 net/mlx5: not in enabled drivers build config
00:10:25.957 net/mvneta: not in enabled drivers build config
00:10:25.957 net/mvpp2: not in enabled drivers build config
00:10:25.957 net/netvsc: not in enabled drivers build config
00:10:25.957 net/nfb: not in enabled drivers build config
00:10:25.957 net/nfp: not in enabled drivers build config
00:10:25.957 net/ngbe: not in enabled drivers build config
00:10:25.957 net/null: not in enabled drivers build config
00:10:25.957 net/octeontx: not in enabled drivers build config
00:10:25.957 net/octeon_ep: not in enabled drivers build config
00:10:25.957 net/pcap: not in enabled drivers build config
00:10:25.957 net/pfe: not in enabled drivers build config
00:10:25.957 net/qede: not in enabled drivers build config
00:10:25.957 net/ring: not in enabled drivers build config
00:10:25.957 net/sfc: not in enabled drivers build config
00:10:25.957 net/softnic: not in enabled drivers build config
00:10:25.957 net/tap: not in enabled drivers build config
00:10:25.957 net/thunderx: not in enabled drivers build config
00:10:25.957 net/txgbe: not in enabled drivers build config
00:10:25.957 net/vdev_netvsc: not in enabled drivers build config
00:10:25.957 net/vhost: not in enabled drivers build config
00:10:25.957 net/virtio: not in enabled drivers build config
00:10:25.957 net/vmxnet3: not in enabled drivers build config
00:10:25.957 raw/cnxk_bphy: not in enabled drivers build config
00:10:25.957 raw/cnxk_gpio: not in enabled drivers build config
00:10:25.957 raw/dpaa2_cmdif: not in enabled drivers build config
00:10:25.957 raw/ifpga: not in enabled drivers build config
00:10:25.957 raw/ntb: not in enabled drivers build config
00:10:25.957 raw/skeleton: not in enabled drivers build config
00:10:25.957 crypto/armv8: not in enabled drivers build config
00:10:25.957 crypto/bcmfs: not in enabled drivers build config
00:10:25.957 crypto/caam_jr: not in enabled drivers build config
00:10:25.957 crypto/ccp: not in enabled drivers build config
00:10:25.957 crypto/cnxk: not in enabled drivers build config
00:10:25.957 crypto/dpaa_sec: not in enabled drivers build config
00:10:25.957 crypto/dpaa2_sec: not in enabled drivers build config
00:10:25.957 crypto/ipsec_mb: not in enabled drivers build config
00:10:25.957 crypto/mlx5: not in enabled drivers build config
00:10:25.957 crypto/mvsam: not in enabled drivers build config
00:10:25.957 crypto/nitrox: not in enabled drivers build config
00:10:25.957 crypto/null: not in enabled drivers build config
00:10:25.957 crypto/octeontx: not in enabled drivers build config
00:10:25.957 crypto/openssl: not in enabled drivers build config
00:10:25.957 crypto/scheduler: not in enabled drivers build config
00:10:25.957 crypto/uadk: not in enabled drivers build config
00:10:25.957 crypto/virtio: not in enabled drivers build config
00:10:25.957 compress/isal: not in enabled drivers build config
00:10:25.957 compress/mlx5: not in enabled drivers build config
00:10:25.957 compress/octeontx: not in enabled drivers build config
00:10:25.957 compress/zlib: not in enabled drivers build config
00:10:25.957 regex/mlx5: not in enabled drivers build config
00:10:25.957 regex/cn9k: not in enabled drivers build config
00:10:25.957 vdpa/ifc: not in enabled drivers build config
00:10:25.957 vdpa/mlx5: not in enabled drivers build config
00:10:25.957 vdpa/sfc: not in enabled drivers build config
00:10:25.957 event/cnxk: not in enabled drivers build config
00:10:25.957 event/dlb2: not in enabled drivers build config
00:10:25.957 event/dpaa: not in enabled drivers build config
00:10:25.957 event/dpaa2: not in enabled drivers build config
00:10:25.957 event/dsw: not in enabled drivers build config
00:10:25.957 event/opdl: not in enabled drivers build config
00:10:25.957 event/skeleton: not in enabled drivers build config
00:10:25.957 event/sw: not in enabled drivers build config
00:10:25.957 event/octeontx: not in enabled drivers build config
00:10:25.957 baseband/acc: not in enabled drivers build config
00:10:25.957 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:10:25.957 baseband/fpga_lte_fec: not in enabled drivers build config
00:10:25.957 baseband/la12xx: not in enabled drivers build config
00:10:25.958 baseband/null: not in enabled drivers build config
00:10:25.958 baseband/turbo_sw: not in enabled drivers build config
00:10:25.958 gpu/cuda: not in enabled drivers build config
00:10:25.958 
00:10:25.958 
00:10:25.958 Build targets in project: 310
00:10:25.958 
00:10:25.958 DPDK 22.11.4
00:10:25.958 
00:10:25.958 User defined options
00:10:25.958 libdir : lib
00:10:25.958 prefix : /home/vagrant/spdk_repo/dpdk/build
00:10:25.958 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:10:25.958 c_link_args :
00:10:25.958 enable_docs : false
00:10:25.958 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:10:25.958 enable_kmods : false
00:10:25.958 machine : native
00:10:25.958 tests : false
00:10:25.958 
00:10:25.958 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:10:25.958 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
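
The meson invocation that produced this summary is not itself captured in the excerpt. Reconstructed from the "User defined options" recorded above, an equivalent explicit command would look roughly like the following sketch (not the script's literal command line; the WARNING above shows the script in fact ran `meson [options]` without the `setup` subcommand):

  meson setup build-tmp \
    --prefix=/home/vagrant/spdk_repo/dpdk/build \
    --libdir=lib \
    -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
    -Denable_docs=false \
    -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base \
    -Denable_kmods=false \
    -Dmachine=native \
    -Dtests=false

The enable_drivers value is also what produced the long "Content Skipped" section: with that option set, every driver not named in the list is reported as "not in enabled drivers build config" and excluded from the 310 build targets.
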
00:10:26.214 12:12:49 build_native_dpdk -- common/autobuild_common.sh@186 -- $ ninja -C /home/vagrant/spdk_repo/dpdk/build-tmp -j10 00:10:26.214 ninja: Entering directory `/home/vagrant/spdk_repo/dpdk/build-tmp' 00:10:26.214 [1/737] Generating lib/rte_telemetry_def with a custom command 00:10:26.214 [2/737] Generating lib/rte_kvargs_def with a custom command 00:10:26.214 [3/737] Generating lib/rte_kvargs_mingw with a custom command 00:10:26.214 [4/737] Generating lib/rte_telemetry_mingw with a custom command 00:10:26.214 [5/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:10:26.471 [6/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:10:26.471 [7/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:10:26.471 [8/737] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:10:26.471 [9/737] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:10:26.471 [10/737] Linking static target lib/librte_kvargs.a 00:10:26.471 [11/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:10:26.471 [12/737] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:10:26.471 [13/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:10:26.471 [14/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:10:26.471 [15/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:10:26.471 [16/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:10:26.728 [17/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:10:26.728 [18/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:10:26.728 [19/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:10:26.728 [20/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:10:26.728 [21/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:10:26.728 [22/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:10:26.728 [23/737] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:10:26.728 [24/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:10:26.985 [25/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:10:26.985 [26/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:10:26.985 [27/737] Linking target lib/librte_kvargs.so.23.0 00:10:26.985 [28/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:10:26.985 [29/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:10:26.985 [30/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:10:26.985 [31/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:10:26.985 [32/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:10:27.242 [33/737] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:10:27.242 [34/737] Linking static target lib/librte_telemetry.a 00:10:27.242 [35/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:10:27.242 [36/737] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:10:27.242 [37/737] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:10:27.242 [38/737] Compiling C object 
lib/librte_eal.a.p/eal_common_rte_version.c.o 00:10:27.242 [39/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:10:27.242 [40/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:10:27.242 [41/737] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:10:27.242 [42/737] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:10:27.499 [43/737] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:10:27.499 [44/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:10:27.499 [45/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:10:27.499 [46/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:10:27.499 [47/737] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:10:27.756 [48/737] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:10:27.756 [49/737] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:10:27.756 [50/737] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:10:27.756 [51/737] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:10:27.756 [52/737] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:10:27.756 [53/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:10:27.756 [54/737] Linking target lib/librte_telemetry.so.23.0 00:10:27.756 [55/737] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:10:27.756 [56/737] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:10:27.756 [57/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:10:27.756 [58/737] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:10:27.756 [59/737] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:10:27.756 [60/737] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:10:27.756 [61/737] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:10:27.756 [62/737] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:10:27.756 [63/737] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:10:27.756 [64/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:10:27.756 [65/737] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:10:27.756 [66/737] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:10:28.013 [67/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:10:28.013 [68/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:10:28.013 [69/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:10:28.013 [70/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:10:28.013 [71/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:10:28.013 [72/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:10:28.013 [73/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:10:28.013 [74/737] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:10:28.013 [75/737] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:10:28.013 [76/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:10:28.013 [77/737] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:10:28.013 [78/737] Generating 
lib/rte_eal_def with a custom command 00:10:28.013 [79/737] Generating lib/rte_eal_mingw with a custom command 00:10:28.013 [80/737] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:10:28.270 [81/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:10:28.270 [82/737] Generating lib/rte_ring_def with a custom command 00:10:28.270 [83/737] Generating lib/rte_rcu_def with a custom command 00:10:28.270 [84/737] Generating lib/rte_ring_mingw with a custom command 00:10:28.270 [85/737] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:10:28.270 [86/737] Generating lib/rte_rcu_mingw with a custom command 00:10:28.270 [87/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:10:28.270 [88/737] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:10:28.270 [89/737] Linking static target lib/librte_ring.a 00:10:28.270 [90/737] Generating lib/rte_mempool_def with a custom command 00:10:28.270 [91/737] Generating lib/rte_mempool_mingw with a custom command 00:10:28.270 [92/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:10:28.529 [93/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:10:28.529 [94/737] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:10:28.529 [95/737] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:10:28.529 [96/737] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:10:28.529 [97/737] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:10:28.529 [98/737] Generating lib/rte_mbuf_def with a custom command 00:10:28.788 [99/737] Generating lib/rte_mbuf_mingw with a custom command 00:10:28.788 [100/737] Linking static target lib/librte_eal.a 00:10:28.788 [101/737] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:10:28.788 [102/737] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:10:28.788 [103/737] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:10:28.788 [104/737] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:10:28.788 [105/737] Linking static target lib/librte_rcu.a 00:10:29.046 [106/737] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:10:29.046 [107/737] Linking static target lib/librte_mempool.a 00:10:29.046 [108/737] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:10:29.046 [109/737] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:10:29.046 [110/737] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:10:29.046 [111/737] Linking static target lib/net/libnet_crc_avx512_lib.a 00:10:29.046 [112/737] Generating lib/rte_net_def with a custom command 00:10:29.046 [113/737] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:10:29.046 [114/737] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:10:29.305 [115/737] Generating lib/rte_net_mingw with a custom command 00:10:29.305 [116/737] Generating lib/rte_meter_def with a custom command 00:10:29.305 [117/737] Generating lib/rte_meter_mingw with a custom command 00:10:29.305 [118/737] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:10:29.305 [119/737] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:10:29.305 [120/737] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:10:29.305 [121/737] Linking static target lib/librte_meter.a 00:10:29.563 [122/737] Compiling C object 
lib/librte_net.a.p/net_net_crc_sse.c.o 00:10:29.563 [123/737] Linking static target lib/librte_net.a 00:10:29.563 [124/737] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:10:29.563 [125/737] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:10:29.563 [126/737] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:10:29.563 [127/737] Linking static target lib/librte_mbuf.a 00:10:29.563 [128/737] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:10:29.821 [129/737] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:10:29.821 [130/737] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:10:29.821 [131/737] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:10:29.821 [132/737] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:10:30.080 [133/737] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:10:30.080 [134/737] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:10:30.080 [135/737] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:10:30.339 [136/737] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:10:30.339 [137/737] Generating lib/rte_ethdev_def with a custom command 00:10:30.339 [138/737] Generating lib/rte_ethdev_mingw with a custom command 00:10:30.339 [139/737] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:10:30.339 [140/737] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:10:30.339 [141/737] Generating lib/rte_pci_mingw with a custom command 00:10:30.339 [142/737] Generating lib/rte_pci_def with a custom command 00:10:30.339 [143/737] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:10:30.339 [144/737] Linking static target lib/librte_pci.a 00:10:30.598 [145/737] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:10:30.598 [146/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:10:30.598 [147/737] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:10:30.598 [148/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:10:30.598 [149/737] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:10:30.598 [150/737] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:10:30.598 [151/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:10:30.856 [152/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:10:30.856 [153/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:10:30.856 [154/737] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:10:30.856 [155/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:10:30.856 [156/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:10:30.856 [157/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:10:30.856 [158/737] Generating lib/rte_cmdline_mingw with a custom command 00:10:30.856 [159/737] Generating lib/rte_cmdline_def with a custom command 00:10:30.856 [160/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:10:30.856 [161/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:10:30.856 [162/737] Generating lib/rte_metrics_def with a custom command 00:10:30.856 
[163/737] Generating lib/rte_metrics_mingw with a custom command 00:10:30.856 [164/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:10:30.856 [165/737] Generating lib/rte_hash_def with a custom command 00:10:30.856 [166/737] Generating lib/rte_hash_mingw with a custom command 00:10:30.856 [167/737] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:10:31.114 [168/737] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:10:31.114 [169/737] Linking static target lib/librte_cmdline.a 00:10:31.114 [170/737] Generating lib/rte_timer_def with a custom command 00:10:31.114 [171/737] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:10:31.114 [172/737] Generating lib/rte_timer_mingw with a custom command 00:10:31.114 [173/737] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:10:31.114 [174/737] Linking static target lib/librte_metrics.a 00:10:31.373 [175/737] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:10:31.373 [176/737] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:10:31.373 [177/737] Linking static target lib/librte_timer.a 00:10:31.632 [178/737] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:10:31.632 [179/737] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:10:31.632 [180/737] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:10:31.890 [181/737] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:10:31.890 [182/737] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:10:31.890 [183/737] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:10:31.890 [184/737] Linking static target lib/librte_ethdev.a 00:10:31.890 [185/737] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:10:31.890 [186/737] Generating lib/rte_acl_def with a custom command 00:10:32.147 [187/737] Generating lib/rte_acl_mingw with a custom command 00:10:32.148 [188/737] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:10:32.148 [189/737] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:10:32.148 [190/737] Generating lib/rte_bbdev_def with a custom command 00:10:32.148 [191/737] Generating lib/rte_bbdev_mingw with a custom command 00:10:32.148 [192/737] Generating lib/rte_bitratestats_def with a custom command 00:10:32.148 [193/737] Generating lib/rte_bitratestats_mingw with a custom command 00:10:32.148 [194/737] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:10:32.406 [195/737] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:10:32.406 [196/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:10:32.406 [197/737] Linking static target lib/librte_bitratestats.a 00:10:32.664 [198/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:10:32.664 [199/737] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:10:32.664 [200/737] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:10:32.664 [201/737] Linking static target lib/librte_bbdev.a 00:10:32.969 [202/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:10:32.969 [203/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:10:33.227 [204/737] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:10:33.227 [205/737] Linking static target lib/librte_hash.a 00:10:33.227 [206/737] Compiling C object 
lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:10:33.486 [207/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:10:33.486 [208/737] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:33.486 [209/737] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:10:33.486 [210/737] Generating lib/rte_bpf_def with a custom command 00:10:33.746 [211/737] Generating lib/rte_bpf_mingw with a custom command 00:10:33.746 [212/737] Generating lib/rte_cfgfile_def with a custom command 00:10:33.746 [213/737] Generating lib/rte_cfgfile_mingw with a custom command 00:10:33.746 [214/737] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:10:34.005 [215/737] Linking static target lib/librte_cfgfile.a 00:10:34.005 [216/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:10:34.005 [217/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:10:34.005 [218/737] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:10:34.005 [219/737] Generating lib/rte_compressdev_def with a custom command 00:10:34.005 [220/737] Generating lib/rte_compressdev_mingw with a custom command 00:10:34.263 [221/737] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:10:34.263 [222/737] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:10:34.263 [223/737] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:10:34.263 [224/737] Generating lib/rte_cryptodev_def with a custom command 00:10:34.263 [225/737] Generating lib/rte_cryptodev_mingw with a custom command 00:10:34.263 [226/737] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:10:34.522 [227/737] Linking static target lib/librte_bpf.a 00:10:34.522 [228/737] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:10:34.522 [229/737] Linking static target lib/librte_compressdev.a 00:10:34.522 [230/737] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:10:34.522 [231/737] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:10:34.781 [232/737] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:10:34.781 [233/737] Generating lib/rte_distributor_def with a custom command 00:10:34.781 [234/737] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:10:34.781 [235/737] Generating lib/rte_distributor_mingw with a custom command 00:10:34.781 [236/737] Generating lib/rte_efd_def with a custom command 00:10:34.781 [237/737] Generating lib/rte_efd_mingw with a custom command 00:10:35.038 [238/737] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:10:35.039 [239/737] Linking static target lib/librte_acl.a 00:10:35.039 [240/737] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:10:35.039 [241/737] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:10:35.039 [242/737] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:10:35.039 [243/737] Linking static target lib/librte_distributor.a 00:10:35.297 [244/737] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:10:35.297 [245/737] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:10:35.297 [246/737] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:10:35.555 
[247/737] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:10:35.555 [248/737] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:35.814 [249/737] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:10:35.814 [250/737] Generating lib/rte_eventdev_def with a custom command 00:10:35.814 [251/737] Generating lib/rte_eventdev_mingw with a custom command 00:10:35.814 [252/737] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:10:35.814 [253/737] Linking static target lib/librte_efd.a 00:10:36.082 [254/737] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:10:36.082 [255/737] Generating lib/rte_gpudev_def with a custom command 00:10:36.082 [256/737] Generating lib/rte_gpudev_mingw with a custom command 00:10:36.082 [257/737] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:10:36.355 [258/737] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:10:36.355 [259/737] Linking static target lib/librte_cryptodev.a 00:10:36.355 [260/737] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:10:36.355 [261/737] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:10:36.613 [262/737] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:10:36.613 [263/737] Linking static target lib/librte_gpudev.a 00:10:36.613 [264/737] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:10:36.613 [265/737] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:10:36.871 [266/737] Generating lib/rte_gro_def with a custom command 00:10:36.871 [267/737] Generating lib/rte_gro_mingw with a custom command 00:10:36.871 [268/737] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:10:36.871 [269/737] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:10:37.129 [270/737] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:10:37.129 [271/737] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:10:37.387 [272/737] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:10:37.387 [273/737] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:10:37.387 [274/737] Linking static target lib/librte_gro.a 00:10:37.387 [275/737] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:10:37.387 [276/737] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:10:37.387 [277/737] Generating lib/rte_gso_def with a custom command 00:10:37.387 [278/737] Generating lib/rte_gso_mingw with a custom command 00:10:37.388 [279/737] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:10:37.646 [280/737] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:37.646 [281/737] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:10:37.646 [282/737] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:10:37.646 [283/737] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:10:37.646 [284/737] Linking static target lib/librte_gso.a 00:10:37.646 [285/737] Linking static target lib/librte_eventdev.a 00:10:37.904 [286/737] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:10:37.904 [287/737] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:10:37.904 [288/737] Generating lib/rte_ip_frag_def with a custom command 00:10:37.904 
[289/737] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:10:37.904 [290/737] Generating lib/rte_ip_frag_mingw with a custom command 00:10:37.904 [291/737] Generating lib/rte_jobstats_def with a custom command 00:10:38.162 [292/737] Generating lib/rte_jobstats_mingw with a custom command 00:10:38.162 [293/737] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:10:38.162 [294/737] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:10:38.162 [295/737] Generating lib/rte_latencystats_def with a custom command 00:10:38.162 [296/737] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:10:38.162 [297/737] Linking static target lib/librte_jobstats.a 00:10:38.162 [298/737] Generating lib/rte_latencystats_mingw with a custom command 00:10:38.162 [299/737] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:10:38.162 [300/737] Generating lib/rte_lpm_def with a custom command 00:10:38.162 [301/737] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:38.162 [302/737] Generating lib/rte_lpm_mingw with a custom command 00:10:38.420 [303/737] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:10:38.420 [304/737] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:10:38.420 [305/737] Linking static target lib/librte_ip_frag.a 00:10:38.677 [306/737] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:10:38.677 [307/737] Linking static target lib/librte_latencystats.a 00:10:38.677 [308/737] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:10:38.677 [309/737] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:10:38.677 [310/737] Linking static target lib/member/libsketch_avx512_tmp.a 00:10:38.677 [311/737] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:10:38.678 [312/737] Generating lib/rte_member_def with a custom command 00:10:38.678 [313/737] Generating lib/rte_member_mingw with a custom command 00:10:38.678 [314/737] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:10:38.935 [315/737] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:10:38.935 [316/737] Generating lib/rte_pcapng_def with a custom command 00:10:38.935 [317/737] Generating lib/rte_pcapng_mingw with a custom command 00:10:38.936 [318/737] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:10:38.936 [319/737] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:10:38.936 [320/737] Linking target lib/librte_eal.so.23.0 00:10:38.936 [321/737] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:10:38.936 [322/737] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:10:38.936 [323/737] Linking static target lib/librte_lpm.a 00:10:39.194 [324/737] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:10:39.194 [325/737] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:10:39.194 [326/737] Linking target lib/librte_ring.so.23.0 00:10:39.194 [327/737] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:10:39.194 [328/737] Linking target lib/librte_meter.so.23.0 00:10:39.194 [329/737] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:10:39.194 [330/737] Compiling C object 
lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:10:39.452 [331/737] Linking target lib/librte_rcu.so.23.0 00:10:39.452 [332/737] Linking target lib/librte_mempool.so.23.0 00:10:39.452 [333/737] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:10:39.452 [334/737] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:39.452 [335/737] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:10:39.453 [336/737] Linking target lib/librte_pci.so.23.0 00:10:39.453 [337/737] Linking target lib/librte_timer.so.23.0 00:10:39.453 [338/737] Linking target lib/librte_acl.so.23.0 00:10:39.453 [339/737] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:10:39.453 [340/737] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:10:39.453 [341/737] Linking target lib/librte_cfgfile.so.23.0 00:10:39.453 [342/737] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:10:39.453 [343/737] Linking static target lib/librte_pcapng.a 00:10:39.453 [344/737] Linking target lib/librte_jobstats.so.23.0 00:10:39.453 [345/737] Linking target lib/librte_mbuf.so.23.0 00:10:39.453 [346/737] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:10:39.453 [347/737] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:10:39.720 [348/737] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:10:39.720 [349/737] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:10:39.720 [350/737] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:10:39.720 [351/737] Generating lib/rte_power_def with a custom command 00:10:39.720 [352/737] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:10:39.720 [353/737] Generating lib/rte_power_mingw with a custom command 00:10:39.720 [354/737] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:10:39.720 [355/737] Generating lib/rte_rawdev_def with a custom command 00:10:39.720 [356/737] Linking target lib/librte_net.so.23.0 00:10:39.720 [357/737] Linking target lib/librte_bbdev.so.23.0 00:10:39.720 [358/737] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:10:39.720 [359/737] Linking target lib/librte_compressdev.so.23.0 00:10:39.976 [360/737] Linking target lib/librte_distributor.so.23.0 00:10:39.976 [361/737] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:10:39.976 [362/737] Linking target lib/librte_cryptodev.so.23.0 00:10:39.976 [363/737] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:10:39.976 [364/737] Generating lib/rte_rawdev_mingw with a custom command 00:10:39.976 [365/737] Linking target lib/librte_gpudev.so.23.0 00:10:39.976 [366/737] Linking target lib/librte_cmdline.so.23.0 00:10:39.976 [367/737] Linking target lib/librte_hash.so.23.0 00:10:39.976 [368/737] Linking target lib/librte_ethdev.so.23.0 00:10:39.976 [369/737] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:10:39.976 [370/737] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:10:39.976 [371/737] Generating lib/rte_regexdev_mingw with a custom command 00:10:39.976 [372/737] Generating lib/rte_regexdev_def with a custom command 00:10:39.976 [373/737] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:10:39.976 
[374/737] Linking static target lib/librte_rawdev.a 00:10:39.976 [375/737] Generating lib/rte_dmadev_def with a custom command 00:10:39.976 [376/737] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:10:39.976 [377/737] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:10:40.236 [378/737] Generating lib/rte_dmadev_mingw with a custom command 00:10:40.236 [379/737] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:10:40.236 [380/737] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:10:40.236 [381/737] Linking target lib/librte_efd.so.23.0 00:10:40.236 [382/737] Linking target lib/librte_metrics.so.23.0 00:10:40.236 [383/737] Linking target lib/librte_gro.so.23.0 00:10:40.236 [384/737] Linking target lib/librte_bpf.so.23.0 00:10:40.236 [385/737] Linking target lib/librte_ip_frag.so.23.0 00:10:40.236 [386/737] Linking target lib/librte_gso.so.23.0 00:10:40.236 [387/737] Linking target lib/librte_lpm.so.23.0 00:10:40.236 [388/737] Linking target lib/librte_pcapng.so.23.0 00:10:40.236 [389/737] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:10:40.236 [390/737] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:10:40.236 [391/737] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:10:40.236 [392/737] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:10:40.236 [393/737] Linking static target lib/librte_member.a 00:10:40.236 [394/737] Linking target lib/librte_bitratestats.so.23.0 00:10:40.236 [395/737] Linking static target lib/librte_power.a 00:10:40.495 [396/737] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:10:40.495 [397/737] Linking target lib/librte_latencystats.so.23.0 00:10:40.495 [398/737] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:10:40.495 [399/737] Generating lib/rte_rib_def with a custom command 00:10:40.495 [400/737] Generating lib/rte_rib_mingw with a custom command 00:10:40.495 [401/737] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:10:40.495 [402/737] Linking static target lib/librte_dmadev.a 00:10:40.495 [403/737] Generating lib/rte_reorder_def with a custom command 00:10:40.495 [404/737] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:10:40.495 [405/737] Linking static target lib/librte_regexdev.a 00:10:40.495 [406/737] Generating lib/rte_reorder_mingw with a custom command 00:10:40.495 [407/737] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:40.495 [408/737] Linking target lib/librte_eventdev.so.23.0 00:10:40.753 [409/737] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:10:40.753 [410/737] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:40.753 [411/737] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:10:40.753 [412/737] Linking target lib/librte_member.so.23.0 00:10:40.753 [413/737] Linking target lib/librte_rawdev.so.23.0 00:10:40.753 [414/737] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:10:41.012 [415/737] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:10:41.012 [416/737] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:10:41.012 [417/737] Generating lib/rte_sched_def with a custom command 00:10:41.012 
[418/737] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:10:41.012 [419/737] Generating lib/rte_sched_mingw with a custom command 00:10:41.012 [420/737] Generating lib/rte_security_def with a custom command 00:10:41.012 [421/737] Generating lib/rte_security_mingw with a custom command 00:10:41.012 [422/737] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:10:41.012 [423/737] Linking static target lib/librte_reorder.a 00:10:41.012 [424/737] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:10:41.012 [425/737] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:10:41.012 [426/737] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:10:41.012 [427/737] Linking static target lib/librte_stack.a 00:10:41.012 [428/737] Generating lib/rte_stack_def with a custom command 00:10:41.012 [429/737] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:41.012 [430/737] Generating lib/rte_stack_mingw with a custom command 00:10:41.012 [431/737] Linking target lib/librte_dmadev.so.23.0 00:10:41.270 [432/737] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:10:41.270 [433/737] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:10:41.270 [434/737] Linking static target lib/librte_rib.a 00:10:41.270 [435/737] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:10:41.270 [436/737] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:10:41.270 [437/737] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:10:41.270 [438/737] Linking target lib/librte_stack.so.23.0 00:10:41.270 [439/737] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:41.270 [440/737] Linking target lib/librte_reorder.so.23.0 00:10:41.270 [441/737] Linking target lib/librte_regexdev.so.23.0 00:10:41.528 [442/737] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:10:41.528 [443/737] Linking static target lib/librte_security.a 00:10:41.528 [444/737] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:10:41.528 [445/737] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:10:41.786 [446/737] Linking target lib/librte_power.so.23.0 00:10:41.786 [447/737] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:10:41.786 [448/737] Generating lib/rte_vhost_def with a custom command 00:10:41.786 [449/737] Generating lib/rte_vhost_mingw with a custom command 00:10:41.787 [450/737] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:10:42.044 [451/737] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:10:42.044 [452/737] Linking target lib/librte_rib.so.23.0 00:10:42.044 [453/737] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:10:42.045 [454/737] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:10:42.045 [455/737] Linking target lib/librte_security.so.23.0 00:10:42.045 [456/737] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:10:42.045 [457/737] Linking static target lib/librte_sched.a 00:10:42.303 [458/737] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:10:42.303 [459/737] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:10:42.303 [460/737] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:10:42.562 [461/737] 
Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:10:42.562 [462/737] Generating lib/rte_ipsec_def with a custom command 00:10:42.562 [463/737] Generating lib/rte_ipsec_mingw with a custom command 00:10:42.562 [464/737] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:10:42.562 [465/737] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:10:42.851 [466/737] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:10:42.851 [467/737] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:10:42.851 [468/737] Linking target lib/librte_sched.so.23.0 00:10:42.851 [469/737] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:10:43.130 [470/737] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:10:43.130 [471/737] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:10:43.130 [472/737] Generating lib/rte_fib_def with a custom command 00:10:43.130 [473/737] Generating lib/rte_fib_mingw with a custom command 00:10:43.130 [474/737] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:10:43.130 [475/737] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:10:43.130 [476/737] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:10:43.388 [477/737] Linking static target lib/librte_ipsec.a 00:10:43.388 [478/737] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:10:43.388 [479/737] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:10:43.388 [480/737] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:10:43.388 [481/737] Linking static target lib/librte_fib.a 00:10:43.646 [482/737] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:10:43.646 [483/737] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:10:43.904 [484/737] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:10:43.904 [485/737] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:10:43.904 [486/737] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:10:43.904 [487/737] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:10:43.904 [488/737] Linking target lib/librte_ipsec.so.23.0 00:10:43.904 [489/737] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:10:44.161 [490/737] Linking target lib/librte_fib.so.23.0 00:10:44.419 [491/737] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:10:44.419 [492/737] Generating lib/rte_port_def with a custom command 00:10:44.419 [493/737] Generating lib/rte_port_mingw with a custom command 00:10:44.419 [494/737] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:10:44.419 [495/737] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:10:44.419 [496/737] Generating lib/rte_pdump_def with a custom command 00:10:44.419 [497/737] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:10:44.419 [498/737] Generating lib/rte_pdump_mingw with a custom command 00:10:44.419 [499/737] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:10:44.419 [500/737] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:10:44.677 [501/737] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:10:44.677 [502/737] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:10:44.677 [503/737] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:10:44.936 [504/737] 
Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:10:44.936 [505/737] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:10:44.936 [506/737] Linking static target lib/librte_port.a 00:10:44.936 [507/737] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:10:44.936 [508/737] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:10:44.936 [509/737] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:10:45.194 [510/737] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:10:45.194 [511/737] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:10:45.194 [512/737] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:10:45.194 [513/737] Linking static target lib/librte_pdump.a 00:10:45.760 [514/737] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:10:45.760 [515/737] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:10:45.760 [516/737] Linking target lib/librte_pdump.so.23.0 00:10:45.760 [517/737] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:10:45.760 [518/737] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:10:45.760 [519/737] Generating lib/rte_table_def with a custom command 00:10:45.760 [520/737] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:10:45.760 [521/737] Generating lib/rte_table_mingw with a custom command 00:10:45.760 [522/737] Linking target lib/librte_port.so.23.0 00:10:45.760 [523/737] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:10:46.017 [524/737] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:10:46.017 [525/737] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:10:46.017 [526/737] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:10:46.017 [527/737] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:10:46.017 [528/737] Generating lib/rte_pipeline_def with a custom command 00:10:46.017 [529/737] Generating lib/rte_pipeline_mingw with a custom command 00:10:46.275 [530/737] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:10:46.275 [531/737] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:10:46.275 [532/737] Linking static target lib/librte_table.a 00:10:46.533 [533/737] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:10:46.533 [534/737] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:10:46.791 [535/737] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:10:46.791 [536/737] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:10:46.791 [537/737] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:10:47.049 [538/737] Generating lib/rte_graph_def with a custom command 00:10:47.049 [539/737] Generating lib/rte_graph_mingw with a custom command 00:10:47.307 [540/737] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:10:47.307 [541/737] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:10:47.307 [542/737] Linking static target lib/librte_graph.a 00:10:47.564 [543/737] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:10:47.564 [544/737] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:10:47.564 [545/737] Linking target lib/librte_table.so.23.0 00:10:47.565 
[546/737] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:10:47.823 [547/737] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:10:47.823 [548/737] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:10:48.081 [549/737] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:10:48.081 [550/737] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:10:48.081 [551/737] Compiling C object lib/librte_node.a.p/node_null.c.o 00:10:48.339 [552/737] Compiling C object lib/librte_node.a.p/node_log.c.o 00:10:48.340 [553/737] Generating lib/rte_node_def with a custom command 00:10:48.340 [554/737] Generating lib/rte_node_mingw with a custom command 00:10:48.599 [555/737] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:10:48.599 [556/737] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:10:48.599 [557/737] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:10:48.599 [558/737] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:10:48.599 [559/737] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:10:48.857 [560/737] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:10:48.857 [561/737] Linking target lib/librte_graph.so.23.0 00:10:48.857 [562/737] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:10:48.857 [563/737] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:10:48.857 [564/737] Generating drivers/rte_bus_pci_def with a custom command 00:10:48.857 [565/737] Generating drivers/rte_bus_pci_mingw with a custom command 00:10:48.857 [566/737] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:10:48.858 [567/737] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:10:48.858 [568/737] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:10:48.858 [569/737] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:10:48.858 [570/737] Generating drivers/rte_bus_vdev_def with a custom command 00:10:48.858 [571/737] Generating drivers/rte_bus_vdev_mingw with a custom command 00:10:48.858 [572/737] Linking static target lib/librte_node.a 00:10:49.116 [573/737] Generating drivers/rte_mempool_ring_def with a custom command 00:10:49.116 [574/737] Generating drivers/rte_mempool_ring_mingw with a custom command 00:10:49.116 [575/737] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:10:49.116 [576/737] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:10:49.116 [577/737] Linking static target drivers/libtmp_rte_bus_pci.a 00:10:49.410 [578/737] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:10:49.410 [579/737] Linking static target drivers/libtmp_rte_bus_vdev.a 00:10:49.410 [580/737] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:10:49.410 [581/737] Linking target lib/librte_node.so.23.0 00:10:49.410 [582/737] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:10:49.410 [583/737] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:10:49.410 [584/737] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:10:49.667 [585/737] Linking static target drivers/librte_bus_pci.a 00:10:49.667 [586/737] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:10:49.668 
[587/737] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:10:49.668 [588/737] Linking static target drivers/librte_bus_vdev.a 00:10:49.668 [589/737] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:10:49.668 [590/737] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:10:49.668 [591/737] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:10:49.926 [592/737] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:10:49.926 [593/737] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:10:49.926 [594/737] Linking static target drivers/libtmp_rte_mempool_ring.a 00:10:49.926 [595/737] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:10:49.926 [596/737] Linking target drivers/librte_bus_vdev.so.23.0 00:10:50.185 [597/737] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:10:50.185 [598/737] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:10:50.185 [599/737] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:10:50.185 [600/737] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:10:50.185 [601/737] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:10:50.185 [602/737] Linking static target drivers/librte_mempool_ring.a 00:10:50.185 [603/737] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:10:50.185 [604/737] Linking target drivers/librte_bus_pci.so.23.0 00:10:50.444 [605/737] Linking target drivers/librte_mempool_ring.so.23.0 00:10:50.444 [606/737] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:10:50.444 [607/737] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:10:51.010 [608/737] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:10:51.268 [609/737] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:10:51.268 [610/737] Linking static target drivers/net/i40e/base/libi40e_base.a 00:10:51.527 [611/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:10:52.094 [612/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:10:52.094 [613/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:10:52.353 [614/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:10:52.353 [615/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:10:52.612 [616/737] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:10:52.612 [617/737] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:10:52.612 [618/737] Generating drivers/rte_net_i40e_def with a custom command 00:10:52.612 [619/737] Generating drivers/rte_net_i40e_mingw with a custom command 00:10:52.612 [620/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:10:52.870 [621/737] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:10:53.438 [622/737] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:10:53.696 [623/737] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:10:53.696 [624/737] Compiling C object 
drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:10:53.696 [625/737] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:10:53.696 [626/737] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:10:53.954 [627/737] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:10:53.954 [628/737] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:10:53.954 [629/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:10:54.522 [630/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:10:54.522 [631/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:10:54.522 [632/737] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:10:54.780 [633/737] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:10:54.780 [634/737] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:10:55.346 [635/737] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:10:55.346 [636/737] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:10:55.346 [637/737] Linking static target drivers/libtmp_rte_net_i40e.a 00:10:55.346 [638/737] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:10:55.346 [639/737] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:10:55.604 [640/737] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:10:55.604 [641/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:10:55.604 [642/737] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:10:55.604 [643/737] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:10:55.604 [644/737] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:10:55.604 [645/737] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:10:55.863 [646/737] Linking static target drivers/librte_net_i40e.a 00:10:55.863 [647/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:10:55.863 [648/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:10:56.121 [649/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:10:56.379 [650/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:10:56.379 [651/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:10:56.379 [652/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:10:56.379 [653/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:10:56.379 [654/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:10:56.638 [655/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:10:56.638 [656/737] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:10:56.896 [657/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:10:56.896 [658/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:10:56.896 [659/737] 
Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:10:56.896 [660/737] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:10:57.155 [661/737] Linking target drivers/librte_net_i40e.so.23.0 00:10:57.155 [662/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:10:57.155 [663/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:10:57.720 [664/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:10:57.720 [665/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:10:57.977 [666/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:10:58.235 [667/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:10:58.235 [668/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:10:58.494 [669/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:10:58.494 [670/737] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:10:58.494 [671/737] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:10:58.752 [672/737] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:10:58.752 [673/737] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:10:58.752 [674/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:10:59.010 [675/737] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:10:59.010 [676/737] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:10:59.010 [677/737] Linking static target lib/librte_vhost.a 00:10:59.010 [678/737] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:10:59.268 [679/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:10:59.268 [680/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:10:59.268 [681/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:10:59.525 [682/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:10:59.525 [683/737] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:10:59.525 [684/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:10:59.525 [685/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:10:59.525 [686/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:10:59.525 [687/737] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:11:00.092 [688/737] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:11:00.092 [689/737] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:11:00.349 [690/737] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:11:00.349 [691/737] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:11:00.349 [692/737] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:11:01.284 [693/737] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:11:01.284 [694/737] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:11:01.284 [695/737] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:11:01.284 [696/737] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:11:01.284 [697/737] Generating 
lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:11:01.284 [698/737] Linking target lib/librte_vhost.so.23.0 00:11:01.850 [699/737] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:11:01.850 [700/737] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:11:01.850 [701/737] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:11:01.850 [702/737] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:11:02.108 [703/737] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:11:02.108 [704/737] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:11:02.365 [705/737] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:11:02.365 [706/737] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:11:02.623 [707/737] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:11:02.882 [708/737] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:11:02.882 [709/737] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:11:03.141 [710/737] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:11:03.141 [711/737] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:11:03.399 [712/737] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:11:03.399 [713/737] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:11:03.399 [714/737] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:11:03.657 [715/737] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:11:04.224 [716/737] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:11:04.224 [717/737] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:11:07.506 [718/737] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:11:07.765 [719/737] Linking static target lib/librte_pipeline.a 00:11:08.332 [720/737] Linking target app/dpdk-test-acl 00:11:08.332 [721/737] Linking target app/dpdk-proc-info 00:11:08.332 [722/737] Linking target app/dpdk-test-compress-perf 00:11:08.332 [723/737] Linking target app/dpdk-test-cmdline 00:11:08.332 [724/737] Linking target app/dpdk-test-fib 00:11:08.332 [725/737] Linking target app/dpdk-pdump 00:11:08.332 [726/737] Linking target app/dpdk-test-eventdev 00:11:08.332 [727/737] Linking target app/dpdk-test-crypto-perf 00:11:08.332 [728/737] Linking target app/dpdk-test-bbdev 00:11:08.590 [729/737] Linking target app/dpdk-test-gpudev 00:11:08.590 [730/737] Linking target app/dpdk-test-regex 00:11:08.849 [731/737] Linking target app/dpdk-test-pipeline 00:11:08.849 [732/737] Linking target app/dpdk-test-flow-perf 00:11:08.849 [733/737] Linking target app/dpdk-testpmd 00:11:08.849 [734/737] Linking target app/dpdk-test-sad 00:11:08.849 [735/737] Linking target app/dpdk-test-security-perf 00:11:13.098 [736/737] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:11:13.098 [737/737] Linking target lib/librte_pipeline.so.23.0 00:11:13.098 12:13:36 build_native_dpdk -- common/autobuild_common.sh@187 -- $ ninja -C /home/vagrant/spdk_repo/dpdk/build-tmp -j10 install 00:11:13.098 ninja: Entering directory `/home/vagrant/spdk_repo/dpdk/build-tmp' 00:11:13.098 [0/1] Installing files. 
00:11:13.098 Installing subdir /home/vagrant/spdk_repo/dpdk/examples to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bbdev_app/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bbdev_app 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bbdev_app/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bbdev_app 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bond/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bond 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bond/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bond 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bond/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bond 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/README to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/dummy.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/t1.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/t2.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/t3.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/commands.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/commands.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/parse_obj_list.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/parse_obj_list.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/common/pkt_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/common/altivec/port_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common/altivec 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/common/neon/port_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common/neon 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/common/sse/port_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common/sse 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/distributor/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/distributor 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/distributor/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/distributor 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/dma/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/dma 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/dma/dmafwd.c to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/dma 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/ethapp.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/ethapp.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/lib/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/lib 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/lib/rte_ethtool.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/lib 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/lib/rte_ethtool.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/lib 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/pipeline_common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_dev_self_test.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_dev_self_test.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_aes.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_ccm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.098 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_cmac.c to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_gcm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_hmac.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_rsa.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_sha.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_tdes.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_xts.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_classify/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_classify 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_classify/flow_classify.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_classify 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_classify/ipv4_rules_file.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_classify 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_filtering/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_filtering 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_filtering/flow_blocks.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_filtering 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_filtering/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_filtering 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/helloworld/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/helloworld 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/helloworld/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/helloworld 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_fragmentation/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_fragmentation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_fragmentation/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_fragmentation 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/action.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/action.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing 
/home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cli.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cli.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/conn.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/conn.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cryptodev.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cryptodev.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/kni.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/kni.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/link.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/link.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/mempool.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/mempool.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/parser.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/parser.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/pipeline.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/pipeline.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/swq.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/swq.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tap.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tap.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/thread.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing 
/home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/thread.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tmgr.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tmgr.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/firewall.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/flow.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/kni.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/route.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/rss.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/tap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_reassembly/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_reassembly 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_reassembly/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_reassembly 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ep0.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ep1.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/esp.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/esp.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/event_helper.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/event_helper.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/flow.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing 
/home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/flow.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipip.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_process.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_worker.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_worker.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/parser.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/parser.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/rt.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sa.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sad.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sad.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sp4.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sp6.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.099 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/linux_test.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/load_env.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/pkttest.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/pkttest.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/run_test.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipv4_multicast/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipv4_multicast 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/ipv4_multicast/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipv4_multicast 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/cat.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/cat.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-crypto/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-crypto/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_common.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_poll.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_poll.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-jobstats/Makefile to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-jobstats/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/shm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/shm.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-graph/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-graph 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-graph/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-graph 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/perf_core.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/perf_core.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/em_default_v4.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/em_default_v6.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/em_route_parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_acl.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_acl.h to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_altivec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event_generic.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_fib.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.100 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_route.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_sse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/lpm_default_v4.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.101 Installing 
/home/vagrant/spdk_repo/dpdk/examples/l3fwd/lpm_default_v6.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/lpm_route_parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/link_status_interrupt/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/link_status_interrupt 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/link_status_interrupt/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/link_status_interrupt 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/shared/common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/commands.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/commands.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:11:13.101 
Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/mp_commands.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/mp_commands.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/symmetric_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/symmetric_mp/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/ntb/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ntb 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/ntb/ntb_fwd.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ntb 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/packet_ordering/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/packet_ordering 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/packet_ordering/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/packet_ordering 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/cli.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/cli.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/conn.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/conn.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/obj.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/obj.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/thread.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/thread.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/ethdev.io to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib.cli to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib_routing_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/hash_func.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/hash_func.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/learner.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/learner.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/meter.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/meter.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/mirroring.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/mirroring.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/packet.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/pcap.io to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing 
/home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/recirculation.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/recirculation.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/registers.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/registers.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/selector.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/selector.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/selector.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/varbit.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/varbit.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan_table.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/ptpclient/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ptpclient 00:11:13.101 Installing /home/vagrant/spdk_repo/dpdk/examples/ptpclient/ptpclient.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ptpclient 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/rte_policer.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/rte_policer.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/Makefile to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/app_thread.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/args.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/cfg_file.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/cfg_file.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/cmdline.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/init.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile_ov.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile_pie.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile_red.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/stats.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/rxtx_callbacks/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/rxtx_callbacks/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/node/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/node 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/node/node.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/node 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/server/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/server 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/server/args.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/server 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/server/args.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/server 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/server/init.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/server 00:11:13.102 
Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/server/init.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/server 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/server/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/server 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/shared/common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/service_cores/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/service_cores 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/service_cores/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/service_cores 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/skeleton/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/skeleton 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/skeleton/basicfwd.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/skeleton 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/timer/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/timer 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/timer/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/timer 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vdpa/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vdpa 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vdpa/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vdpa 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vdpa/vdpa_blk_compact.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vdpa 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/virtio_net.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/blk.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/blk_spec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/vhost_blk.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/vhost_blk.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/vhost_blk_compat.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_crypto/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_crypto 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_crypto/main.c to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_crypto 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_manager.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_manager.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_monitor.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_monitor.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/oob_monitor.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/parse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/power_manager.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/power_manager.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/vm_power_cli.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/vm_power_cli.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/parse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:11:13.102 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:11:13.103 Installing 
/home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:11:13.103 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq 00:11:13.103 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq 00:11:13.103 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq_dcb/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq_dcb 00:11:13.103 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq_dcb/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq_dcb 00:11:13.103 Installing lib/librte_kvargs.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_kvargs.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_telemetry.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_telemetry.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_eal.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_eal.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_ring.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_ring.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_rcu.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_rcu.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_mempool.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_mempool.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_mbuf.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_mbuf.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.103 Installing lib/librte_net.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_net.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_meter.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_meter.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_ethdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_ethdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_pci.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_pci.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_cmdline.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_cmdline.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_metrics.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_metrics.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_hash.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_hash.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_timer.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_timer.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_acl.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_acl.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 
00:11:13.362 Installing lib/librte_bbdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_bbdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_bitratestats.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_bitratestats.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_bpf.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.362 Installing lib/librte_bpf.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_cfgfile.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_cfgfile.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_compressdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_compressdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_cryptodev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_cryptodev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_distributor.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_distributor.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_efd.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_efd.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_eventdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_eventdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_gpudev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_gpudev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_gro.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_gro.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_gso.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_gso.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_ip_frag.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_ip_frag.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_jobstats.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_jobstats.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_latencystats.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_latencystats.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_lpm.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_lpm.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_member.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_member.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_pcapng.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_pcapng.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_power.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_power.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_rawdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_rawdev.so.23.0 to 
/home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_regexdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_regexdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_dmadev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_dmadev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_rib.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_rib.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_reorder.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_reorder.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_sched.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_sched.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_security.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_security.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_stack.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_stack.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_vhost.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_vhost.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_ipsec.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_ipsec.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_fib.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_fib.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_port.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_port.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_pdump.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_pdump.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_table.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_table.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_pipeline.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_pipeline.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_graph.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_graph.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_node.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing lib/librte_node.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing drivers/librte_bus_pci.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing drivers/librte_bus_pci.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0 00:11:13.363 Installing drivers/librte_bus_vdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing drivers/librte_bus_vdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0 00:11:13.363 Installing drivers/librte_mempool_ring.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing drivers/librte_mempool_ring.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0 00:11:13.363 Installing drivers/librte_net_i40e.a to 
/home/vagrant/spdk_repo/dpdk/build/lib 00:11:13.363 Installing drivers/librte_net_i40e.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0 00:11:13.363 Installing app/dpdk-pdump to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-proc-info to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-test-acl to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-test-bbdev to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-test-cmdline to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-test-compress-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-test-crypto-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-test-eventdev to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.363 Installing app/dpdk-test-fib to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing app/dpdk-test-flow-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing app/dpdk-test-gpudev to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing app/dpdk-test-pipeline to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing app/dpdk-testpmd to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing app/dpdk-test-regex to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing app/dpdk-test-sad to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing app/dpdk-test-security-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/config/rte_config.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/kvargs/rte_kvargs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/telemetry/rte_telemetry.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_atomic.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_byteorder.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_cpuflags.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_cycles.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_io.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_memcpy.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_pause.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_prefetch.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_rwlock.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_spinlock.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing 
/home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_vect.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_atomic.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_byteorder.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_cpuflags.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_cycles.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_io.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_memcpy.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_pause.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_prefetch.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_rtm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_rwlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_spinlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_vect.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_atomic_32.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_atomic_64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_alarm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_bitmap.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_bitops.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_branch_prediction.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_bus.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_class.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_common.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_compat.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_debug.h to /home/vagrant/spdk_repo/dpdk/build/include 
00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_dev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_devargs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.624 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_eal.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_eal_memconfig.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_eal_trace.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_errno.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_epoll.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_fbarray.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_hexdump.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_hypervisor.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_interrupts.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_keepalive.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_launch.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_lcore.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_log.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_malloc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_mcslock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_memory.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_memzone.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_pci_dev_features.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_per_lcore.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_pflock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_random.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_reciprocal.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_seqcount.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_seqlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing 
/home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_service.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_service_component.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_string_fns.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_tailq.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_thread.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_ticketlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_time.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_trace.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_trace_point.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_trace_point_register.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_uuid.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_version.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_vfio.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/linux/include/rte_os.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_elem.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_c11_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_generic_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_hts.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_peek.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_peek_zc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_rts.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/rcu/rte_rcu_qsbr.h to 
/home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mempool/rte_mempool.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mempool/rte_mempool_trace.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mempool/rte_mempool_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_ptype.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_dyn.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ip.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_tcp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_udp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_esp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_sctp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_icmp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_arp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ether.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_macsec.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_vxlan.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_gre.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_gtp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_net.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_net_crc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_mpls.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_higig.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ecpri.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_geneve.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_l2tpv2.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ppp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/meter/rte_meter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 
Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_cman.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_ethdev_trace.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_dev_info.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_flow.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_flow_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_mtr.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_mtr_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_tm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_tm_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_ethdev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_eth_ctrl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/pci/rte_pci.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_num.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_string.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_rdline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_vt100.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_socket.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.625 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_cirbuf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_portlist.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/metrics/rte_metrics.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/metrics/rte_metrics_telemetry.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_fbk_hash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 
Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_hash_crc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_hash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_jhash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_thash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_thash_gfni.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_arm64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_generic.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_sw.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_x86.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_thash_x86_gfni.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/timer/rte_timer.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/acl/rte_acl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/acl/rte_acl_osdep.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/bbdev/rte_bbdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/bbdev/rte_bbdev_pmd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/bbdev/rte_bbdev_op.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/bitratestats/rte_bitrate.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/bpf/bpf_def.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/bpf/rte_bpf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/bpf/rte_bpf_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cfgfile/rte_cfgfile.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/compressdev/rte_compressdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/compressdev/rte_comp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_cryptodev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_crypto.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_crypto_sym.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing 
/home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_crypto_asym.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_cryptodev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/distributor/rte_distributor.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/efd/rte_efd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_timer_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_eventdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_eventdev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/gpudev/rte_gpudev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/gro/rte_gro.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/gso/rte_gso.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/ip_frag/rte_ip_frag.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/jobstats/rte_jobstats.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/latencystats/rte_latencystats.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_altivec.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_neon.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_scalar.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_sse.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_sve.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/member/rte_member.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/pcapng/rte_pcapng.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing 
/home/vagrant/spdk_repo/dpdk/lib/power/rte_power_empty_poll.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power_intel_uncore.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power_pmd_mgmt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power_guest_channel.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/rawdev/rte_rawdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/rawdev/rte_rawdev_pmd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/regexdev/rte_regexdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/regexdev/rte_regexdev_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/regexdev/rte_regexdev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/dmadev/rte_dmadev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/dmadev/rte_dmadev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/rib/rte_rib.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/rib/rte_rib6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/reorder/rte_reorder.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_approx.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_red.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_sched.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_sched_common.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_pie.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/security/rte_security.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/security/rte_security_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_std.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf_generic.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf_c11.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf_stubs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vdpa.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing 
/home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vhost.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vhost_async.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vhost_crypto.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec_sa.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec_sad.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec_group.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/fib/rte_fib.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/fib/rte_fib6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_fd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_frag.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_ras.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_sched.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_source_sink.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_sym_crypto.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.626 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_eventdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_fd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_source_sink.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/pdump/rte_pdump.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_lru.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_hash_func.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_em.h 
to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_learner.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_selector.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_wm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_acl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_array.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash_cuckoo.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash_func.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_lpm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_lpm_ipv6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_stub.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_lru_arm64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_lru_x86.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash_func_arm64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_pipeline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_port_in_action.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_table_action.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_swx_pipeline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_swx_extern.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_swx_ctl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/graph/rte_graph.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/graph/rte_graph_worker.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/node/rte_node_ip4_api.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/lib/node/rte_node_eth_api.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/drivers/bus/pci/rte_bus_pci.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing 
/home/vagrant/spdk_repo/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-devbind.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-pmdinfo.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-telemetry.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-hugepages.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/build-tmp/rte_build_config.h to /home/vagrant/spdk_repo/dpdk/build/include 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig 00:11:13.627 Installing /home/vagrant/spdk_repo/dpdk/build-tmp/meson-private/libdpdk.pc to /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig 00:11:13.627 Installing symlink pointing to librte_kvargs.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.so.23 00:11:13.627 Installing symlink pointing to librte_kvargs.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.so 00:11:13.627 Installing symlink pointing to librte_telemetry.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.so.23 00:11:13.627 Installing symlink pointing to librte_telemetry.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.so 00:11:13.627 Installing symlink pointing to librte_eal.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.so.23 00:11:13.627 Installing symlink pointing to librte_eal.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.so 00:11:13.627 Installing symlink pointing to librte_ring.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.so.23 00:11:13.627 Installing symlink pointing to librte_ring.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.so 00:11:13.627 Installing symlink pointing to librte_rcu.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.so.23 00:11:13.627 Installing symlink pointing to librte_rcu.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.so 00:11:13.627 Installing symlink pointing to librte_mempool.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.so.23 00:11:13.627 Installing symlink pointing to librte_mempool.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.so 00:11:13.627 Installing symlink pointing to librte_mbuf.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.so.23 00:11:13.627 Installing symlink pointing to librte_mbuf.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.so 00:11:13.627 Installing symlink pointing to librte_net.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.so.23 00:11:13.627 Installing symlink pointing to librte_net.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.so 00:11:13.627 Installing symlink pointing to librte_meter.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.so.23 00:11:13.627 Installing symlink pointing to librte_meter.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.so 00:11:13.627 Installing symlink pointing to librte_ethdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.so.23 00:11:13.627 Installing symlink pointing to librte_ethdev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.so 00:11:13.627 Installing symlink pointing to librte_pci.so.23.0 to 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.so.23 00:11:13.627 Installing symlink pointing to librte_pci.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.so 00:11:13.627 Installing symlink pointing to librte_cmdline.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.so.23 00:11:13.627 Installing symlink pointing to librte_cmdline.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.so 00:11:13.627 Installing symlink pointing to librte_metrics.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.so.23 00:11:13.627 Installing symlink pointing to librte_metrics.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.so 00:11:13.627 Installing symlink pointing to librte_hash.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.so.23 00:11:13.627 Installing symlink pointing to librte_hash.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.so 00:11:13.627 Installing symlink pointing to librte_timer.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.so.23 00:11:13.627 Installing symlink pointing to librte_timer.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.so 00:11:13.627 Installing symlink pointing to librte_acl.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.so.23 00:11:13.627 Installing symlink pointing to librte_acl.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.so 00:11:13.627 Installing symlink pointing to librte_bbdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.so.23 00:11:13.627 Installing symlink pointing to librte_bbdev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.so 00:11:13.627 Installing symlink pointing to librte_bitratestats.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.so.23 00:11:13.627 Installing symlink pointing to librte_bitratestats.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.so 00:11:13.627 Installing symlink pointing to librte_bpf.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.so.23 00:11:13.627 Installing symlink pointing to librte_bpf.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.so 00:11:13.627 Installing symlink pointing to librte_cfgfile.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.so.23 00:11:13.627 Installing symlink pointing to librte_cfgfile.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.so 00:11:13.627 Installing symlink pointing to librte_compressdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.so.23 00:11:13.627 Installing symlink pointing to librte_compressdev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.so 00:11:13.627 Installing symlink pointing to librte_cryptodev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.so.23 00:11:13.627 Installing symlink pointing to librte_cryptodev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.so 00:11:13.627 Installing symlink pointing to librte_distributor.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.so.23 00:11:13.627 Installing symlink pointing to librte_distributor.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.so 00:11:13.627 Installing symlink pointing to librte_efd.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.so.23 00:11:13.627 Installing symlink pointing to librte_efd.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.so 00:11:13.627 Installing symlink pointing to librte_eventdev.so.23.0 to 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.so.23 00:11:13.627 Installing symlink pointing to librte_eventdev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.so 00:11:13.627 Installing symlink pointing to librte_gpudev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.so.23 00:11:13.627 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:11:13.627 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:11:13.627 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:11:13.627 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:11:13.627 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:11:13.627 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:11:13.627 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:11:13.627 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:11:13.627 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:11:13.627 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:11:13.627 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:11:13.627 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:11:13.628 Installing symlink pointing to librte_gpudev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.so 00:11:13.628 Installing symlink pointing to librte_gro.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.so.23 00:11:13.628 Installing symlink pointing to librte_gro.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.so 00:11:13.628 Installing symlink pointing to librte_gso.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.so.23 00:11:13.628 Installing symlink pointing to librte_gso.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.so 00:11:13.628 Installing symlink pointing to librte_ip_frag.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.so.23 00:11:13.628 Installing symlink pointing to librte_ip_frag.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.so 00:11:13.628 Installing symlink pointing to librte_jobstats.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.so.23 00:11:13.628 Installing symlink pointing to librte_jobstats.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.so 00:11:13.628 Installing symlink pointing to librte_latencystats.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.so.23 00:11:13.628 Installing symlink pointing to librte_latencystats.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.so 00:11:13.628 Installing symlink pointing to librte_lpm.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.so.23 00:11:13.628 Installing symlink pointing to librte_lpm.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.so 00:11:13.628 Installing symlink pointing to librte_member.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.so.23 00:11:13.628 Installing symlink pointing to librte_member.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.so 00:11:13.628 Installing symlink pointing to librte_pcapng.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.so.23 00:11:13.628 Installing symlink pointing to librte_pcapng.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.so 00:11:13.628 Installing symlink pointing to librte_power.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.so.23 
00:11:13.628 Installing symlink pointing to librte_power.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.so 00:11:13.628 Installing symlink pointing to librte_rawdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.so.23 00:11:13.628 Installing symlink pointing to librte_rawdev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.so 00:11:13.628 Installing symlink pointing to librte_regexdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.so.23 00:11:13.628 Installing symlink pointing to librte_regexdev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.so 00:11:13.628 Installing symlink pointing to librte_dmadev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.so.23 00:11:13.628 Installing symlink pointing to librte_dmadev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.so 00:11:13.628 Installing symlink pointing to librte_rib.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.so.23 00:11:13.628 Installing symlink pointing to librte_rib.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.so 00:11:13.628 Installing symlink pointing to librte_reorder.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.so.23 00:11:13.628 Installing symlink pointing to librte_reorder.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.so 00:11:13.628 Installing symlink pointing to librte_sched.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.so.23 00:11:13.628 Installing symlink pointing to librte_sched.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.so 00:11:13.628 Installing symlink pointing to librte_security.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.so.23 00:11:13.628 Installing symlink pointing to librte_security.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.so 00:11:13.628 Installing symlink pointing to librte_stack.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.so.23 00:11:13.628 Installing symlink pointing to librte_stack.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.so 00:11:13.628 Installing symlink pointing to librte_vhost.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.so.23 00:11:13.628 Installing symlink pointing to librte_vhost.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.so 00:11:13.628 Installing symlink pointing to librte_ipsec.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.so.23 00:11:13.628 Installing symlink pointing to librte_ipsec.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.so 00:11:13.628 Installing symlink pointing to librte_fib.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.so.23 00:11:13.628 Installing symlink pointing to librte_fib.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.so 00:11:13.628 Installing symlink pointing to librte_port.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_port.so.23 00:11:13.628 Installing symlink pointing to librte_port.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_port.so 00:11:13.628 Installing symlink pointing to librte_pdump.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.so.23 00:11:13.628 Installing symlink pointing to librte_pdump.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.so 00:11:13.628 Installing symlink pointing to librte_table.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_table.so.23 00:11:13.628 Installing symlink pointing to librte_table.so.23 to 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_table.so 00:11:13.628 Installing symlink pointing to librte_pipeline.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.so.23 00:11:13.628 Installing symlink pointing to librte_pipeline.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.so 00:11:13.628 Installing symlink pointing to librte_graph.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.so.23 00:11:13.628 Installing symlink pointing to librte_graph.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.so 00:11:13.628 Installing symlink pointing to librte_node.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.so.23 00:11:13.628 Installing symlink pointing to librte_node.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.so 00:11:13.628 Installing symlink pointing to librte_bus_pci.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:11:13.628 Installing symlink pointing to librte_bus_pci.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:11:13.628 Installing symlink pointing to librte_bus_vdev.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:11:13.628 Installing symlink pointing to librte_bus_vdev.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:11:13.628 Installing symlink pointing to librte_mempool_ring.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:11:13.628 Installing symlink pointing to librte_mempool_ring.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:11:13.628 Installing symlink pointing to librte_net_i40e.so.23.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:11:13.628 Installing symlink pointing to librte_net_i40e.so.23 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:11:13.628 Running custom install script '/bin/sh /home/vagrant/spdk_repo/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:11:13.887 12:13:37 build_native_dpdk -- common/autobuild_common.sh@189 -- $ uname -s 00:11:13.887 12:13:37 build_native_dpdk -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:11:13.887 12:13:37 build_native_dpdk -- common/autobuild_common.sh@200 -- $ cat 00:11:13.887 12:13:37 build_native_dpdk -- common/autobuild_common.sh@205 -- $ cd /home/vagrant/spdk_repo/spdk 00:11:13.887 00:11:13.887 real 0m55.966s 00:11:13.887 user 5m8.807s 00:11:13.887 sys 1m15.593s 00:11:13.887 12:13:37 build_native_dpdk -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:11:13.887 12:13:37 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:11:13.887 ************************************ 00:11:13.887 END TEST build_native_dpdk 00:11:13.887 ************************************ 00:11:13.887 12:13:37 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:11:13.887 12:13:37 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:11:13.887 12:13:37 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:11:13.887 12:13:37 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:11:13.887 12:13:37 -- spdk/autobuild.sh@57 -- $ [[ 1 -eq 1 ]] 00:11:13.887 12:13:37 -- spdk/autobuild.sh@58 -- $ unittest_build 00:11:13.887 12:13:37 -- common/autobuild_common.sh@413 -- $ run_test unittest_build _unittest_build 00:11:13.887 12:13:37 -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:11:13.887 12:13:37 -- 
common/autotest_common.sh@1106 -- $ xtrace_disable 00:11:13.887 12:13:37 -- common/autotest_common.sh@10 -- $ set +x 00:11:13.887 ************************************ 00:11:13.887 START TEST unittest_build 00:11:13.887 ************************************ 00:11:13.887 12:13:37 unittest_build -- common/autotest_common.sh@1124 -- $ _unittest_build 00:11:13.888 12:13:37 unittest_build -- common/autobuild_common.sh@404 -- $ /home/vagrant/spdk_repo/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --enable-asan --enable-coverage --with-dpdk=/home/vagrant/spdk_repo/dpdk/build --without-shared 00:11:13.888 Using /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig for additional libs... 00:11:14.146 DPDK libraries: /home/vagrant/spdk_repo/dpdk/build/lib 00:11:14.146 DPDK includes: //home/vagrant/spdk_repo/dpdk/build/include 00:11:14.146 Using default SPDK env in /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:11:14.713 Using 'verbs' RDMA provider 00:11:33.731 Configuring ISA-L (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal.log)...done. 00:11:48.606 Configuring ISA-L-crypto (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal-crypto.log)...done. 00:11:48.606 Creating mk/config.mk...done. 00:11:48.606 Creating mk/cc.flags.mk...done. 00:11:48.606 Type 'make' to build. 00:11:48.606 12:14:11 unittest_build -- common/autobuild_common.sh@405 -- $ make -j10 00:11:48.606 make[1]: Nothing to be done for 'all'. 00:12:20.702 CC lib/log/log_flags.o 00:12:20.702 CC lib/log/log.o 00:12:20.702 CC lib/log/log_deprecated.o 00:12:20.702 CC lib/ut/ut.o 00:12:20.702 CC lib/ut_mock/mock.o 00:12:20.702 LIB libspdk_ut.a 00:12:20.702 LIB libspdk_log.a 00:12:20.702 LIB libspdk_ut_mock.a 00:12:20.702 CC lib/ioat/ioat.o 00:12:20.702 CXX lib/trace_parser/trace.o 00:12:20.702 CC lib/util/base64.o 00:12:20.702 CC lib/util/bit_array.o 00:12:20.702 CC lib/util/cpuset.o 00:12:20.702 CC lib/util/crc16.o 00:12:20.702 CC lib/util/crc32c.o 00:12:20.702 CC lib/util/crc32.o 00:12:20.702 CC lib/dma/dma.o 00:12:20.702 CC lib/vfio_user/host/vfio_user_pci.o 00:12:20.702 CC lib/vfio_user/host/vfio_user.o 00:12:20.702 CC lib/util/crc32_ieee.o 00:12:20.702 CC lib/util/crc64.o 00:12:20.702 CC lib/util/dif.o 00:12:20.702 CC lib/util/fd.o 00:12:20.702 LIB libspdk_dma.a 00:12:20.702 CC lib/util/file.o 00:12:20.702 LIB libspdk_ioat.a 00:12:20.702 CC lib/util/hexlify.o 00:12:20.702 CC lib/util/iov.o 00:12:20.702 CC lib/util/math.o 00:12:20.702 CC lib/util/pipe.o 00:12:20.702 CC lib/util/strerror_tls.o 00:12:20.702 CC lib/util/string.o 00:12:20.702 LIB libspdk_vfio_user.a 00:12:20.702 CC lib/util/uuid.o 00:12:20.702 CC lib/util/fd_group.o 00:12:20.702 CC lib/util/xor.o 00:12:20.702 CC lib/util/zipf.o 00:12:20.702 LIB libspdk_util.a 00:12:20.702 LIB libspdk_trace_parser.a 00:12:20.702 CC lib/conf/conf.o 00:12:20.702 CC lib/json/json_parse.o 00:12:20.702 CC lib/json/json_util.o 00:12:20.702 CC lib/json/json_write.o 00:12:20.702 CC lib/rdma/common.o 00:12:20.702 CC lib/rdma/rdma_verbs.o 00:12:20.702 CC lib/idxd/idxd.o 00:12:20.702 CC lib/idxd/idxd_user.o 00:12:20.702 CC lib/vmd/vmd.o 00:12:20.702 CC lib/env_dpdk/env.o 00:12:20.702 LIB libspdk_conf.a 00:12:20.702 CC lib/env_dpdk/memory.o 00:12:20.702 CC lib/env_dpdk/pci.o 00:12:20.702 CC lib/env_dpdk/init.o 00:12:20.702 CC lib/env_dpdk/threads.o 00:12:20.702 CC lib/env_dpdk/pci_ioat.o 00:12:20.702 LIB libspdk_rdma.a 00:12:20.702 LIB libspdk_json.a 00:12:20.702 CC lib/env_dpdk/pci_virtio.o 00:12:20.702 CC lib/env_dpdk/pci_vmd.o 00:12:20.702 LIB 
libspdk_idxd.a 00:12:20.702 CC lib/vmd/led.o 00:12:20.702 CC lib/env_dpdk/pci_idxd.o 00:12:20.702 CC lib/env_dpdk/pci_event.o 00:12:20.702 CC lib/env_dpdk/sigbus_handler.o 00:12:20.702 CC lib/env_dpdk/pci_dpdk.o 00:12:20.702 CC lib/env_dpdk/pci_dpdk_2207.o 00:12:20.702 CC lib/env_dpdk/pci_dpdk_2211.o 00:12:20.702 LIB libspdk_vmd.a 00:12:20.702 CC lib/jsonrpc/jsonrpc_client.o 00:12:20.702 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:12:20.702 CC lib/jsonrpc/jsonrpc_server.o 00:12:20.702 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:12:20.702 LIB libspdk_jsonrpc.a 00:12:20.702 LIB libspdk_env_dpdk.a 00:12:20.702 CC lib/rpc/rpc.o 00:12:20.960 LIB libspdk_rpc.a 00:12:21.221 CC lib/trace/trace_flags.o 00:12:21.221 CC lib/trace/trace.o 00:12:21.221 CC lib/trace/trace_rpc.o 00:12:21.221 CC lib/notify/notify.o 00:12:21.221 CC lib/notify/notify_rpc.o 00:12:21.221 CC lib/keyring/keyring.o 00:12:21.221 CC lib/keyring/keyring_rpc.o 00:12:21.479 LIB libspdk_notify.a 00:12:21.479 LIB libspdk_keyring.a 00:12:21.737 LIB libspdk_trace.a 00:12:21.995 CC lib/thread/iobuf.o 00:12:21.995 CC lib/thread/thread.o 00:12:21.995 CC lib/sock/sock_rpc.o 00:12:21.995 CC lib/sock/sock.o 00:12:22.562 LIB libspdk_sock.a 00:12:22.821 LIB libspdk_thread.a 00:12:22.821 CC lib/nvme/nvme_ctrlr_cmd.o 00:12:22.821 CC lib/nvme/nvme_ctrlr.o 00:12:22.821 CC lib/nvme/nvme_ns_cmd.o 00:12:22.821 CC lib/nvme/nvme_pcie_common.o 00:12:22.821 CC lib/nvme/nvme_fabric.o 00:12:22.821 CC lib/nvme/nvme_ns.o 00:12:22.821 CC lib/nvme/nvme_qpair.o 00:12:22.821 CC lib/nvme/nvme_pcie.o 00:12:22.821 CC lib/nvme/nvme.o 00:12:23.079 CC lib/accel/accel.o 00:12:23.646 CC lib/nvme/nvme_quirks.o 00:12:23.646 CC lib/accel/accel_rpc.o 00:12:23.646 CC lib/accel/accel_sw.o 00:12:23.646 CC lib/nvme/nvme_transport.o 00:12:23.646 CC lib/nvme/nvme_discovery.o 00:12:23.646 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:12:23.646 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:12:23.904 CC lib/nvme/nvme_tcp.o 00:12:23.904 CC lib/blob/request.o 00:12:23.904 CC lib/blob/blobstore.o 00:12:23.904 LIB libspdk_accel.a 00:12:23.904 CC lib/blob/zeroes.o 00:12:23.904 CC lib/blob/blob_bs_dev.o 00:12:24.162 CC lib/nvme/nvme_opal.o 00:12:24.162 CC lib/nvme/nvme_io_msg.o 00:12:24.162 CC lib/nvme/nvme_poll_group.o 00:12:24.162 CC lib/nvme/nvme_zns.o 00:12:24.162 CC lib/nvme/nvme_stubs.o 00:12:24.420 CC lib/nvme/nvme_auth.o 00:12:24.420 CC lib/init/json_config.o 00:12:24.420 CC lib/virtio/virtio.o 00:12:24.679 CC lib/virtio/virtio_vhost_user.o 00:12:24.679 CC lib/nvme/nvme_cuse.o 00:12:24.679 CC lib/init/subsystem.o 00:12:24.679 CC lib/init/subsystem_rpc.o 00:12:24.679 CC lib/init/rpc.o 00:12:24.679 CC lib/virtio/virtio_vfio_user.o 00:12:24.679 CC lib/virtio/virtio_pci.o 00:12:24.937 CC lib/nvme/nvme_rdma.o 00:12:24.937 LIB libspdk_init.a 00:12:24.937 LIB libspdk_virtio.a 00:12:25.196 CC lib/bdev/bdev_zone.o 00:12:25.196 CC lib/bdev/bdev.o 00:12:25.196 CC lib/bdev/bdev_rpc.o 00:12:25.196 CC lib/bdev/scsi_nvme.o 00:12:25.196 CC lib/bdev/part.o 00:12:25.196 CC lib/event/app.o 00:12:25.196 CC lib/event/reactor.o 00:12:25.196 CC lib/event/log_rpc.o 00:12:25.454 CC lib/event/app_rpc.o 00:12:25.454 CC lib/event/scheduler_static.o 00:12:25.713 LIB libspdk_blob.a 00:12:25.713 LIB libspdk_event.a 00:12:25.970 LIB libspdk_nvme.a 00:12:25.970 CC lib/lvol/lvol.o 00:12:25.970 CC lib/blobfs/blobfs.o 00:12:25.970 CC lib/blobfs/tree.o 00:12:26.534 LIB libspdk_blobfs.a 00:12:26.791 LIB libspdk_lvol.a 00:12:26.791 LIB libspdk_bdev.a 00:12:27.357 CC lib/nbd/nbd.o 00:12:27.357 CC lib/scsi/dev.o 00:12:27.357 CC lib/nbd/nbd_rpc.o 
00:12:27.357 CC lib/scsi/port.o 00:12:27.357 CC lib/nvmf/ctrlr.o 00:12:27.357 CC lib/scsi/lun.o 00:12:27.357 CC lib/scsi/scsi.o 00:12:27.357 CC lib/scsi/scsi_bdev.o 00:12:27.357 CC lib/nvmf/ctrlr_discovery.o 00:12:27.357 CC lib/ftl/ftl_core.o 00:12:27.357 CC lib/nvmf/ctrlr_bdev.o 00:12:27.357 CC lib/nvmf/subsystem.o 00:12:27.357 CC lib/nvmf/nvmf.o 00:12:27.615 CC lib/nvmf/nvmf_rpc.o 00:12:27.615 CC lib/nvmf/transport.o 00:12:27.615 LIB libspdk_nbd.a 00:12:27.615 CC lib/scsi/scsi_pr.o 00:12:27.615 CC lib/scsi/scsi_rpc.o 00:12:27.615 CC lib/ftl/ftl_init.o 00:12:27.615 CC lib/ftl/ftl_layout.o 00:12:27.872 CC lib/ftl/ftl_debug.o 00:12:27.872 CC lib/ftl/ftl_io.o 00:12:27.872 CC lib/scsi/task.o 00:12:27.872 CC lib/nvmf/tcp.o 00:12:28.131 CC lib/ftl/ftl_sb.o 00:12:28.131 CC lib/nvmf/stubs.o 00:12:28.131 CC lib/ftl/ftl_l2p.o 00:12:28.131 LIB libspdk_scsi.a 00:12:28.131 CC lib/ftl/ftl_l2p_flat.o 00:12:28.131 CC lib/ftl/ftl_nv_cache.o 00:12:28.131 CC lib/ftl/ftl_band.o 00:12:28.389 CC lib/ftl/ftl_band_ops.o 00:12:28.389 CC lib/nvmf/mdns_server.o 00:12:28.389 CC lib/nvmf/rdma.o 00:12:28.389 CC lib/ftl/ftl_writer.o 00:12:28.389 CC lib/nvmf/auth.o 00:12:28.647 CC lib/ftl/ftl_rq.o 00:12:28.647 CC lib/ftl/ftl_reloc.o 00:12:28.647 CC lib/ftl/ftl_l2p_cache.o 00:12:28.647 CC lib/vhost/vhost.o 00:12:28.647 CC lib/iscsi/conn.o 00:12:28.647 CC lib/vhost/vhost_rpc.o 00:12:28.905 CC lib/vhost/vhost_scsi.o 00:12:28.905 CC lib/vhost/vhost_blk.o 00:12:28.905 CC lib/vhost/rte_vhost_user.o 00:12:28.905 CC lib/ftl/ftl_p2l.o 00:12:29.162 CC lib/ftl/mngt/ftl_mngt.o 00:12:29.162 CC lib/iscsi/init_grp.o 00:12:29.162 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:12:29.162 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:12:29.162 CC lib/iscsi/iscsi.o 00:12:29.419 CC lib/iscsi/md5.o 00:12:29.419 CC lib/ftl/mngt/ftl_mngt_startup.o 00:12:29.419 CC lib/ftl/mngt/ftl_mngt_md.o 00:12:29.419 CC lib/iscsi/param.o 00:12:29.419 CC lib/iscsi/portal_grp.o 00:12:29.676 CC lib/ftl/mngt/ftl_mngt_misc.o 00:12:29.676 CC lib/iscsi/tgt_node.o 00:12:29.676 CC lib/iscsi/iscsi_subsystem.o 00:12:29.676 CC lib/iscsi/iscsi_rpc.o 00:12:29.676 LIB libspdk_vhost.a 00:12:29.676 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:12:29.676 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:12:29.676 CC lib/ftl/mngt/ftl_mngt_band.o 00:12:29.676 CC lib/iscsi/task.o 00:12:29.933 LIB libspdk_nvmf.a 00:12:29.933 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:12:29.933 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:12:29.933 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:12:29.933 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:12:29.933 CC lib/ftl/utils/ftl_conf.o 00:12:29.933 CC lib/ftl/utils/ftl_md.o 00:12:29.933 CC lib/ftl/utils/ftl_mempool.o 00:12:30.190 CC lib/ftl/utils/ftl_bitmap.o 00:12:30.190 CC lib/ftl/utils/ftl_property.o 00:12:30.190 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:12:30.190 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:12:30.190 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:12:30.190 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:12:30.190 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:12:30.451 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:12:30.451 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:12:30.451 LIB libspdk_iscsi.a 00:12:30.451 CC lib/ftl/upgrade/ftl_sb_v3.o 00:12:30.451 CC lib/ftl/upgrade/ftl_sb_v5.o 00:12:30.451 CC lib/ftl/nvc/ftl_nvc_dev.o 00:12:30.451 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:12:30.451 CC lib/ftl/base/ftl_base_dev.o 00:12:30.451 CC lib/ftl/base/ftl_base_bdev.o 00:12:30.451 CC lib/ftl/ftl_trace.o 00:12:30.711 LIB libspdk_ftl.a 00:12:31.644 CC module/env_dpdk/env_dpdk_rpc.o 00:12:31.644 CC module/keyring/file/keyring.o 00:12:31.644 
CC module/scheduler/dynamic/scheduler_dynamic.o 00:12:31.644 CC module/accel/ioat/accel_ioat.o 00:12:31.644 CC module/scheduler/gscheduler/gscheduler.o 00:12:31.644 CC module/accel/error/accel_error.o 00:12:31.644 CC module/blob/bdev/blob_bdev.o 00:12:31.644 CC module/sock/posix/posix.o 00:12:31.644 LIB libspdk_env_dpdk_rpc.a 00:12:31.644 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:12:31.644 CC module/keyring/linux/keyring.o 00:12:31.902 CC module/accel/ioat/accel_ioat_rpc.o 00:12:31.902 CC module/keyring/file/keyring_rpc.o 00:12:31.902 CC module/keyring/linux/keyring_rpc.o 00:12:31.902 LIB libspdk_scheduler_dynamic.a 00:12:31.902 LIB libspdk_scheduler_gscheduler.a 00:12:31.902 LIB libspdk_blob_bdev.a 00:12:31.902 LIB libspdk_scheduler_dpdk_governor.a 00:12:31.902 CC module/accel/error/accel_error_rpc.o 00:12:31.902 LIB libspdk_accel_ioat.a 00:12:32.160 LIB libspdk_keyring_linux.a 00:12:32.160 LIB libspdk_keyring_file.a 00:12:32.160 LIB libspdk_accel_error.a 00:12:32.160 CC module/accel/dsa/accel_dsa.o 00:12:32.160 CC module/accel/iaa/accel_iaa.o 00:12:32.418 CC module/bdev/gpt/gpt.o 00:12:32.418 CC module/bdev/lvol/vbdev_lvol.o 00:12:32.418 CC module/bdev/delay/vbdev_delay.o 00:12:32.418 CC module/bdev/error/vbdev_error.o 00:12:32.418 CC module/bdev/malloc/bdev_malloc.o 00:12:32.418 CC module/blobfs/bdev/blobfs_bdev.o 00:12:32.418 LIB libspdk_sock_posix.a 00:12:32.418 CC module/bdev/error/vbdev_error_rpc.o 00:12:32.418 CC module/accel/iaa/accel_iaa_rpc.o 00:12:32.418 CC module/accel/dsa/accel_dsa_rpc.o 00:12:32.418 CC module/bdev/null/bdev_null.o 00:12:32.676 CC module/bdev/gpt/vbdev_gpt.o 00:12:32.676 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:12:32.676 LIB libspdk_accel_dsa.a 00:12:32.676 LIB libspdk_accel_iaa.a 00:12:32.676 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:12:32.676 LIB libspdk_bdev_error.a 00:12:32.676 CC module/bdev/malloc/bdev_malloc_rpc.o 00:12:32.676 CC module/bdev/delay/vbdev_delay_rpc.o 00:12:32.934 CC module/bdev/null/bdev_null_rpc.o 00:12:32.934 LIB libspdk_blobfs_bdev.a 00:12:32.934 LIB libspdk_bdev_gpt.a 00:12:32.934 LIB libspdk_bdev_malloc.a 00:12:32.934 LIB libspdk_bdev_delay.a 00:12:32.934 CC module/bdev/passthru/vbdev_passthru.o 00:12:32.934 CC module/bdev/raid/bdev_raid.o 00:12:32.934 CC module/bdev/nvme/bdev_nvme.o 00:12:32.934 LIB libspdk_bdev_null.a 00:12:32.934 CC module/bdev/nvme/bdev_nvme_rpc.o 00:12:33.192 CC module/bdev/nvme/nvme_rpc.o 00:12:33.192 CC module/bdev/split/vbdev_split.o 00:12:33.192 LIB libspdk_bdev_lvol.a 00:12:33.192 CC module/bdev/zone_block/vbdev_zone_block.o 00:12:33.192 CC module/bdev/split/vbdev_split_rpc.o 00:12:33.192 CC module/bdev/aio/bdev_aio.o 00:12:33.192 CC module/bdev/ftl/bdev_ftl.o 00:12:33.450 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:12:33.450 CC module/bdev/raid/bdev_raid_rpc.o 00:12:33.450 CC module/bdev/ftl/bdev_ftl_rpc.o 00:12:33.450 LIB libspdk_bdev_split.a 00:12:33.450 CC module/bdev/nvme/bdev_mdns_client.o 00:12:33.450 LIB libspdk_bdev_passthru.a 00:12:33.450 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:12:33.707 CC module/bdev/aio/bdev_aio_rpc.o 00:12:33.707 CC module/bdev/raid/bdev_raid_sb.o 00:12:33.707 LIB libspdk_bdev_ftl.a 00:12:33.707 CC module/bdev/raid/raid0.o 00:12:33.707 CC module/bdev/raid/raid1.o 00:12:33.707 CC module/bdev/nvme/vbdev_opal.o 00:12:33.707 CC module/bdev/raid/concat.o 00:12:33.707 LIB libspdk_bdev_zone_block.a 00:12:33.707 LIB libspdk_bdev_aio.a 00:12:33.707 CC module/bdev/nvme/vbdev_opal_rpc.o 00:12:33.707 CC module/bdev/iscsi/bdev_iscsi.o 00:12:33.965 CC 
module/bdev/iscsi/bdev_iscsi_rpc.o 00:12:33.965 CC module/bdev/virtio/bdev_virtio_scsi.o 00:12:33.965 CC module/bdev/virtio/bdev_virtio_blk.o 00:12:33.965 CC module/bdev/virtio/bdev_virtio_rpc.o 00:12:33.965 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:12:33.965 LIB libspdk_bdev_raid.a 00:12:34.223 LIB libspdk_bdev_iscsi.a 00:12:34.223 LIB libspdk_bdev_virtio.a 00:12:34.481 LIB libspdk_bdev_nvme.a 00:12:35.048 CC module/event/subsystems/keyring/keyring.o 00:12:35.048 CC module/event/subsystems/vmd/vmd.o 00:12:35.048 CC module/event/subsystems/vmd/vmd_rpc.o 00:12:35.048 CC module/event/subsystems/scheduler/scheduler.o 00:12:35.048 CC module/event/subsystems/sock/sock.o 00:12:35.048 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:12:35.048 CC module/event/subsystems/iobuf/iobuf.o 00:12:35.048 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:12:35.305 LIB libspdk_event_vhost_blk.a 00:12:35.306 LIB libspdk_event_scheduler.a 00:12:35.306 LIB libspdk_event_iobuf.a 00:12:35.306 LIB libspdk_event_keyring.a 00:12:35.306 LIB libspdk_event_vmd.a 00:12:35.306 LIB libspdk_event_sock.a 00:12:35.873 CC module/event/subsystems/accel/accel.o 00:12:35.873 LIB libspdk_event_accel.a 00:12:36.132 CC module/event/subsystems/bdev/bdev.o 00:12:36.391 LIB libspdk_event_bdev.a 00:12:36.958 CC module/event/subsystems/nbd/nbd.o 00:12:36.958 CC module/event/subsystems/scsi/scsi.o 00:12:36.958 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:12:36.958 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:12:36.958 LIB libspdk_event_nbd.a 00:12:36.958 LIB libspdk_event_scsi.a 00:12:36.958 LIB libspdk_event_nvmf.a 00:12:37.523 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:12:37.523 CC module/event/subsystems/iscsi/iscsi.o 00:12:37.523 LIB libspdk_event_vhost_scsi.a 00:12:37.781 LIB libspdk_event_iscsi.a 00:12:38.039 CXX app/trace/trace.o 00:12:38.297 CC examples/sock/hello_world/hello_sock.o 00:12:38.297 CC examples/ioat/perf/perf.o 00:12:38.297 CC examples/accel/perf/accel_perf.o 00:12:38.297 CC examples/nvme/hello_world/hello_world.o 00:12:38.297 CC examples/vmd/lsvmd/lsvmd.o 00:12:38.298 CC examples/blob/hello_world/hello_blob.o 00:12:38.298 CC test/accel/dif/dif.o 00:12:38.298 CC examples/bdev/hello_world/hello_bdev.o 00:12:38.298 CC examples/nvmf/nvmf/nvmf.o 00:12:38.556 LINK lsvmd 00:12:38.556 LINK ioat_perf 00:12:38.556 LINK hello_sock 00:12:38.556 LINK hello_world 00:12:38.556 LINK spdk_trace 00:12:38.556 LINK hello_blob 00:12:38.556 LINK hello_bdev 00:12:38.556 LINK nvmf 00:12:38.556 LINK accel_perf 00:12:38.556 LINK dif 00:12:39.217 CC app/trace_record/trace_record.o 00:12:39.217 CC examples/ioat/verify/verify.o 00:12:39.476 LINK spdk_trace_record 00:12:39.476 LINK verify 00:12:39.734 CC app/nvmf_tgt/nvmf_main.o 00:12:39.735 LINK nvmf_tgt 00:12:39.993 CC examples/bdev/bdevperf/bdevperf.o 00:12:39.993 CC examples/vmd/led/led.o 00:12:39.993 CC examples/nvme/reconnect/reconnect.o 00:12:40.253 LINK led 00:12:40.253 CC test/app/bdev_svc/bdev_svc.o 00:12:40.253 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:12:40.511 LINK reconnect 00:12:40.511 LINK bdev_svc 00:12:40.511 LINK bdevperf 00:12:40.769 LINK nvme_fuzz 00:12:42.142 CC examples/blob/cli/blobcli.o 00:12:42.142 CC examples/nvme/nvme_manage/nvme_manage.o 00:12:42.142 CC test/app/histogram_perf/histogram_perf.o 00:12:42.400 LINK blobcli 00:12:42.400 LINK nvme_manage 00:12:42.400 LINK histogram_perf 00:12:42.400 CC test/bdev/bdevio/bdevio.o 00:12:42.400 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:12:42.400 CC test/app/jsoncat/jsoncat.o 00:12:42.657 LINK jsoncat 
00:12:42.915 LINK bdevio 00:12:42.915 CC test/app/stub/stub.o 00:12:43.172 LINK stub 00:12:43.428 CC examples/util/zipf/zipf.o 00:12:43.684 LINK zipf 00:12:43.684 CC examples/thread/thread/thread_ex.o 00:12:43.940 CC app/iscsi_tgt/iscsi_tgt.o 00:12:43.940 CC examples/nvme/arbitration/arbitration.o 00:12:43.940 LINK thread 00:12:44.196 LINK iscsi_fuzz 00:12:44.196 LINK iscsi_tgt 00:12:44.196 LINK arbitration 00:12:44.196 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:12:44.452 CC examples/nvme/hotplug/hotplug.o 00:12:44.452 CC examples/idxd/perf/perf.o 00:12:44.452 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:12:44.709 LINK hotplug 00:12:44.709 LINK idxd_perf 00:12:44.967 CC test/blobfs/mkfs/mkfs.o 00:12:44.967 LINK vhost_fuzz 00:12:45.225 LINK mkfs 00:12:45.483 TEST_HEADER include/spdk/accel.h 00:12:45.741 TEST_HEADER include/spdk/accel_module.h 00:12:45.741 TEST_HEADER include/spdk/assert.h 00:12:45.741 TEST_HEADER include/spdk/barrier.h 00:12:45.741 TEST_HEADER include/spdk/base64.h 00:12:45.741 TEST_HEADER include/spdk/bdev.h 00:12:45.741 TEST_HEADER include/spdk/bdev_module.h 00:12:45.741 TEST_HEADER include/spdk/bdev_zone.h 00:12:45.741 TEST_HEADER include/spdk/bit_array.h 00:12:45.741 TEST_HEADER include/spdk/bit_pool.h 00:12:45.741 TEST_HEADER include/spdk/blob.h 00:12:45.741 TEST_HEADER include/spdk/blob_bdev.h 00:12:45.741 TEST_HEADER include/spdk/blobfs.h 00:12:45.741 TEST_HEADER include/spdk/blobfs_bdev.h 00:12:45.741 TEST_HEADER include/spdk/conf.h 00:12:45.741 TEST_HEADER include/spdk/config.h 00:12:45.741 TEST_HEADER include/spdk/cpuset.h 00:12:45.741 TEST_HEADER include/spdk/crc16.h 00:12:45.741 TEST_HEADER include/spdk/crc32.h 00:12:45.741 TEST_HEADER include/spdk/crc64.h 00:12:45.741 TEST_HEADER include/spdk/dif.h 00:12:45.741 TEST_HEADER include/spdk/dma.h 00:12:45.741 TEST_HEADER include/spdk/endian.h 00:12:45.741 TEST_HEADER include/spdk/env.h 00:12:45.741 TEST_HEADER include/spdk/env_dpdk.h 00:12:45.741 TEST_HEADER include/spdk/event.h 00:12:45.741 TEST_HEADER include/spdk/fd.h 00:12:45.741 TEST_HEADER include/spdk/fd_group.h 00:12:45.741 TEST_HEADER include/spdk/file.h 00:12:45.741 TEST_HEADER include/spdk/ftl.h 00:12:45.741 TEST_HEADER include/spdk/gpt_spec.h 00:12:45.741 TEST_HEADER include/spdk/hexlify.h 00:12:45.741 TEST_HEADER include/spdk/histogram_data.h 00:12:45.741 TEST_HEADER include/spdk/idxd.h 00:12:45.741 TEST_HEADER include/spdk/idxd_spec.h 00:12:45.741 TEST_HEADER include/spdk/init.h 00:12:45.741 TEST_HEADER include/spdk/ioat.h 00:12:45.741 TEST_HEADER include/spdk/ioat_spec.h 00:12:45.741 TEST_HEADER include/spdk/iscsi_spec.h 00:12:45.741 TEST_HEADER include/spdk/json.h 00:12:45.741 TEST_HEADER include/spdk/jsonrpc.h 00:12:45.741 TEST_HEADER include/spdk/keyring.h 00:12:45.741 TEST_HEADER include/spdk/keyring_module.h 00:12:45.741 TEST_HEADER include/spdk/likely.h 00:12:45.741 TEST_HEADER include/spdk/log.h 00:12:45.741 TEST_HEADER include/spdk/lvol.h 00:12:45.741 TEST_HEADER include/spdk/memory.h 00:12:45.741 TEST_HEADER include/spdk/mmio.h 00:12:45.741 TEST_HEADER include/spdk/nbd.h 00:12:45.741 TEST_HEADER include/spdk/notify.h 00:12:45.741 TEST_HEADER include/spdk/nvme.h 00:12:45.741 TEST_HEADER include/spdk/nvme_intel.h 00:12:45.741 TEST_HEADER include/spdk/nvme_ocssd.h 00:12:45.741 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:12:45.741 TEST_HEADER include/spdk/nvme_spec.h 00:12:45.741 TEST_HEADER include/spdk/nvme_zns.h 00:12:45.741 TEST_HEADER include/spdk/nvmf.h 00:12:45.741 TEST_HEADER include/spdk/nvmf_cmd.h 00:12:45.741 TEST_HEADER 
include/spdk/nvmf_fc_spec.h 00:12:45.741 TEST_HEADER include/spdk/nvmf_spec.h 00:12:45.741 TEST_HEADER include/spdk/nvmf_transport.h 00:12:45.741 TEST_HEADER include/spdk/opal.h 00:12:45.741 TEST_HEADER include/spdk/opal_spec.h 00:12:45.741 TEST_HEADER include/spdk/pci_ids.h 00:12:45.741 TEST_HEADER include/spdk/pipe.h 00:12:45.741 TEST_HEADER include/spdk/queue.h 00:12:45.741 TEST_HEADER include/spdk/reduce.h 00:12:45.741 TEST_HEADER include/spdk/rpc.h 00:12:45.741 TEST_HEADER include/spdk/scheduler.h 00:12:45.741 TEST_HEADER include/spdk/scsi.h 00:12:45.741 TEST_HEADER include/spdk/scsi_spec.h 00:12:45.741 TEST_HEADER include/spdk/sock.h 00:12:45.741 TEST_HEADER include/spdk/stdinc.h 00:12:45.741 TEST_HEADER include/spdk/string.h 00:12:45.741 TEST_HEADER include/spdk/thread.h 00:12:45.741 CC examples/interrupt_tgt/interrupt_tgt.o 00:12:45.741 TEST_HEADER include/spdk/trace.h 00:12:45.741 TEST_HEADER include/spdk/trace_parser.h 00:12:45.741 TEST_HEADER include/spdk/tree.h 00:12:45.999 TEST_HEADER include/spdk/ublk.h 00:12:45.999 TEST_HEADER include/spdk/util.h 00:12:45.999 TEST_HEADER include/spdk/uuid.h 00:12:45.999 TEST_HEADER include/spdk/version.h 00:12:45.999 TEST_HEADER include/spdk/vfio_user_pci.h 00:12:45.999 TEST_HEADER include/spdk/vfio_user_spec.h 00:12:45.999 TEST_HEADER include/spdk/vhost.h 00:12:45.999 TEST_HEADER include/spdk/vmd.h 00:12:45.999 TEST_HEADER include/spdk/xor.h 00:12:45.999 TEST_HEADER include/spdk/zipf.h 00:12:45.999 CXX test/cpp_headers/accel.o 00:12:45.999 CC test/dma/test_dma/test_dma.o 00:12:46.000 CC examples/nvme/cmb_copy/cmb_copy.o 00:12:46.000 LINK interrupt_tgt 00:12:46.000 CXX test/cpp_headers/accel_module.o 00:12:46.257 CC examples/nvme/abort/abort.o 00:12:46.257 LINK cmb_copy 00:12:46.257 LINK test_dma 00:12:46.257 CXX test/cpp_headers/assert.o 00:12:46.514 CC app/spdk_tgt/spdk_tgt.o 00:12:46.514 LINK abort 00:12:46.514 CXX test/cpp_headers/barrier.o 00:12:46.514 CC app/spdk_lspci/spdk_lspci.o 00:12:46.771 LINK spdk_tgt 00:12:46.771 LINK spdk_lspci 00:12:46.771 CXX test/cpp_headers/base64.o 00:12:47.028 CXX test/cpp_headers/bdev.o 00:12:47.286 CXX test/cpp_headers/bdev_module.o 00:12:47.543 CXX test/cpp_headers/bdev_zone.o 00:12:47.543 CXX test/cpp_headers/bit_array.o 00:12:47.800 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:12:47.800 CXX test/cpp_headers/bit_pool.o 00:12:48.058 CXX test/cpp_headers/blob.o 00:12:48.058 CXX test/cpp_headers/blob_bdev.o 00:12:48.058 LINK pmr_persistence 00:12:48.058 CXX test/cpp_headers/blobfs.o 00:12:48.058 CXX test/cpp_headers/blobfs_bdev.o 00:12:48.315 CXX test/cpp_headers/conf.o 00:12:48.315 CC app/spdk_nvme_perf/perf.o 00:12:48.315 CC app/spdk_nvme_identify/identify.o 00:12:48.315 CC app/spdk_nvme_discover/discovery_aer.o 00:12:48.315 CXX test/cpp_headers/config.o 00:12:48.315 CXX test/cpp_headers/cpuset.o 00:12:48.572 CC app/spdk_top/spdk_top.o 00:12:48.572 CC app/vhost/vhost.o 00:12:48.572 LINK spdk_nvme_discover 00:12:48.572 CXX test/cpp_headers/crc16.o 00:12:48.842 LINK vhost 00:12:48.842 LINK spdk_nvme_perf 00:12:48.842 CXX test/cpp_headers/crc32.o 00:12:49.135 LINK spdk_nvme_identify 00:12:49.135 CC app/spdk_dd/spdk_dd.o 00:12:49.135 CXX test/cpp_headers/crc64.o 00:12:49.135 LINK spdk_top 00:12:49.392 LINK spdk_dd 00:12:49.392 CXX test/cpp_headers/dif.o 00:12:49.647 CXX test/cpp_headers/dma.o 00:12:49.647 CC app/fio/nvme/fio_plugin.o 00:12:49.904 CXX test/cpp_headers/endian.o 00:12:50.160 CC app/fio/bdev/fio_plugin.o 00:12:50.160 CC test/env/vtophys/vtophys.o 00:12:50.160 LINK spdk_nvme 
00:12:50.160 CXX test/cpp_headers/env.o 00:12:50.160 CC test/env/mem_callbacks/mem_callbacks.o 00:12:50.417 LINK vtophys 00:12:50.417 CXX test/cpp_headers/env_dpdk.o 00:12:50.417 CXX test/cpp_headers/event.o 00:12:50.417 LINK mem_callbacks 00:12:50.674 LINK spdk_bdev 00:12:50.674 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:12:50.674 CXX test/cpp_headers/fd.o 00:12:50.674 CC test/env/memory/memory_ut.o 00:12:50.674 LINK env_dpdk_post_init 00:12:50.674 CXX test/cpp_headers/fd_group.o 00:12:50.933 CXX test/cpp_headers/file.o 00:12:50.933 CXX test/cpp_headers/ftl.o 00:12:50.933 CC test/env/pci/pci_ut.o 00:12:51.192 CXX test/cpp_headers/gpt_spec.o 00:12:51.192 CXX test/cpp_headers/hexlify.o 00:12:51.192 LINK memory_ut 00:12:51.192 CXX test/cpp_headers/histogram_data.o 00:12:51.450 LINK pci_ut 00:12:51.450 CXX test/cpp_headers/idxd.o 00:12:51.451 CC test/event/event_perf/event_perf.o 00:12:51.451 CXX test/cpp_headers/idxd_spec.o 00:12:51.451 CXX test/cpp_headers/init.o 00:12:51.709 LINK event_perf 00:12:51.709 CC test/lvol/esnap/esnap.o 00:12:51.709 CXX test/cpp_headers/ioat.o 00:12:51.709 CC test/rpc_client/rpc_client_test.o 00:12:51.967 CC test/nvme/aer/aer.o 00:12:51.967 CXX test/cpp_headers/ioat_spec.o 00:12:51.967 LINK rpc_client_test 00:12:51.967 CC test/nvme/reset/reset.o 00:12:51.967 CC test/nvme/sgl/sgl.o 00:12:51.967 CXX test/cpp_headers/iscsi_spec.o 00:12:52.225 LINK aer 00:12:52.225 LINK sgl 00:12:52.225 LINK reset 00:12:52.225 CXX test/cpp_headers/json.o 00:12:52.483 CXX test/cpp_headers/jsonrpc.o 00:12:52.483 CC test/event/reactor/reactor.o 00:12:52.483 CXX test/cpp_headers/keyring.o 00:12:52.742 LINK reactor 00:12:52.742 CXX test/cpp_headers/keyring_module.o 00:12:52.742 CC test/event/reactor_perf/reactor_perf.o 00:12:53.000 LINK reactor_perf 00:12:53.000 CXX test/cpp_headers/likely.o 00:12:53.000 CXX test/cpp_headers/log.o 00:12:53.258 CXX test/cpp_headers/lvol.o 00:12:53.258 CC test/thread/poller_perf/poller_perf.o 00:12:53.516 CXX test/cpp_headers/memory.o 00:12:53.516 CXX test/cpp_headers/mmio.o 00:12:53.516 LINK poller_perf 00:12:53.516 CC test/nvme/e2edp/nvme_dp.o 00:12:53.773 CXX test/cpp_headers/nbd.o 00:12:53.773 CXX test/cpp_headers/notify.o 00:12:53.773 CC test/unit/include/spdk/histogram_data.h/histogram_ut.o 00:12:53.773 CC test/nvme/overhead/overhead.o 00:12:53.773 CC test/nvme/err_injection/err_injection.o 00:12:53.773 CXX test/cpp_headers/nvme.o 00:12:53.773 CC test/nvme/startup/startup.o 00:12:53.773 CC test/event/app_repeat/app_repeat.o 00:12:54.030 LINK nvme_dp 00:12:54.030 LINK histogram_ut 00:12:54.030 LINK err_injection 00:12:54.030 CXX test/cpp_headers/nvme_intel.o 00:12:54.030 LINK startup 00:12:54.030 LINK overhead 00:12:54.030 LINK app_repeat 00:12:54.030 CC test/nvme/reserve/reserve.o 00:12:54.297 CXX test/cpp_headers/nvme_ocssd.o 00:12:54.297 LINK reserve 00:12:54.555 CXX test/cpp_headers/nvme_ocssd_spec.o 00:12:54.555 CC test/thread/lock/spdk_lock.o 00:12:54.555 CC test/unit/lib/accel/accel.c/accel_ut.o 00:12:54.555 CXX test/cpp_headers/nvme_spec.o 00:12:54.555 LINK esnap 00:12:54.813 CXX test/cpp_headers/nvme_zns.o 00:12:55.069 CXX test/cpp_headers/nvmf.o 00:12:55.327 CXX test/cpp_headers/nvmf_cmd.o 00:12:55.327 CXX test/cpp_headers/nvmf_fc_spec.o 00:12:55.585 LINK spdk_lock 00:12:55.585 CC test/event/scheduler/scheduler.o 00:12:55.585 CXX test/cpp_headers/nvmf_spec.o 00:12:55.843 CC test/unit/lib/bdev/bdev.c/bdev_ut.o 00:12:55.843 LINK scheduler 00:12:55.843 CC test/unit/lib/blobfs/tree.c/tree_ut.o 00:12:55.843 CXX 
test/cpp_headers/nvmf_transport.o 00:12:55.843 CC test/unit/lib/blob/blob_bdev.c/blob_bdev_ut.o 00:12:55.843 CC test/unit/lib/event/app.c/app_ut.o 00:12:55.843 CC test/unit/lib/dma/dma.c/dma_ut.o 00:12:56.101 LINK tree_ut 00:12:56.101 CXX test/cpp_headers/opal.o 00:12:56.101 CC test/nvme/simple_copy/simple_copy.o 00:12:56.359 CXX test/cpp_headers/opal_spec.o 00:12:56.359 LINK simple_copy 00:12:56.359 LINK blob_bdev_ut 00:12:56.359 CC test/unit/lib/blobfs/blobfs_async_ut/blobfs_async_ut.o 00:12:56.359 LINK accel_ut 00:12:56.359 CC test/unit/lib/blobfs/blobfs_sync_ut/blobfs_sync_ut.o 00:12:56.359 LINK app_ut 00:12:56.359 CXX test/cpp_headers/pci_ids.o 00:12:56.617 LINK dma_ut 00:12:56.617 CXX test/cpp_headers/pipe.o 00:12:56.874 CXX test/cpp_headers/queue.o 00:12:56.874 CC test/unit/lib/blob/blob.c/blob_ut.o 00:12:56.874 CXX test/cpp_headers/reduce.o 00:12:56.874 CC test/unit/lib/blobfs/blobfs_bdev.c/blobfs_bdev_ut.o 00:12:56.874 CC test/unit/lib/event/reactor.c/reactor_ut.o 00:12:57.133 CXX test/cpp_headers/rpc.o 00:12:57.133 CC test/unit/lib/ioat/ioat.c/ioat_ut.o 00:12:57.133 LINK blobfs_bdev_ut 00:12:57.391 CXX test/cpp_headers/scheduler.o 00:12:57.391 LINK blobfs_async_ut 00:12:57.391 LINK blobfs_sync_ut 00:12:57.391 CXX test/cpp_headers/scsi.o 00:12:57.649 LINK ioat_ut 00:12:57.649 CXX test/cpp_headers/scsi_spec.o 00:12:57.649 LINK reactor_ut 00:12:57.649 CXX test/cpp_headers/sock.o 00:12:57.649 CC test/unit/lib/iscsi/conn.c/conn_ut.o 00:12:57.907 CXX test/cpp_headers/stdinc.o 00:12:57.907 CC test/unit/lib/iscsi/init_grp.c/init_grp_ut.o 00:12:57.907 CC test/nvme/connect_stress/connect_stress.o 00:12:58.165 CC test/unit/lib/iscsi/iscsi.c/iscsi_ut.o 00:12:58.165 CXX test/cpp_headers/string.o 00:12:58.165 CC test/unit/lib/iscsi/param.c/param_ut.o 00:12:58.165 CXX test/cpp_headers/thread.o 00:12:58.165 LINK connect_stress 00:12:58.422 LINK init_grp_ut 00:12:58.422 CXX test/cpp_headers/trace.o 00:12:58.422 LINK param_ut 00:12:58.680 CC test/unit/lib/json/json_parse.c/json_parse_ut.o 00:12:58.680 CXX test/cpp_headers/trace_parser.o 00:12:58.938 CXX test/cpp_headers/tree.o 00:12:58.938 LINK conn_ut 00:12:58.938 CC test/unit/lib/jsonrpc/jsonrpc_server.c/jsonrpc_server_ut.o 00:12:58.938 CXX test/cpp_headers/ublk.o 00:12:59.195 CC test/unit/lib/log/log.c/log_ut.o 00:12:59.195 CXX test/cpp_headers/util.o 00:12:59.195 LINK bdev_ut 00:12:59.195 LINK jsonrpc_server_ut 00:12:59.453 CXX test/cpp_headers/uuid.o 00:12:59.453 LINK log_ut 00:12:59.453 CXX test/cpp_headers/version.o 00:12:59.710 CXX test/cpp_headers/vfio_user_pci.o 00:12:59.710 CXX test/cpp_headers/vfio_user_spec.o 00:12:59.710 CC test/unit/lib/iscsi/portal_grp.c/portal_grp_ut.o 00:12:59.710 CXX test/cpp_headers/vhost.o 00:12:59.710 CXX test/cpp_headers/vmd.o 00:12:59.710 CC test/nvme/boot_partition/boot_partition.o 00:12:59.710 CC test/unit/lib/bdev/part.c/part_ut.o 00:12:59.710 LINK iscsi_ut 00:12:59.967 LINK json_parse_ut 00:12:59.967 CC test/unit/lib/iscsi/tgt_node.c/tgt_node_ut.o 00:12:59.967 CXX test/cpp_headers/xor.o 00:12:59.967 LINK boot_partition 00:12:59.967 CC test/unit/lib/lvol/lvol.c/lvol_ut.o 00:13:00.224 CXX test/cpp_headers/zipf.o 00:13:00.224 CC test/unit/lib/notify/notify.c/notify_ut.o 00:13:00.224 CC test/unit/lib/json/json_util.c/json_util_ut.o 00:13:00.482 CC test/unit/lib/json/json_write.c/json_write_ut.o 00:13:00.482 LINK portal_grp_ut 00:13:00.482 CC test/unit/lib/nvme/nvme.c/nvme_ut.o 00:13:00.739 LINK notify_ut 00:13:00.739 LINK tgt_node_ut 00:13:00.997 LINK json_util_ut 00:13:00.997 CC 
test/unit/lib/nvmf/tcp.c/tcp_ut.o 00:13:01.254 LINK json_write_ut 00:13:01.254 CC test/unit/lib/nvme/nvme_ctrlr.c/nvme_ctrlr_ut.o 00:13:01.254 CC test/unit/lib/nvme/nvme_ctrlr_ocssd_cmd.c/nvme_ctrlr_ocssd_cmd_ut.o 00:13:01.254 CC test/unit/lib/nvme/nvme_ctrlr_cmd.c/nvme_ctrlr_cmd_ut.o 00:13:01.512 LINK blob_ut 00:13:01.512 LINK lvol_ut 00:13:01.512 CC test/unit/lib/nvme/nvme_ns.c/nvme_ns_ut.o 00:13:01.512 CC test/nvme/compliance/nvme_compliance.o 00:13:01.770 LINK nvme_ut 00:13:01.770 LINK nvme_compliance 00:13:02.028 CC test/unit/lib/nvmf/subsystem.c/subsystem_ut.o 00:13:02.028 CC test/unit/lib/nvmf/ctrlr.c/ctrlr_ut.o 00:13:02.028 LINK part_ut 00:13:02.028 LINK nvme_ctrlr_ocssd_cmd_ut 00:13:02.028 CC test/unit/lib/nvmf/ctrlr_discovery.c/ctrlr_discovery_ut.o 00:13:02.287 LINK nvme_ctrlr_cmd_ut 00:13:02.287 LINK nvme_ns_ut 00:13:02.546 CC test/unit/lib/bdev/scsi_nvme.c/scsi_nvme_ut.o 00:13:02.546 CC test/unit/lib/scsi/dev.c/dev_ut.o 00:13:02.824 CC test/unit/lib/scsi/lun.c/lun_ut.o 00:13:02.824 CC test/unit/lib/nvmf/ctrlr_bdev.c/ctrlr_bdev_ut.o 00:13:02.824 LINK scsi_nvme_ut 00:13:02.824 LINK dev_ut 00:13:03.392 CC test/unit/lib/bdev/gpt/gpt.c/gpt_ut.o 00:13:03.392 CC test/unit/lib/bdev/vbdev_lvol.c/vbdev_lvol_ut.o 00:13:03.392 LINK nvme_ctrlr_ut 00:13:03.392 LINK lun_ut 00:13:03.392 LINK ctrlr_discovery_ut 00:13:03.392 CC test/nvme/fused_ordering/fused_ordering.o 00:13:03.392 LINK subsystem_ut 00:13:03.392 LINK ctrlr_bdev_ut 00:13:03.650 LINK tcp_ut 00:13:03.650 LINK gpt_ut 00:13:03.650 LINK fused_ordering 00:13:03.909 CC test/unit/lib/nvme/nvme_ns_cmd.c/nvme_ns_cmd_ut.o 00:13:03.909 CC test/unit/lib/scsi/scsi.c/scsi_ut.o 00:13:03.909 CC test/unit/lib/scsi/scsi_bdev.c/scsi_bdev_ut.o 00:13:04.167 CC test/unit/lib/scsi/scsi_pr.c/scsi_pr_ut.o 00:13:04.167 LINK ctrlr_ut 00:13:04.167 CC test/unit/lib/sock/sock.c/sock_ut.o 00:13:04.167 CC test/unit/lib/thread/thread.c/thread_ut.o 00:13:04.425 CC test/unit/lib/thread/iobuf.c/iobuf_ut.o 00:13:04.425 LINK scsi_ut 00:13:04.425 LINK vbdev_lvol_ut 00:13:04.425 LINK scsi_pr_ut 00:13:04.683 CC test/unit/lib/nvmf/nvmf.c/nvmf_ut.o 00:13:04.683 LINK scsi_bdev_ut 00:13:04.941 CC test/unit/lib/bdev/mt/bdev.c/bdev_ut.o 00:13:04.941 CC test/nvme/doorbell_aers/doorbell_aers.o 00:13:04.941 CC test/unit/lib/nvmf/auth.c/auth_ut.o 00:13:04.941 LINK iobuf_ut 00:13:04.941 LINK doorbell_aers 00:13:05.200 CC test/unit/lib/nvmf/rdma.c/rdma_ut.o 00:13:05.200 CC test/unit/lib/nvmf/transport.c/transport_ut.o 00:13:05.458 CC test/unit/lib/sock/posix.c/posix_ut.o 00:13:05.458 LINK nvme_ns_cmd_ut 00:13:05.458 LINK sock_ut 00:13:05.458 LINK thread_ut 00:13:05.719 LINK nvmf_ut 00:13:05.977 CC test/unit/lib/nvme/nvme_ns_ocssd_cmd.c/nvme_ns_ocssd_cmd_ut.o 00:13:05.977 CC test/unit/lib/nvme/nvme_poll_group.c/nvme_poll_group_ut.o 00:13:05.977 CC test/unit/lib/nvme/nvme_pcie.c/nvme_pcie_ut.o 00:13:05.977 CC test/unit/lib/nvme/nvme_qpair.c/nvme_qpair_ut.o 00:13:06.235 LINK auth_ut 00:13:06.235 LINK posix_ut 00:13:06.801 CC test/nvme/fdp/fdp.o 00:13:06.801 CC test/unit/lib/nvme/nvme_quirks.c/nvme_quirks_ut.o 00:13:06.801 CC test/unit/lib/nvme/nvme_tcp.c/nvme_tcp_ut.o 00:13:06.801 LINK nvme_poll_group_ut 00:13:07.059 LINK fdp 00:13:07.059 LINK transport_ut 00:13:07.059 LINK nvme_qpair_ut 00:13:07.059 LINK nvme_ns_ocssd_cmd_ut 00:13:07.317 CC test/nvme/cuse/cuse.o 00:13:07.317 LINK rdma_ut 00:13:07.317 LINK nvme_quirks_ut 00:13:07.317 LINK bdev_ut 00:13:07.575 CC test/unit/lib/nvme/nvme_transport.c/nvme_transport_ut.o 00:13:07.575 LINK nvme_pcie_ut 00:13:07.575 CC 
test/unit/lib/nvme/nvme_io_msg.c/nvme_io_msg_ut.o 00:13:07.575 CC test/unit/lib/bdev/raid/bdev_raid.c/bdev_raid_ut.o 00:13:07.833 CC test/unit/lib/bdev/raid/bdev_raid_sb.c/bdev_raid_sb_ut.o 00:13:07.833 CC test/unit/lib/bdev/raid/concat.c/concat_ut.o 00:13:07.833 CC test/unit/lib/bdev/raid/raid1.c/raid1_ut.o 00:13:08.091 CC test/unit/lib/bdev/raid/raid0.c/raid0_ut.o 00:13:08.091 LINK cuse 00:13:08.091 LINK bdev_raid_sb_ut 00:13:08.349 LINK nvme_transport_ut 00:13:08.349 LINK concat_ut 00:13:08.349 LINK nvme_io_msg_ut 00:13:08.349 LINK raid1_ut 00:13:08.607 CC test/unit/lib/util/base64.c/base64_ut.o 00:13:08.607 CC test/unit/lib/util/bit_array.c/bit_array_ut.o 00:13:08.607 LINK raid0_ut 00:13:08.607 CC test/unit/lib/util/cpuset.c/cpuset_ut.o 00:13:08.607 CC test/unit/lib/util/crc16.c/crc16_ut.o 00:13:08.865 LINK base64_ut 00:13:08.865 CC test/unit/lib/util/crc32_ieee.c/crc32_ieee_ut.o 00:13:08.865 LINK nvme_tcp_ut 00:13:08.865 CC test/unit/lib/util/crc64.c/crc64_ut.o 00:13:08.865 CC test/unit/lib/util/crc32c.c/crc32c_ut.o 00:13:08.865 LINK crc16_ut 00:13:08.865 LINK cpuset_ut 00:13:08.865 LINK bdev_raid_ut 00:13:08.865 LINK crc32_ieee_ut 00:13:08.865 LINK bit_array_ut 00:13:09.123 LINK crc64_ut 00:13:09.123 CC test/unit/lib/util/dif.c/dif_ut.o 00:13:09.123 LINK crc32c_ut 00:13:09.381 CC test/unit/lib/env_dpdk/pci_event.c/pci_event_ut.o 00:13:09.381 CC test/unit/lib/bdev/bdev_zone.c/bdev_zone_ut.o 00:13:09.381 CC test/unit/lib/bdev/vbdev_zone_block.c/vbdev_zone_block_ut.o 00:13:09.381 CC test/unit/lib/nvme/nvme_pcie_common.c/nvme_pcie_common_ut.o 00:13:09.381 CC test/unit/lib/util/iov.c/iov_ut.o 00:13:09.381 CC test/unit/lib/nvme/nvme_opal.c/nvme_opal_ut.o 00:13:09.381 CC test/unit/lib/nvme/nvme_fabric.c/nvme_fabric_ut.o 00:13:09.381 CC test/unit/lib/nvme/nvme_rdma.c/nvme_rdma_ut.o 00:13:09.381 CC test/unit/lib/nvme/nvme_cuse.c/nvme_cuse_ut.o 00:13:09.642 LINK bdev_zone_ut 00:13:09.642 LINK pci_event_ut 00:13:09.642 LINK iov_ut 00:13:09.899 LINK vbdev_zone_block_ut 00:13:09.899 LINK dif_ut 00:13:09.899 CC test/unit/lib/util/math.c/math_ut.o 00:13:10.157 CC test/unit/lib/init/subsystem.c/subsystem_ut.o 00:13:10.157 CC test/unit/lib/bdev/nvme/bdev_nvme.c/bdev_nvme_ut.o 00:13:10.157 LINK nvme_opal_ut 00:13:10.157 LINK math_ut 00:13:10.157 LINK nvme_fabric_ut 00:13:10.414 CC test/unit/lib/util/pipe.c/pipe_ut.o 00:13:10.414 CC test/unit/lib/util/string.c/string_ut.o 00:13:10.414 LINK nvme_pcie_common_ut 00:13:10.414 CC test/unit/lib/util/xor.c/xor_ut.o 00:13:10.672 LINK subsystem_ut 00:13:10.672 LINK nvme_cuse_ut 00:13:10.672 LINK string_ut 00:13:10.930 LINK xor_ut 00:13:10.930 CC test/unit/lib/rpc/rpc.c/rpc_ut.o 00:13:10.930 CC test/unit/lib/init/rpc.c/rpc_ut.o 00:13:10.930 LINK pipe_ut 00:13:10.930 CC test/unit/lib/keyring/keyring.c/keyring_ut.o 00:13:11.189 CC test/unit/lib/vhost/vhost.c/vhost_ut.o 00:13:11.189 CC test/unit/lib/idxd/idxd_user.c/idxd_user_ut.o 00:13:11.189 LINK nvme_rdma_ut 00:13:11.189 CC test/unit/lib/idxd/idxd.c/idxd_ut.o 00:13:11.189 LINK rpc_ut 00:13:11.189 CC test/unit/lib/ftl/ftl_l2p/ftl_l2p_ut.o 00:13:11.447 LINK keyring_ut 00:13:11.447 CC test/unit/lib/rdma/common.c/common_ut.o 00:13:11.447 LINK rpc_ut 00:13:11.705 CC test/unit/lib/ftl/ftl_band.c/ftl_band_ut.o 00:13:11.705 LINK ftl_l2p_ut 00:13:11.705 CC test/unit/lib/ftl/ftl_io.c/ftl_io_ut.o 00:13:11.705 LINK idxd_user_ut 00:13:11.705 CC test/unit/lib/ftl/ftl_p2l.c/ftl_p2l_ut.o 00:13:11.963 LINK common_ut 00:13:11.963 CC test/unit/lib/ftl/ftl_bitmap.c/ftl_bitmap_ut.o 00:13:11.963 LINK idxd_ut 00:13:12.220 CC 
test/unit/lib/ftl/ftl_mempool.c/ftl_mempool_ut.o 00:13:12.220 LINK ftl_bitmap_ut 00:13:12.220 CC test/unit/lib/ftl/ftl_mngt/ftl_mngt_ut.o 00:13:12.220 CC test/unit/lib/ftl/ftl_sb/ftl_sb_ut.o 00:13:12.478 CC test/unit/lib/ftl/ftl_layout_upgrade/ftl_layout_upgrade_ut.o 00:13:12.478 LINK ftl_io_ut 00:13:12.478 LINK ftl_mempool_ut 00:13:12.736 LINK ftl_p2l_ut 00:13:12.736 LINK ftl_mngt_ut 00:13:12.736 LINK ftl_band_ut 00:13:12.994 LINK vhost_ut 00:13:13.252 LINK bdev_nvme_ut 00:13:13.252 LINK ftl_sb_ut 00:13:13.511 LINK ftl_layout_upgrade_ut 00:13:13.511 00:13:13.511 real 1m59.757s 00:13:13.511 user 8m46.271s 00:13:13.511 sys 3m47.042s 00:13:13.511 12:15:37 unittest_build -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:13:13.511 12:15:37 unittest_build -- common/autotest_common.sh@10 -- $ set +x 00:13:13.511 ************************************ 00:13:13.511 END TEST unittest_build 00:13:13.511 ************************************ 00:13:13.769 12:15:37 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:13:13.769 12:15:37 -- pm/common@29 -- $ signal_monitor_resources TERM 00:13:13.769 12:15:37 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:13:13.769 12:15:37 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:13:13.769 12:15:37 -- pm/common@43 -- $ [[ -e /home/vagrant/spdk_repo/spdk/../output/power/collect-cpu-load.pid ]] 00:13:13.769 12:15:37 -- pm/common@44 -- $ pid=8994 00:13:13.770 12:15:37 -- pm/common@50 -- $ kill -TERM 8994 00:13:13.770 12:15:37 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:13:13.770 12:15:37 -- pm/common@43 -- $ [[ -e /home/vagrant/spdk_repo/spdk/../output/power/collect-vmstat.pid ]] 00:13:13.770 12:15:37 -- pm/common@44 -- $ pid=8996 00:13:13.770 12:15:37 -- pm/common@50 -- $ kill -TERM 8996 00:13:13.770 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:13:13.770 12:15:37 -- spdk/autotest.sh@25 -- # source /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh 00:13:13.770 12:15:37 -- nvmf/common.sh@7 -- # uname -s 00:13:13.770 12:15:37 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:13:13.770 12:15:37 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:13:13.770 12:15:37 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:13:13.770 12:15:37 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:13:13.770 12:15:37 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:13:13.770 12:15:37 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:13:13.770 12:15:37 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:13:13.770 12:15:37 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:13:13.770 12:15:37 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:13:13.770 12:15:37 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:13:13.770 12:15:37 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:de0b6253-4c30-4fab-8e86-8d05558b7c6b 00:13:13.770 12:15:37 -- nvmf/common.sh@18 -- # NVME_HOSTID=de0b6253-4c30-4fab-8e86-8d05558b7c6b 00:13:13.770 12:15:37 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:13:13.770 12:15:37 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:13:13.770 12:15:37 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:13:13.770 12:15:37 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:13:13.770 12:15:37 -- nvmf/common.sh@45 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:13:13.770 12:15:37 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:13:13.770 12:15:37 -- 
scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:13:13.770 12:15:37 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:13:13.770 12:15:37 -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:13:13.770 12:15:37 -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:13:13.770 12:15:37 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:13:13.770 12:15:37 -- paths/export.sh@5 -- # export PATH 00:13:13.770 12:15:37 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:13:13.770 12:15:37 -- nvmf/common.sh@47 -- # : 0 00:13:13.770 12:15:37 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:13:13.770 12:15:37 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:13:13.770 12:15:37 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:13:13.770 12:15:37 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:13:13.770 12:15:37 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:13:13.770 12:15:37 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:13:13.770 12:15:37 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:13:13.770 12:15:37 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:13:13.770 12:15:37 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:13:13.770 12:15:37 -- spdk/autotest.sh@32 -- # uname -s 00:13:13.770 12:15:37 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:13:13.770 12:15:37 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:13:13.770 12:15:37 -- spdk/autotest.sh@34 -- # mkdir -p /home/vagrant/spdk_repo/spdk/../output/coredumps 00:13:13.770 12:15:37 -- spdk/autotest.sh@39 -- # echo '|/home/vagrant/spdk_repo/spdk/scripts/core-collector.sh %P %s %t' 00:13:13.770 12:15:37 -- spdk/autotest.sh@40 -- # echo /home/vagrant/spdk_repo/spdk/../output/coredumps 00:13:13.770 12:15:37 -- spdk/autotest.sh@44 -- # modprobe nbd 00:13:13.770 12:15:37 -- spdk/autotest.sh@46 -- # type -P udevadm 00:13:13.770 12:15:37 -- spdk/autotest.sh@46 -- # udevadm=/sbin/udevadm 00:13:13.770 12:15:37 -- spdk/autotest.sh@48 -- # udevadm_pid=181568 00:13:13.770 12:15:37 -- spdk/autotest.sh@47 -- # /sbin/udevadm monitor --property 00:13:13.770 12:15:37 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:13:13.770 12:15:37 -- pm/common@17 -- # local monitor 00:13:13.770 12:15:37 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:13:13.770 12:15:37 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:13:13.770 12:15:37 -- pm/common@21 -- # date +%s 00:13:13.770 12:15:37 -- pm/common@25 -- # sleep 1 00:13:13.770 12:15:37 -- pm/common@21 -- # date +%s 00:13:13.770 12:15:37 -- pm/common@21 -- # /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-cpu-load -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autotest.sh.1717762537 00:13:13.770 12:15:37 -- 
pm/common@21 -- # /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-vmstat -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autotest.sh.1717762537 00:13:14.028 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autotest.sh.1717762537_collect-vmstat.pm.log 00:13:14.028 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autotest.sh.1717762537_collect-cpu-load.pm.log 00:13:14.963 12:15:38 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:13:14.963 12:15:38 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:13:14.963 12:15:38 -- common/autotest_common.sh@723 -- # xtrace_disable 00:13:14.963 12:15:38 -- common/autotest_common.sh@10 -- # set +x 00:13:14.963 12:15:38 -- spdk/autotest.sh@59 -- # create_test_list 00:13:14.963 12:15:38 -- common/autotest_common.sh@747 -- # xtrace_disable 00:13:14.963 12:15:38 -- common/autotest_common.sh@10 -- # set +x 00:13:14.963 12:15:38 -- spdk/autotest.sh@61 -- # dirname /home/vagrant/spdk_repo/spdk/autotest.sh 00:13:14.963 12:15:38 -- spdk/autotest.sh@61 -- # readlink -f /home/vagrant/spdk_repo/spdk 00:13:14.963 12:15:38 -- spdk/autotest.sh@61 -- # src=/home/vagrant/spdk_repo/spdk 00:13:14.963 12:15:38 -- spdk/autotest.sh@62 -- # out=/home/vagrant/spdk_repo/spdk/../output 00:13:14.963 12:15:38 -- spdk/autotest.sh@63 -- # cd /home/vagrant/spdk_repo/spdk 00:13:14.963 12:15:38 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:13:14.963 12:15:38 -- common/autotest_common.sh@1454 -- # uname 00:13:14.963 12:15:38 -- common/autotest_common.sh@1454 -- # '[' Linux = FreeBSD ']' 00:13:14.963 12:15:38 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:13:14.963 12:15:38 -- common/autotest_common.sh@1474 -- # uname 00:13:14.963 12:15:38 -- common/autotest_common.sh@1474 -- # [[ Linux = FreeBSD ]] 00:13:14.963 12:15:38 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:13:14.963 12:15:38 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:13:14.963 12:15:38 -- spdk/autotest.sh@72 -- # hash lcov 00:13:14.963 12:15:38 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:13:14.963 12:15:38 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:13:14.963 --rc lcov_branch_coverage=1 00:13:14.963 --rc lcov_function_coverage=1 00:13:14.963 --rc genhtml_branch_coverage=1 00:13:14.963 --rc genhtml_function_coverage=1 00:13:14.963 --rc genhtml_legend=1 00:13:14.963 --rc geninfo_all_blocks=1 00:13:14.963 ' 00:13:14.963 12:15:38 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:13:14.963 --rc lcov_branch_coverage=1 00:13:14.963 --rc lcov_function_coverage=1 00:13:14.963 --rc genhtml_branch_coverage=1 00:13:14.963 --rc genhtml_function_coverage=1 00:13:14.963 --rc genhtml_legend=1 00:13:14.963 --rc geninfo_all_blocks=1 00:13:14.963 ' 00:13:14.963 12:15:38 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:13:14.963 --rc lcov_branch_coverage=1 00:13:14.963 --rc lcov_function_coverage=1 00:13:14.963 --rc genhtml_branch_coverage=1 00:13:14.963 --rc genhtml_function_coverage=1 00:13:14.963 --rc genhtml_legend=1 00:13:14.963 --rc geninfo_all_blocks=1 00:13:14.963 --no-external' 00:13:14.963 12:15:38 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:13:14.963 --rc lcov_branch_coverage=1 00:13:14.963 --rc lcov_function_coverage=1 00:13:14.963 --rc genhtml_branch_coverage=1 00:13:14.963 --rc genhtml_function_coverage=1 00:13:14.963 --rc genhtml_legend=1 00:13:14.963 --rc geninfo_all_blocks=1 00:13:14.963 --no-external' 00:13:14.963 12:15:38 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 
--rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:13:14.963 lcov: LCOV version 1.15 00:13:14.963 12:15:38 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /home/vagrant/spdk_repo/spdk -o /home/vagrant/spdk_repo/spdk/../output/cov_base.info 00:13:27.165 /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:13:27.165 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_stubs.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc32.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc32.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel_module.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_zone.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/assert.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/assert.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/barrier.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/barrier.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/base64.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/base64.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/dma.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/dma.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_module.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_array.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_pool.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob_bdev.gcno 
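The capture above is lcov's two-phase flow: the first invocation (autotest.sh@83, ending in -v) only reports the tool version, while the second (autotest.sh@85) runs with -i (--initial) to record a zero-count baseline for every instrumented object before any test executes, so files the tests never touch still appear in the final report at 0%. The geninfo "no functions found" warnings around this point are expected: the test/cpp_headers objects are compile-only header checks, so their .gcno files contain no instrumented functions. A minimal sketch of the same baseline-plus-merge pattern, with ./build and the .info file names as illustrative placeholders rather than the autotest paths:

# Sketch only; directory and output names are hypothetical.
lcov -c -i -d ./build -t Baseline -o cov_base.info        # zero-count baseline before tests
# ... run the test suite so the .gcda counters get written ...
lcov -c -d ./build -t Tests -o cov_test.info              # capture real execution counts
lcov -a cov_base.info -a cov_test.info -o cov_total.info  # merge; untouched files keep 0%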
00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc16.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc16.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs_bdev.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/conf.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/conf.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/config.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/config.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/dif.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/dif.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/cpuset.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc64.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc64.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/endian.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/endian.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/env.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/env.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/env_dpdk.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/event.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/event.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/reduce.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/reduce.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd_group.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/file.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for 
/home/vagrant/spdk_repo/spdk/test/cpp_headers/file.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ftl.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ftl.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/gpt_spec.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/hexlify.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/rpc.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/rpc.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/queue.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/queue.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/histogram_data.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd_spec.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/init.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/init.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat_spec.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/iscsi_spec.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/json.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/json.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/jsonrpc.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring.gcno:no functions 
found 00:13:37.134 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring.gcno 00:13:37.134 /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal_spec.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring_module.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/likely.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/likely.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/log.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/log.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/lvol.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/lvol.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/memory.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/memory.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/mmio.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/mmio.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/sock.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/sock.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/stdinc.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nbd.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nbd.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/notify.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/notify.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/string.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/string.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_intel.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/pci_ids.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd.gcno 
00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_spec.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_spec.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_zns.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/pipe.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/pipe.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_cmd.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_transport.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/scheduler.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi_spec.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/thread.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/thread.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace_parser.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/tree.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/tree.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ublk.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ublk.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/util.gcno:no 
functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/util.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/uuid.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/uuid.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/version.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/version.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_pci.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_spec.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vhost.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vhost.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vmd.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vmd.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/xor.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/xor.gcno 00:13:37.135 /home/vagrant/spdk_repo/spdk/test/cpp_headers/zipf.gcno:no functions found 00:13:37.135 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/zipf.gcno 00:14:03.680 12:16:26 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:14:03.680 12:16:26 -- common/autotest_common.sh@723 -- # xtrace_disable 00:14:03.680 12:16:26 -- common/autotest_common.sh@10 -- # set +x 00:14:03.680 12:16:26 -- spdk/autotest.sh@91 -- # rm -f 00:14:03.680 12:16:26 -- spdk/autotest.sh@94 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:14:03.680 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:14:03.680 0000:00:10.0 (1b36 0010): Already using the nvme driver 00:14:03.680 12:16:26 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:14:03.680 12:16:26 -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:14:03.680 12:16:26 -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:14:03.680 12:16:26 -- common/autotest_common.sh@1669 -- # local nvme bdf 00:14:03.680 12:16:26 -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:14:03.680 12:16:26 -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:14:03.680 12:16:26 -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:14:03.680 12:16:26 -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:14:03.680 12:16:26 -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:14:03.680 12:16:26 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:14:03.680 12:16:26 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:14:03.680 12:16:26 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:14:03.680 12:16:26 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:14:03.680 12:16:26 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:14:03.680 12:16:26 -- 
scripts/common.sh@387 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:14:03.680 No valid GPT data, bailing 00:14:03.680 12:16:26 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:14:03.680 12:16:26 -- scripts/common.sh@391 -- # pt= 00:14:03.680 12:16:26 -- scripts/common.sh@392 -- # return 1 00:14:03.680 12:16:26 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:14:03.680 1+0 records in 00:14:03.680 1+0 records out 00:14:03.680 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00521556 s, 201 MB/s 00:14:03.680 12:16:26 -- spdk/autotest.sh@118 -- # sync 00:14:03.680 12:16:26 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:14:03.680 12:16:26 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:14:03.680 12:16:26 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:14:05.053 12:16:28 -- spdk/autotest.sh@124 -- # uname -s 00:14:05.053 12:16:28 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:14:05.053 12:16:28 -- spdk/autotest.sh@125 -- # run_test setup.sh /home/vagrant/spdk_repo/spdk/test/setup/test-setup.sh 00:14:05.053 12:16:28 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:05.053 12:16:28 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:05.053 12:16:28 -- common/autotest_common.sh@10 -- # set +x 00:14:05.053 ************************************ 00:14:05.053 START TEST setup.sh 00:14:05.053 ************************************ 00:14:05.053 12:16:28 setup.sh -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/setup/test-setup.sh 00:14:05.053 * Looking for test storage... 00:14:05.053 * Found test storage at /home/vagrant/spdk_repo/spdk/test/setup 00:14:05.053 12:16:28 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:14:05.053 12:16:28 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:14:05.053 12:16:28 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /home/vagrant/spdk_repo/spdk/test/setup/acl.sh 00:14:05.053 12:16:28 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:05.053 12:16:28 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:05.053 12:16:28 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:14:05.053 ************************************ 00:14:05.053 START TEST acl 00:14:05.053 ************************************ 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/setup/acl.sh 00:14:05.053 * Looking for test storage... 
00:14:05.053 * Found test storage at /home/vagrant/spdk_repo/spdk/test/setup 00:14:05.053 12:16:28 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1669 -- # local nvme bdf 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:14:05.053 12:16:28 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:14:05.053 12:16:28 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:14:05.053 12:16:28 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:14:05.053 12:16:28 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:14:05.053 12:16:28 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:14:05.053 12:16:28 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:14:05.053 12:16:28 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:14:05.053 12:16:28 setup.sh.acl -- setup/common.sh@12 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:14:05.619 12:16:29 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:14:05.619 12:16:29 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:14:05.619 12:16:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:14:05.619 12:16:29 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:14:05.619 12:16:29 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:14:05.619 12:16:29 setup.sh.acl -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh status 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ (1af4 == *:*:*.* ]] 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:14:05.876 Hugepages 00:14:05.876 node hugesize free / total 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:14:05.876 00:14:05.876 Type BDF Vendor Device NUMA Driver Device Block devices 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:03.0 == *:*:*.* ]] 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ virtio-pci == nvme ]] 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:14:05.876 12:16:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:14:06.135 12:16:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:10.0 == *:*:*.* ]] 00:14:06.135 12:16:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:14:06.135 12:16:29 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\0\0\:\1\0\.\0* ]] 00:14:06.135 12:16:29 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:14:06.135 
12:16:29 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:14:06.135 12:16:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:14:06.135 12:16:29 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:14:06.135 12:16:29 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:14:06.135 12:16:29 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:06.135 12:16:29 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:06.135 12:16:29 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:14:06.135 ************************************ 00:14:06.135 START TEST denied 00:14:06.135 ************************************ 00:14:06.135 12:16:29 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # denied 00:14:06.135 12:16:29 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:00:10.0' 00:14:06.135 12:16:29 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:00:10.0' 00:14:06.135 12:16:29 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:14:06.135 12:16:29 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:14:06.135 12:16:29 setup.sh.acl.denied -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config 00:14:06.393 0000:00:10.0 (1b36 0010): Skipping denied controller at 0000:00:10.0 00:14:06.393 12:16:30 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:00:10.0 00:14:06.393 12:16:30 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:14:06.393 12:16:30 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:14:06.393 12:16:30 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:00:10.0 ]] 00:14:06.393 12:16:30 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:00:10.0/driver 00:14:06.652 12:16:30 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:14:06.652 12:16:30 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:14:06.652 12:16:30 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:14:06.652 12:16:30 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:14:06.652 12:16:30 setup.sh.acl.denied -- setup/common.sh@12 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:14:06.910 00:14:06.910 real 0m0.895s 00:14:06.910 user 0m0.379s 00:14:06.910 sys 0m0.571s 00:14:06.910 12:16:30 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:06.910 12:16:30 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:14:06.910 ************************************ 00:14:06.910 END TEST denied 00:14:06.910 ************************************ 00:14:06.910 12:16:30 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:14:06.910 12:16:30 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:06.910 12:16:30 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:06.910 12:16:30 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:14:06.910 ************************************ 00:14:06.910 START TEST allowed 00:14:06.910 ************************************ 00:14:06.910 12:16:30 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # allowed 00:14:07.168 12:16:30 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:00:10.0 00:14:07.168 12:16:30 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:00:10.0 .*: nvme -> .*' 00:14:07.168 12:16:30 setup.sh.acl.allowed -- 
setup/acl.sh@45 -- # setup output config 00:14:07.168 12:16:30 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:14:07.168 12:16:30 setup.sh.acl.allowed -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config 00:14:07.735 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:14:07.735 12:16:31 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:14:07.735 12:16:31 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:14:07.735 12:16:31 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:14:07.735 12:16:31 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:14:07.735 12:16:31 setup.sh.acl.allowed -- setup/common.sh@12 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:14:07.994 00:14:07.994 real 0m0.983s 00:14:07.994 user 0m0.318s 00:14:07.994 sys 0m0.641s 00:14:07.994 12:16:31 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:07.994 12:16:31 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:14:07.994 ************************************ 00:14:07.994 END TEST allowed 00:14:07.994 ************************************ 00:14:07.994 00:14:07.994 real 0m3.097s 00:14:07.994 user 0m1.207s 00:14:07.994 sys 0m1.970s 00:14:07.994 12:16:31 setup.sh.acl -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:07.994 12:16:31 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:14:07.994 ************************************ 00:14:07.994 END TEST acl 00:14:07.994 ************************************ 00:14:07.994 12:16:31 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /home/vagrant/spdk_repo/spdk/test/setup/hugepages.sh 00:14:07.994 12:16:31 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:07.994 12:16:31 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:07.994 12:16:31 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:14:08.255 ************************************ 00:14:08.255 START TEST hugepages 00:14:08.255 ************************************ 00:14:08.255 12:16:31 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/setup/hugepages.sh 00:14:08.255 * Looking for test storage... 
00:14:08.255 * Found test storage at /home/vagrant/spdk_repo/spdk/test/setup 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 3931208 kB' 'MemAvailable: 7451772 kB' 'Buffers: 2208 kB' 'Cached: 3725020 kB' 'SwapCached: 0 kB' 'Active: 1429312 kB' 'Inactive: 2407232 kB' 'Active(anon): 1008 kB' 'Inactive(anon): 127092 kB' 'Active(file): 1428304 kB' 'Inactive(file): 2280140 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 109420 kB' 'Mapped: 39740 kB' 'Shmem: 18524 kB' 'KReclaimable: 97912 kB' 'Slab: 167364 kB' 'SReclaimable: 97912 kB' 'SUnreclaim: 69452 kB' 'KernelStack: 5020 kB' 'PageTables: 3168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 4026292 kB' 'Committed_AS: 374156 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23704 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.255 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # 
read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.256 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
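The long run of xtrace lines above is setup/common.sh's get_meminfo walking /proc/meminfo one field at a time: with IFS=': ' each line splits into a key and a value, every key other than the requested one hits continue, and the scan ends just below when Hugepagesize finally matches and its value (2048) is echoed back. A compact sketch of that lookup pattern, offered as an illustration rather than the helper's verbatim code:

# Sketch of the key lookup the trace shows; not the exact setup/common.sh body.
get_meminfo_sketch() {
    local want=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$want" ]] && { printf '%s\n' "$val"; return 0; }
    done < /proc/meminfo
    return 1
}
get_meminfo_sketch Hugepagesize   # prints 2048 on a 2 MiB-hugepage host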
00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=1 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:14:08.257 12:16:31 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:14:08.257 12:16:31 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:08.257 12:16:31 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:08.257 12:16:31 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:14:08.257 ************************************ 00:14:08.257 START TEST default_setup 00:14:08.257 ************************************ 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # default_setup 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # 
node_ids=('0') 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=1 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:14:08.257 12:16:31 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:14:08.828 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:14:08.828 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ [always] madvise never != *\[\n\e\v\e\r\]* ]] 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6009248 kB' 'MemAvailable: 9530240 kB' 'Buffers: 2208 kB' 'Cached: 3725132 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2424096 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143684 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 126308 kB' 'Mapped: 40108 kB' 'Shmem: 18524 kB' 'KReclaimable: 98024 kB' 'Slab: 167656 kB' 'SReclaimable: 98024 kB' 'SUnreclaim: 69632 kB' 'KernelStack: 4988 kB' 'PageTables: 3384 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23768 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.828 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.828 12:16:32 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [xtrace condensed: every /proc/meminfo key from Cached through HardwareCorrupted fails the \A\n\o\n\H\u\g\e\P\a\g\e\s match and hits 'continue' in the setup/common.sh@31-32 read loop] 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 18432 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=18432 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6008996 kB' 'MemAvailable: 9529988 kB' 'Buffers: 2208 kB' 'Cached: 3725132 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2424172 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143760 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 126036 kB' 'Mapped: 40048 kB' 'Shmem: 18524 kB' 'KReclaimable: 98024 kB' 'Slab: 167664 kB' 'SReclaimable: 98024 kB' 'SUnreclaim: 69640 kB' 'KernelStack: 4972 kB' 'PageTables: 3344 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23736 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
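Where the HugePages_Total: 1024 in the snapshot above comes from: the default_setup trace (setup/hugepages.sh@49-73) divides the requested pool size by the default hugepage size and assigns the result to the only NUMA node. A sketch using the trace's values; treating both quantities as kB is an inference from the arithmetic (2097152 / 2048 = 1024), not something the log states.

    size=2097152                                   # requested pool, per the trace (2 GiB if kB)
    default_hugepages=$(get_meminfo Hugepagesize)  # 2048, via the sketch above
    nr_hugepages=$((size / default_hugepages))     # 2097152 / 2048 = 1024 pages
    nodes_test[0]=$nr_hugepages                    # single node 0 receives all 1024 pages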
00:14:08.830 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [xtrace condensed: every /proc/meminfo key from MemTotal through HugePages_Rsvd fails the \H\u\g\e\P\a\g\e\s\_\S\u\r\p match and hits 'continue'] 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6008744 kB' 'MemAvailable: 9529736 kB' 'Buffers: 2208 kB' 'Cached: 3725132 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2423728 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143316 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125820 kB' 'Mapped: 39816 kB' 'Shmem: 18524 kB' 'KReclaimable: 98024 kB' 'Slab: 167664 kB' 'SReclaimable: 98024 kB' 'SUnreclaim: 69640 kB' 'KernelStack: 4940 kB' 'PageTables: 3280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23736 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
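verify_nr_hugepages is repeating the same lookup for each HugePages_* counter (anon=18432 and surp=0 above; the HugePages_Rsvd pass follows). A sketch of the kind of accounting check this implies, reusing the get_meminfo sketch earlier in this log; the exact pass condition is an assumption, not SPDK's verbatim assertion.

    total=$(get_meminfo HugePages_Total)   # 1024 in the snapshots above
    free=$(get_meminfo HugePages_Free)     # 1024: untouched pool
    resv=$(get_meminfo HugePages_Rsvd)     # 0
    surp=$(get_meminfo HugePages_Surp)     # 0
    # Hypothetical sanity check: pool size matches the request and no pages
    # are reserved or surplus before the test starts using them.
    ((total == 1024 && free == total && resv == 0 && surp == 0)) \
        || echo "unexpected hugepage accounting" >&2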
00:14:08.832 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [xtrace condensed: every /proc/meminfo key from MemTotal through VmallocTotal fails the \H\u\g\e\P\a\g\e\s\_\R\s\v\d match and hits 'continue'] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- [log truncated]
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.833 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:14:08.834 nr_hugepages=1024 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:14:08.834 resv_hugepages=0 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:14:08.834 surplus_hugepages=0 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:14:08.834 anon_hugepages=18432 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=18432 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:14:08.834 12:16:32 
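Note: every get_meminfo call traced in this test follows one idiom: read /proc/meminfo line by line with IFS=': ', continue past non-matching keys, then echo the value of the requested key and return. A minimal standalone sketch of that idiom (illustrative only; get_key is a made-up name, not the SPDK helper itself):

    #!/usr/bin/env bash
    # Look up a single key in /proc/meminfo, mirroring the IFS=': ' /
    # read -r var val _ / continue loop shown in the trace above.
    get_key() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip non-matching keys, as the trace does
            echo "$val"                        # e.g. prints 0 for HugePages_Rsvd above
            return 0
        done < /proc/meminfo
        return 1                               # key not present
    }

    get_key HugePages_Rsvd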
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:08.834 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6008744 kB' 'MemAvailable: 9529736 kB' 'Buffers: 2208 kB' 'Cached: 3725132 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2423948 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143536 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125780 kB' 'Mapped: 39816 kB' 'Shmem: 18524 kB' 'KReclaimable: 98024 kB' 'Slab: 167640 kB' 'SReclaimable: 98024 kB' 'SUnreclaim: 69616 kB' 'KernelStack: 4940 kB' 'PageTables: 3280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23752 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
[xtrace elided: the same per-key scan repeats over every field of the snapshot above until HugePages_Total matches]
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=1
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
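Note: when get_meminfo is given a node number, the trace above shows it switching mem_f to /sys/devices/system/node/node0/meminfo and stripping the "Node <N> " prefix with the extglob expansion mem=("${mem[@]#Node +([0-9]) }") before running the same key scan. A minimal sketch of that per-node variant (illustrative only; assumes bash with extglob and a node0 sysfs entry):

    #!/usr/bin/env bash
    shopt -s extglob                  # required for the +([0-9]) pattern below
    node=0
    mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")  # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == HugePages_Surp ]] && echo "node$node HugePages_Surp: $val"
    done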
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:14:09.097 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6008744 kB' 'MemUsed: 6238144 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2424092 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143680 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'FilePages: 3727340 kB' 'Mapped: 39816 kB' 'AnonPages: 125928 kB' 'Shmem: 18524 kB' 'KernelStack: 4940 kB' 'PageTables: 3280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 98024 kB' 'Slab: 167640 kB' 'SReclaimable: 98024 kB' 'SUnreclaim: 69616 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace elided: the per-key scan repeats over the node0 fields above until HugePages_Surp matches]
00:14:09.098 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:09.098 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:14:09.098 12:16:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:14:09.098 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:14:09.098 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:14:09.098 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:14:09.098 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:14:09.098 node0=1024 expecting 1024
00:14:09.099 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:14:09.099 12:16:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:14:09.099
00:14:09.099 real 0m0.689s
00:14:09.099 user 0m0.298s
00:14:09.099 sys 0m0.358s
00:14:09.099 12:16:32 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:09.099 12:16:32 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x
00:14:09.099 ************************************
00:14:09.099 END TEST default_setup
00:14:09.099 ************************************
00:14:09.099 12:16:32 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
12:16:32 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
12:16:32 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
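Note: the pass/fail of default_setup above comes down to one invariant: the kernel's HugePages_Total must equal the requested count plus surplus and reserved pages, and the per-node tally must match, hence "node0=1024 expecting 1024" (the snapshot's Hugetlb: 2097152 kB is consistent, 1024 x 2048 kB). A sketch of that check in isolation (illustrative, not the test's verify helper itself):

    #!/usr/bin/env bash
    # Re-create the (( 1024 == nr_hugepages + surp + resv )) arithmetic from the trace.
    requested=1024
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    if (( total == requested + surp + resv )); then
        echo "node0=$total expecting $requested"
    else
        echo "hugepage accounting mismatch: total=$total surp=$surp resv=$resv" >&2
        exit 1
    fi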
00:14:09.099 12:16:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:14:09.099 ************************************
00:14:09.099 START TEST per_node_1G_alloc
00:14:09.099 ************************************
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # per_node_1G_alloc
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=,
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=1
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:14:09.099 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh
00:14:09.359 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev
00:14:09.359 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=512
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages
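Note: get_test_nr_hugepages 1048576 0 above converts a size budget in kB into a page count: the requested 1048576 kB divided by the 2048 kB default hugepage size yields the nr_hugepages=512 and NRHUGE=512 HUGENODE=0 seen in the trace. A sketch of that conversion (illustrative; it reads the page size from /proc/meminfo rather than the script's default_hugepages variable):

    #!/usr/bin/env bash
    size_kb=1048576                                           # 1G request, in kB
    hp_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)  # 2048 on the host traced here
    if (( size_kb >= hp_kb )); then
        nr_hugepages=$(( size_kb / hp_kb ))                   # 1048576 / 2048 = 512
        echo "NRHUGE=$nr_hugepages HUGENODE=0"
    fi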
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ [always] madvise never != *\[\n\e\v\e\r\]* ]]
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7062592 kB' 'MemAvailable: 10583568 kB' 'Buffers: 2208 kB' 'Cached: 3725124 kB' 'SwapCached: 0 kB' 'Active: 1429412 kB' 'Inactive: 2424204 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143788 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 126120 kB' 'Mapped: 39272 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167700 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69708 kB' 'KernelStack: 5088 kB' 'PageTables: 3212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23768 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:09.359 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [... per-key scan trace elided: every key from MemTotal through HardwareCorrupted fails the AnonHugePages match and is skipped via IFS=': ' / read -r var val _ / continue ...]
00:14:09.360 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:14:09.360 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 18432
00:14:09.360 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=18432
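The trace above is setup/common.sh's get_meminfo walking a /proc/meminfo snapshot key by key with IFS=': ' and read -r, skipping every non-matching key and printing the value of the requested one (here AnonHugePages, resolved to 18432 kB). A minimal stand-alone sketch of that scan pattern follows; it is simplified, streaming the file directly instead of replaying a mapfile snapshot under xtrace, and the function name is illustrative:

#!/usr/bin/env bash
# Simplified stand-in for setup/common.sh's get_meminfo scan loop.
get_meminfo_sketch() {
    local get=$1 var val _
    # Each /proc/meminfo line looks like "Key:   value kB"; splitting on
    # ':' and space yields var=Key, val=value, _=unit.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # the long run of "continue" lines
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1                               # key not present
}

anon=$(get_meminfo_sketch AnonHugePages)   # e.g. 18432 on this runner
echo "anon=$anon"

The logged version looks noisier only because xtrace prints every comparison; the control flow is exactly this skip-until-match loop.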
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:09.361 12:16:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:09.361 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:09.628 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7062592 kB' 'MemAvailable: 10583564 kB' 'Buffers: 2208 kB' 'Cached: 3725120 kB' 'SwapCached: 0 kB' 'Active: 1429412 kB' 'Inactive: 2423940 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143528 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125860 kB' 'Mapped: 39260 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167684 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69692 kB' 'KernelStack: 4980 kB' 'PageTables: 3084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23720 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:09.629 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [... per-key scan trace elided: every key from MemTotal through HugePages_Rsvd fails the HugePages_Surp match and is skipped via "continue" ...]
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0
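Each get_meminfo call in this test runs with node unset, so the [[ -e /sys/devices/system/node/node/meminfo ]] test in the trace fails (the path is missing its node number) and the scan falls back to the system-wide /proc/meminfo. With a node id, the per-node file would be used instead, and its "Node N " line prefix stripped, which is what the mem=("${mem[@]#Node +([0-9]) }") expansion does. A sketch of that source selection, using sed as an illustrative stand-in for the array expansion:

#!/usr/bin/env bash
# Sketch of get_meminfo's choice between system-wide and per-node meminfo.
meminfo_source() {
    local node=$1 mem_f=/proc/meminfo
    # With node="" this checks /sys/devices/system/node/node/meminfo,
    # which never exists, so /proc/meminfo is used, as in the trace.
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix every line with "Node N "; drop that prefix
    # the way the trace's ${mem[@]#Node +([0-9]) } expansion does.
    sed -E 's/^Node [0-9]+ //' "$mem_f"
}

meminfo_source "" | grep HugePages_Surp   # system-wide view, e.g. 0 here
meminfo_source 0  | grep HugePages_Surp   # node 0 view, where node0 exists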
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:09.639 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:09.640 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:09.640 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:09.640 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:09.640 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:09.640 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:09.640 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:09.640 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7062592 kB' 'MemAvailable: 10583564 kB' 'Buffers: 2208 kB' 'Cached: 3725120 kB' 'SwapCached: 0 kB' 'Active: 1429412 kB' 'Inactive: 2423896 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143484 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125816 kB' 'Mapped: 39260 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167684 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69692 kB' 'KernelStack: 4964 kB' 'PageTables: 3052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23720 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:09.641 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [... per-key scan trace elided: every key from MemTotal through FileHugePages fails the HugePages_Rsvd match and is skipped via "continue"; the trace resumes mid-comparison below ...]
00:14:09.649 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:09.649 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.649 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.650 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:14:09.651 nr_hugepages=512 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=512 00:14:09.651 resv_hugepages=0 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:14:09.651 surplus_hugepages=0 00:14:09.651 anon_hugepages=18432 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=18432 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@107 -- # (( 512 == nr_hugepages + surp + resv )) 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 512 == nr_hugepages )) 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:09.651 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7062592 kB' 'MemAvailable: 10583564 kB' 'Buffers: 2208 kB' 'Cached: 3725120 kB' 'SwapCached: 0 kB' 'Active: 1429412 kB' 'Inactive: 2423844 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143432 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125764 kB' 'Mapped: 39260 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167684 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69692 kB' 'KernelStack: 5016 kB' 'PageTables: 3020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23752 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.652 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.653 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:09.653 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.653 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': '
00:14:09.653 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace elided: the same '[[ field == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]' / continue / IFS=': ' / read -r var val _ cycle repeats verbatim for every /proc/meminfo field, MemAvailable through Unaccepted, without a match]
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 512
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 512 == nr_hugepages + surp + resv ))
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=1
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # ((
nodes_test[node] += resv )) 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7062344 kB' 'MemUsed: 5184544 kB' 'SwapCached: 0 kB' 'Active: 1429404 kB' 'Inactive: 2423764 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143352 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'FilePages: 3727328 kB' 'Mapped: 39212 kB' 'AnonPages: 125904 kB' 'Shmem: 18516 kB' 'KernelStack: 4944 kB' 'PageTables: 2924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97992 kB' 'Slab: 167692 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69700 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.658 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.658 
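The HugePages_Surp lookup above runs get_meminfo's per-node branch: with a node argument supplied, mem_f switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo, whose lines carry a leading "Node 0 " prefix that the global file does not have, and the mem=("${mem[@]#Node +([0-9]) }") expansion strips that prefix so both sources parse identically. A minimal standalone sketch of that normalization step, assuming bash with extglob (a hypothetical script, not the literal setup/common.sh):

#!/usr/bin/env bash
# Normalize a per-node meminfo file so its lines look like /proc/meminfo.
shopt -s extglob                         # enables the +([0-9]) pattern below
node=0
mem_f=/sys/devices/system/node/node${node}/meminfo
[[ -e $mem_f ]] || mem_f=/proc/meminfo   # hypothetical fallback to the global view
mapfile -t mem < "$mem_f"
mem=("${mem[@]#Node +([0-9]) }")         # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
printf '%s\n' "${mem[@]}"

After this step the same field scan works unchanged against either file, which is why the per-node trace below looks identical to the global ones above.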
[xtrace elided: the same '[[ field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]' / continue / IFS=': ' / read -r var val _ cycle repeats verbatim over the node0 meminfo fields, SwapCached through HugePages_Free, without a match]
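Every get_meminfo call in this log, the elided scan above included, is the same linear lookup: split each meminfo line on ': ' and stop at the requested field, which HugePages_Surp does immediately below. A minimal sketch of the pattern, together with the accounting identity the harness asserts at hugepages.sh@110 (hypothetical helper name, not the literal setup/common.sh get_meminfo):

#!/usr/bin/env bash
# Linear field lookup over a meminfo-style file; every non-matching field
# costs one continue/IFS/read iteration, which is what the xtrace records.
get_meminfo_field() {                    # hypothetical name
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue # skip until the field matches
        echo "$val"                      # numeric part; any trailing unit lands in _
        return 0
    done < "$mem_f"
    return 1
}

expected=512                             # pages requested by this test run
total=$(get_meminfo_field HugePages_Total)
surp=$(get_meminfo_field HugePages_Surp)
resv=$(get_meminfo_field HugePages_Rsvd)
# Same check as hugepages.sh@110 above: the kernel's reported total must
# equal the requested pages plus surplus plus reserved pages.
(( total == expected + surp + resv )) && echo 'hugepage accounting consistent'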
12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:14:09.659 node0=512 expecting 512
12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:14:09.659
00:14:09.659 real 0m0.539s
00:14:09.659 user 0m0.241s
00:14:09.659 sys 0m0.331s
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:09.659 12:16:33 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x
00:14:09.659 ************************************
00:14:09.659 END TEST per_node_1G_alloc
************************************
00:14:09.659 12:16:33 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:14:09.659 12:16:33 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:09.659 12:16:33 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:09.659 12:16:33 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:14:09.659 ************************************
00:14:09.659 START TEST even_2G_alloc
************************************
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # even_2G_alloc
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=1
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
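The get_test_nr_hugepages 2097152 call traced above is where nr_hugepages=1024 comes from: the requested size is expressed in kB, and the meminfo dumps in this log report Hugepagesize: 2048 kB, so 2097152 / 2048 = 1024 pages. A hedged reconstruction of that conversion, consistent with the traced values but not the literal hugepages.sh source:

#!/usr/bin/env bash
# Convert a requested allocation size in kB into a hugepage count.
size=2097152                # 2 GiB expressed in kB
default_hugepages=2048      # Hugepagesize from /proc/meminfo, in kB
(( size >= default_hugepages )) || { echo 'size smaller than one hugepage' >&2; exit 1; }
(( nr_hugepages = size / default_hugepages ))
echo "nr_hugepages=$nr_hugepages"   # 2097152 / 2048 = 1024

With NRHUGE=1024 and HUGE_EVEN_ALLOC=yes set, the test re-runs scripts/setup.sh (next in the log) and verify_nr_hugepages then re-reads the meminfo counters to confirm the allocation.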
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=1024
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:14:09.659 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh
00:14:09.920 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev
00:14:09.920 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ [always] madvise never != *\[\n\e\v\e\r\]* ]]
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:09.920 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6011528 kB' 'MemAvailable: 9532500 kB' 'Buffers: 
2208 kB' 'Cached: 3725120 kB' 'SwapCached: 0 kB' 'Active: 1429424 kB' 'Inactive: 2423992 kB' 'Active(anon): 1020 kB' 'Inactive(anon): 143584 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280408 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 126224 kB' 'Mapped: 39524 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167652 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69660 kB' 'KernelStack: 5004 kB' 'PageTables: 3136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23752 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:09.921 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:09.921 12:16:33 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ [xtrace trimmed: the same read / [[ key == AnonHugePages ]] / continue triplet repeats for every remaining /proc/meminfo key, Active through HardwareCorrupted, none matching] 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.186 12:16:33
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 18432 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=18432 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6011528 kB' 'MemAvailable: 9532500 kB' 'Buffers: 2208 kB' 'Cached: 3725120 kB' 'SwapCached: 0 kB' 'Active: 1429424 kB' 'Inactive: 2423900 kB' 'Active(anon): 1020 kB' 'Inactive(anon): 143492 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280408 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125872 kB' 'Mapped: 39524 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167652 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69660 kB' 'KernelStack: 4972 kB' 'PageTables: 3072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23704 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
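The read / compare / continue triplets in this trace are the get_meminfo helper from setup/common.sh walking /proc/meminfo one key at a time until the requested key matches, then echoing its value (18432 kB of AnonHugePages just above). A minimal standalone sketch of that pattern, reconstructed from the trace; the per-node branch and anything else not visible in the xtrace is an assumption, not the exact SPDK source:

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern traced above (reconstructed; not the exact SPDK helper).
    shopt -s extglob                       # enables the +([0-9]) pattern used below
    get_meminfo() {
      local get=$1 node=${2:-}             # key to look up; node argument is optional
      local mem_f=/proc/meminfo mem var val unit
      # Assumed per-node branch: the trace tests /sys/devices/system/node/node$node/meminfo.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")     # per-node lines carry a "Node N " prefix; strip it
      local line
      for line in "${mem[@]}"; do
        IFS=': ' read -r var val unit <<< "$line"
        [[ $var == "$get" ]] && echo "$val" && return 0
      done
      return 1
    }

    get_meminfo AnonHugePages              # 18432 on this box, matching anon=18432 above
    get_meminfo HugePages_Surp             # 0, matching surp=0 below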
00:14:10.186 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ [xtrace trimmed: the read / [[ key == HugePages_Surp ]] / continue triplet repeats for every key from MemAvailable through FilePmdMapped, none matching] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:10.188 
12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6011276 kB' 'MemAvailable: 9532248 kB' 'Buffers: 2208 kB' 'Cached: 3725120 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2423720 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143312 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280408 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125856 kB' 'Mapped: 39472 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167660 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69668 kB' 'KernelStack: 4968 kB' 'PageTables: 3216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23720 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.188 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.188 12:16:33 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' [xtrace trimmed: the read / [[ key == HugePages_Rsvd ]] / continue triplet repeats for every key from Cached through CmaTotal, none matching] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:14:10.190 nr_hugepages=1024 00:14:10.190 resv_hugepages=0 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:14:10.190 surplus_hugepages=0 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:14:10.190 anon_hugepages=18432 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=18432 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc 
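The lookup traced above is a plain while-read scan of a meminfo-style file; a minimal standalone sketch of the same idea follows (the helper name meminfo_val is illustrative, not the function name in setup/common.sh):

    #!/usr/bin/env bash
    # Sketch: fetch one key from a meminfo-style file, as the traced loop does.
    meminfo_val() {                       # hypothetical helper name
        local get=$1 mem_f=${2:-/proc/meminfo} var val _
        while IFS=': ' read -r var val _; do
            # Each non-matching key corresponds to one 'continue' in the trace.
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "$mem_f"
        return 1                          # key not found
    }
    meminfo_val HugePages_Rsvd            # prints 0 on this host, per the log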
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:10.190 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6011276 kB' 'MemAvailable: 9532248 kB' 'Buffers: 2208 kB' 'Cached: 3725120 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2423912 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143504 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280408 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 126048 kB' 'Mapped: 39472 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167660 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69668 kB' 'KernelStack: 5020 kB' 'PageTables: 3184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23752 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:10.191 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # (xtrace condensed: scanned MemTotal through Unaccepted without matching HugePages_Total)
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
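When get_meminfo is given a node id, the trace below switches the source file to the per-node meminfo under /sys; per-node lines carry a 'Node N ' prefix, which the mem=("${mem[@]#Node +([0-9]) }") step strips. A hedged sketch of that selection:

    # Sketch: pick the per-node meminfo when a node is requested (node='' means
    # system-wide). Mirrors setup/common.sh@22-24 and the 'Node N ' prefix strip.
    node=0
    mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix every line with 'Node 0 '; drop it before parsing.
    sed 's/^Node [0-9]* //' "$mem_f" | grep '^HugePages_Surp'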
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=1
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6011276 kB' 'MemUsed: 6235612 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2423648 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143240 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280408 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'FilePages: 3727328 kB' 'Mapped: 39472 kB' 'AnonPages: 125780 kB' 'Shmem: 18516 kB' 'KernelStack: 4956 kB' 'PageTables: 3048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97992 kB' 'Slab: 167644 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69652 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:14:10.192 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # (xtrace condensed: scanned MemTotal through HugePages_Free without matching HugePages_Surp)
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:14:10.193
00:14:10.193 real	0m0.506s
00:14:10.193 user	0m0.244s
00:14:10.193 sys	0m0.299s
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:10.193 12:16:33 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:14:10.193 ************************************
00:14:10.193 END TEST even_2G_alloc
00:14:10.193 ************************************
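Between the two tests it is worth spelling out what the per-node bookkeeping just verified: each node's expected count has to land on the kernel's reported count. A reduced sketch of that comparison (reading nr_hugepages from sysfs here is an illustrative shortcut; the script itself derives the numbers from the meminfo parses traced above):

    # Sketch: compare expected vs. kernel-reported hugepages per NUMA node.
    declare -A nodes_sys
    for node in /sys/devices/system/node/node[0-9]*; do
        n=${node##*node}
        nodes_sys[$n]=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    for n in "${!nodes_sys[@]}"; do
        echo "node$n=${nodes_sys[$n]} expecting 1024"   # matches the log line
    done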
00:14:10.193 12:16:33 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:14:10.193 12:16:33 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:10.193 12:16:33 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:10.193 12:16:33 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:14:10.193 ************************************
00:14:10.193 START TEST odd_alloc
00:14:10.193 ************************************
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # odd_alloc
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=1
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=1025
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:14:10.193 12:16:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh
00:14:10.453 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev
00:14:10.453 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver
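HUGEMEM=2049 (MB) became size=2098176 (kB) and nr_hugepages=1025 in the trace above; assuming 2048 kB pages and round-up division (the rounding mode is an assumption, not read out of hugepages.sh), the arithmetic checks out:

    # Sketch of the sizing math behind get_test_nr_hugepages 2098176.
    HUGEMEM=2049                              # MB, as exported in the trace
    size_kb=$((HUGEMEM * 1024))               # 2098176 kB, matches the log
    hugepgsz_kb=2048                          # 'Hugepagesize: 2048 kB'
    nr=$(( (size_kb + hugepgsz_kb - 1) / hugepgsz_kb ))
    echo "nr_hugepages=$nr"                   # 1025; 1025*2048 kB = 'Hugetlb: 2099200 kB'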
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ [always] madvise never != *\[\n\e\v\e\r\]* ]]
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6010104 kB' 'MemAvailable: 9531080 kB' 'Buffers: 2208 kB' 'Cached: 3725124 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2423580 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143168 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125752 kB' 'Mapped: 39556 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167724 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69732 kB' 'KernelStack: 4908 kB' 'PageTables: 3396 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5073844 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23720 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:10.716 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (xtrace condensed: scanned MemTotal through NFS_Unstable without matching AnonHugePages)
00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce ==
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.717 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 18432 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=18432 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6009852 kB' 'MemAvailable: 9530828 kB' 'Buffers: 2208 kB' 'Cached: 3725124 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2423784 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143372 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125924 kB' 'Mapped: 39500 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167740 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69748 kB' 'KernelStack: 4936 kB' 'PageTables: 3372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5073844 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23720 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.718 12:16:34 
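The scan traced above is plain bash line-splitting: set IFS to ': ', read each meminfo line into a key and a value, and skip until the key matches. A minimal sketch of the same technique, assuming a hypothetical helper (get_meminfo_sketch, not the real setup/common.sh, which first snapshots the file into an array via mapfile as the trace shows) that takes the field name as its only argument and reads /proc/meminfo directly:

  get_meminfo_sketch() {
      # Print the numeric value of one /proc/meminfo field, e.g.
      # "AnonHugePages:     18432 kB" -> "18432".
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue  # skip every other field
          echo "$val"
          return 0
      done < /proc/meminfo
      return 1  # field not present
  }

  get_meminfo_sketch AnonHugePages   # would print 18432 on the run above

Splitting on ': ' makes read treat both the colon and the blanks as delimiters, so $val lands on the bare number and the trailing "kB" unit (absent for the HugePages_* counters) falls into the throwaway third variable.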
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:10.718 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6009852 kB' 'MemAvailable: 9530828 kB' 'Buffers: 2208 kB' 'Cached: 3725124 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2423784 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143372 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125924 kB' 'Mapped: 39500 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167740 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69748 kB' 'KernelStack: 4936 kB' 'PageTables: 3372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5073844 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23720 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
[ per-field scan elided: repeated "read -r var val _" / "[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / "continue" iterations over the snapshot above until HugePages_Surp matches ]
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0
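The node= / mem_f= lines at common.sh@17-29 show how the helper picks its data source before parsing. A sketch of that selection logic, inferred from the trace (node is empty on this run, so the per-node path degenerates to ".../node/meminfo", the -e test fails, and /proc/meminfo is used; the real script performs the -e and -n checks as separate tests, combined here for brevity):

  shopt -s extglob                 # required by the +([0-9]) pattern below
  node=""                          # empty: query the whole system
  mem_f=/proc/meminfo
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
      mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  mapfile -t mem < "$mem_f"
  # Per-node meminfo prefixes each line with "Node <n> "; strip that prefix
  # so the same "var: val" parser works for both sources.
  mem=("${mem[@]#Node +([0-9]) }")

This explains the otherwise puzzling mem=("${mem[@]#Node +([0-9]) }") line in every traced call: against /proc/meminfo the extglob strip matches nothing and is a no-op.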
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:10.719 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:10.720 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:10.720 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:10.720 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:10.720 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:10.720 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6009852 kB' 'MemAvailable: 9530828 kB' 'Buffers: 2208 kB' 'Cached: 3725124 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2423788 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143376 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125928 kB' 'Mapped: 39500 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167732 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69740 kB' 'KernelStack: 4936 kB' 'PageTables: 3372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5073844 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23736 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
[ per-field scan elided: repeated "read -r var val _" / "[[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]" / "continue" iterations over the snapshot above until HugePages_Rsvd matches ]
00:14:10.721 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:14:10.721 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:14:10.721 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:14:10.721 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:14:10.721 nr_hugepages=1025
12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:14:10.721 resv_hugepages=0
surplus_hugepages=0
12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
anon_hugepages=18432
12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=18432
12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
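The two arithmetic tests at hugepages.sh@107-109 are the point of the odd_alloc case: an odd page count (1025) was requested, and the pool must account for it exactly. A standalone sketch of that check, using the values the log reports (the left-hand 1025 appears pre-expanded in the xtrace, so exactly which variable it came from is an assumption here):

  nr_hugepages=1025   # requested odd-sized allocation
  surp=0              # HugePages_Surp, via get_meminfo
  resv=0              # HugePages_Rsvd, via get_meminfo
  anon=18432          # AnonHugePages (kB), tracked separately from the pool
  (( 1025 == nr_hugepages + surp + resv ))  # pool accounting adds up
  (( 1025 == nr_hugepages ))                # and nothing is surplus/reserved

Both (( )) tests succeed silently under set -e semantics when true, which is why the trace simply moves on to fetching HugePages_Total next.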
12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.722 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 
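[editor's note] The scan continuing below re-reads HugePages_Total so hugepages.sh@107/@110 can re-check its accounting identity. Restated standalone with the values visible in this log (a sketch of the arithmetic, not the SPDK source):

# HugePages_Total must equal the configured pool plus surplus and reserved.
nr_hugepages=1025   # requested by the odd_alloc test
surp=0              # HugePages_Surp from /proc/meminfo
resv=0              # HugePages_Rsvd from /proc/meminfo
total=1025          # HugePages_Total from /proc/meminfo

if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting is consistent"
fi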
12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1025 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=1 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.723 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6009348 kB' 'MemUsed: 6237540 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2423756 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143344 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280412 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'FilePages: 3727332 kB' 'Mapped: 39500 kB' 'AnonPages: 125924 kB' 'Shmem: 18516 kB' 'KernelStack: 4920 kB' 'PageTables: 3332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97992 kB' 'Slab: 167732 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69740 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Surp: 0' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
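[editor's note] The HugePages_Surp scan in progress here is the per-node variant: as traced at setup/common.sh@22-@29 just above, calling get_meminfo with no node leaves the probe path as /sys/devices/system/node/node/meminfo (which never exists), so /proc/meminfo is used, while node=0 switches to the node0 file and strips its leading "Node 0 " column. A sketch of that source selection, under the assumption it mirrors the traced script:

node=${1-}                       # empty, or a NUMA node id like 0
mem_f=/proc/meminfo
if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
fi
mapfile -t mem < "$mem_f"
shopt -s extglob                 # needed for the +([0-9]) pattern below
mem=("${mem[@]#Node +([0-9]) }") # no-op for /proc/meminfo lines
printf '%s\n' "${mem[@]:0:3}"    # show the first few normalized lines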
00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.724 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.725 12:16:34 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:14:10.725 node0=1025 expecting 1025 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1025 expecting 1025' 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 1025 == \1\0\2\5 ]] 00:14:10.725 00:14:10.725 real 0m0.551s 00:14:10.725 user 0m0.316s 00:14:10.725 sys 0m0.274s 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:10.725 12:16:34 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:14:10.725 ************************************ 00:14:10.725 END TEST odd_alloc 00:14:10.725 ************************************ 00:14:10.984 12:16:34 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:14:10.985 12:16:34 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:10.985 12:16:34 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:10.985 12:16:34 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:14:10.985 ************************************ 00:14:10.985 START TEST custom_alloc 00:14:10.985 ************************************ 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # custom_alloc 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:14:10.985 12:16:34 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=1 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 1 > 1 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=1 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:14:10.985 
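[editor's note] The custom_alloc setup traced here requests a 1048576 kB (1 GiB) pool and arrives at nr_hugepages=512, which the per-node helper continuing below places entirely on the single node. The division itself is not shown in the trace, but the numbers imply the default 2048 kB hugepage size; a worked restatement under that assumption:

# Convert a requested pool size in kB to a page count, then spread it
# across the available NUMA nodes (one node in this run).
size_kb=1048576                      # 1 GiB pool requested by custom_alloc
default_hugepage_kb=2048             # Hugepagesize from /proc/meminfo
(( nr_hugepages = size_kb / default_hugepage_kb ))
echo "nr_hugepages=$nr_hugepages"    # -> 512, matching the trace

no_nodes=1
(( per_node = nr_hugepages / no_nodes ))
echo "nodes_test[0]=$per_node"       # -> 512, all pages on node 0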
12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512' 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:14:10.985 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:14:11.246 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:14:11.246 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=512 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ [always] madvise never != *\[\n\e\v\e\r\]* ]] 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7061128 kB' 'MemAvailable: 10582108 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429412 kB' 'Inactive: 2424220 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 143800 kB' 'Active(file): 1428400 kB' 'Inactive(file): 
2280420 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 126088 kB' 'Mapped: 39512 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167860 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69868 kB' 'KernelStack: 4932 kB' 'PageTables: 3536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23752 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.246 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.247 12:16:34 
00:14:11.247 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # read -r var val _; [[ $var == AnonHugePages ]] || continue  (scan repeated for SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted; none matches, so each key is skipped)
00:14:11.248 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == AnonHugePages ]]
00:14:11.248 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 18432
00:14:11.248 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:14:11.248 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=18432
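The trace above is the tail of a get_meminfo call walking /proc/meminfo key by key until it reaches AnonHugePages (18432 kB on this host). Below is a minimal sketch of what that helper appears to do, reconstructed from the setup/common.sh line references in the trace; the for-loop form, the optional node-argument handling, and the return-1 fallback are assumptions, not the verbatim SPDK source.

#!/usr/bin/env bash
# Sketch of get_meminfo as suggested by the xtrace (setup/common.sh@17-33);
# the real script may differ in detail.
shopt -s extglob

get_meminfo() {
    local get=$1          # key to look up, e.g. AnonHugePages or HugePages_Surp
    local node=${2:-}     # optional NUMA node; empty means system-wide /proc/meminfo
    local var val
    local mem_f mem

    mem_f=/proc/meminfo
    # Per-node lookups read the node-specific file instead, if it exists
    # (assumed behavior; the trace only shows the empty-node branch).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node <N> "; strip that prefix.
    mem=("${mem[@]#Node +([0-9]) }")

    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # skip until the requested key
        echo "$val"                        # kB for sizes, a count for HugePages_*
        return 0
    done
    return 1
}

get_meminfo AnonHugePages   # prints 18432 on the system in this log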
00:14:11.248 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:14:11.248 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17-28 -- # local get=HugePages_Surp; local node=; local var val; local mem_f mem; mem_f=/proc/meminfo; [[ -e /sys/devices/system/node/node/meminfo ]]; [[ -n '' ]]; mapfile -t mem
00:14:11.248 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:11.249 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7061128 kB' 'MemAvailable: 10582108 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429404 kB' 'Inactive: 2423564 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143144 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280420 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125680 kB' 'Mapped: 39272 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167648 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69656 kB' 'KernelStack: 4912 kB' 'PageTables: 3084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23704 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:11.249-251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # read -r var val _; [[ $var == HugePages_Surp ]] || continue  (scan repeated for every key from MemTotal through HugePages_Rsvd; none matches, so each key is skipped)
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == HugePages_Surp ]]
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0
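Before each scan the trace runs mapfile -t mem followed by mem=("${mem[@]#Node +([0-9]) }"). That expansion requires extglob and exists so per-NUMA-node meminfo files, whose lines carry a "Node <N> " prefix, parse with the same read loop as the system-wide file; here node is empty, so the strip is a no-op. A small self-contained illustration (the sample array values are made up for the example):

shopt -s extglob
# Lines as they appear in /sys/devices/system/node/node0/meminfo
mem=('Node 0 MemTotal: 12246888 kB' 'Node 0 HugePages_Total: 512')
# Strip the "Node <N> " prefix so the key:value parser sees plain lines
mem=("${mem[@]#Node +([0-9]) }")
printf '%s\n' "${mem[@]}"
# MemTotal: 12246888 kB
# HugePages_Total: 512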
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17-28 -- # local get=HugePages_Rsvd; local node=; local var val; local mem_f mem; mem_f=/proc/meminfo; [[ -e /sys/devices/system/node/node/meminfo ]]; [[ -n '' ]]; mapfile -t mem
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:11.251 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7061128 kB' 'MemAvailable: 10582108 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429404 kB' 'Inactive: 2423572 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143152 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280420 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125688 kB' 'Mapped: 39272 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167648 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69656 kB' 'KernelStack: 4896 kB' 'PageTables: 3052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23720 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:11.251-254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # read -r var val _; [[ $var == HugePages_Rsvd ]] || continue  (scan repeated for every key from MemTotal through HugePages_Free; none matches, so each key is skipped)
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == HugePages_Rsvd ]]
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0
nr_hugepages=512
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=512
resv_hugepages=0
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
surplus_hugepages=0
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
anon_hugepages=18432
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=18432
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 512 == nr_hugepages + surp + resv ))
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 512 == nr_hugepages ))
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17-28 -- # local get=HugePages_Total; local node=; local var val; local mem_f mem; mem_f=/proc/meminfo; [[ -e /sys/devices/system/node/node/meminfo ]]; [[ -n '' ]]; mapfile -t mem
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:11.254 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7061128 kB' 'MemAvailable: 10582108 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429404 kB' 'Inactive: 2423528 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143108 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280420 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'AnonPages: 125628 kB' 'Mapped: 39272 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167648 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69656 kB' 'KernelStack: 4896 kB' 'PageTables: 3052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5599156 kB' 'Committed_AS: 390500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23752 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 1048576 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:11.255-256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # read -r var val _; [[ $var == HugePages_Total ]] || continue  (scan repeated for MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce; the section ends mid-scan, before HugePages_Total is reached)
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:14:11.256 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 512 00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 512 == nr_hugepages + surp + resv )) 00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- 
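The trace above is setup/common.sh's get_meminfo helper scanning a captured meminfo snapshot field by field until the requested key (here HugePages_Total) matches, then echoing its value. A minimal standalone sketch of the same pattern; the function name get_meminfo_sketch is hypothetical, and the loop is written as a plain for over the snapshot rather than the script's exact read pipeline:

  #!/usr/bin/env bash
  shopt -s extglob   # enables the +([0-9]) pattern used to strip node prefixes

  get_meminfo_sketch() {   # hypothetical name; mirrors the traced get_meminfo
      local get=$1 node=${2:-}
      local mem_f=/proc/meminfo
      # A per-node query reads that node's meminfo, whose lines carry a "Node N " prefix.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")           # strip the "Node N " prefix where present
      local line var val _
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line" # split "Field:   value kB" into var/val
          [[ $var == "$get" ]] || continue       # not the requested field -> keep scanning
          echo "$val"                            # e.g. 512 for HugePages_Total above
          return 0
      done
      return 1
  }

  get_meminfo_sketch HugePages_Total    # system-wide lookup
  get_meminfo_sketch HugePages_Surp 0   # node0 lookup, as in the trace that follows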
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 512 == nr_hugepages + surp + resv ))
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=1
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:14:11.516 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:11.517 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:11.517 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 7061128 kB' 'MemUsed: 5185760 kB' 'SwapCached: 0 kB' 'Active: 1429404 kB' 'Inactive: 2423796 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 143376 kB' 'Active(file): 1428400 kB' 'Inactive(file): 2280420 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 12 kB' 'Writeback: 0 kB' 'FilePages: 3727336 kB' 'Mapped: 39272 kB' 'AnonPages: 125896 kB' 'Shmem: 18516 kB' 'KernelStack: 4912 kB' 'PageTables: 3084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97992 kB' 'Slab: 167656 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69664 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:14:11.517 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # IFS=': '; read -r var val _; [[ $var == HugePages_Surp ]] -- # continue for every non-matching field: MemTotal MemFree MemUsed SwapCached Active Inactive Active(anon) Inactive(anon) Active(file) Inactive(file) Unevictable Mlocked Dirty Writeback FilePages Mapped AnonPages Shmem KernelStack PageTables SecPageTables NFS_Unstable Bounce WritebackTmp KReclaimable Slab SReclaimable SUnreclaim AnonHugePages ShmemHugePages ShmemPmdMapped FileHugePages FilePmdMapped Unaccepted HugePages_Total HugePages_Free
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:14:11.518 node0=512 expecting 512
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:14:11.518 
00:14:11.518 real	0m0.540s
00:14:11.518 user	0m0.287s
00:14:11.518 sys	0m0.291s
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:11.518 12:16:34 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:14:11.518 ************************************
00:14:11.518 END TEST custom_alloc
00:14:11.518 ************************************
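The custom_alloc verification that just finished makes two checks: a global one, that HugePages_Total equals the requested page count plus surplus and reserved pages (hugepages.sh@110), and a per-node one, that each node's expected count, with its surplus folded in, matches what the node itself reports. A compact sketch of that bookkeeping; the values are taken from the dumps above, and the variable names outside the traced ones are hypothetical:

  #!/usr/bin/env bash
  # Sketch of the verification logic traced above, under this log's numbers.
  nr_hugepages=512                 # pages requested by the custom_alloc test
  total=512 surp=0 resv=0          # HugePages_Total/_Surp/_Rsvd from /proc/meminfo
  declare -a nodes_test=([0]=512)  # expected per-node split
  declare -a nodes_sys=([0]=512)   # actual per-node HugePages_Total from sysfs

  # Global invariant (hugepages.sh@110): the pool is exactly the request plus
  # any kernel-side surplus and reserved pages.
  (( total == nr_hugepages + surp + resv )) || { echo "pool mismatch"; exit 1; }

  # Per-node invariant (hugepages.sh@115-130): fold each node's surplus into
  # the expected count, then compare with what the node reports.
  for node in "${!nodes_test[@]}"; do
      node_surp=0                          # node0's HugePages_Surp in the trace
      (( nodes_test[node] += node_surp ))
      echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
  done
  # Prints: node0=512 expecting 512, matching the log line above.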
00:14:11.518 12:16:34 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:14:11.518 12:16:34 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:11.518 12:16:34 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:11.518 12:16:34 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:14:11.518 ************************************
00:14:11.518 START TEST no_shrink_alloc
00:14:11.518 ************************************
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # no_shrink_alloc
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=1
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:14:11.518 12:16:34 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
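get_test_nr_hugepages, traced above, converts a requested pool size into a hugepage count by dividing by the default hugepage size, then assigns that count to each explicitly listed NUMA node; with size 2097152 and the 2048 kB Hugepagesize reported in the dumps, that yields the nr_hugepages=1024 and nodes_test[0]=1024 seen in the trace. A sketch assuming both quantities are in kB (which reproduces the traced numbers); the _sketch names are hypothetical:

  #!/usr/bin/env bash
  # Hypothetical re-derivation of the get_test_nr_hugepages arithmetic above.
  default_hugepages=2048   # kB; matches 'Hugepagesize: 2048 kB' in the dumps

  get_test_nr_hugepages_sketch() {
      local size=$1; shift                 # requested pool size in kB (2097152 = 2 GiB)
      local -a node_ids=("$@")             # explicit NUMA nodes, e.g. "0"
      (( size >= default_hugepages )) || return 1
      local nr_hugepages=$(( size / default_hugepages ))  # 2097152 / 2048 = 1024
      declare -ga nodes_test=()
      local node
      # Each explicitly listed node gets the full page count, which is what
      # produces nodes_test[0]=1024 in the trace.
      for node in "${node_ids[@]}"; do
          nodes_test[node]=$nr_hugepages
      done
      echo "nr_hugepages=$nr_hugepages node0=${nodes_test[0]}"
  }

  get_test_nr_hugepages_sketch 2097152 0   # -> nr_hugepages=1024 node0=1024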
'Inactive(file): 2280420 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 126360 kB' 'Mapped: 39400 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167632 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69640 kB' 'KernelStack: 4956 kB' 'PageTables: 3320 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 390112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23768 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.781 12:16:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.781 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 18432 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=18432 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.782 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.783 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6007800 kB' 'MemAvailable: 9528780 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2421196 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 140780 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 123432 kB' 'Mapped: 38620 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167632 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69640 kB' 'KernelStack: 4940 kB' 'PageTables: 3020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 382704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23672 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:11.783 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:11.783 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:11.783 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:11.783 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:11.783 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == 
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
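The records above are bash xtrace output of the get_meminfo helper in SPDK's setup/common.sh scanning /proc/meminfo one field at a time until the requested key matches. The following is a sketch of that helper, reassembled from the @17-@33 statements visible in the trace; the exact source may differ in detail:

```bash
#!/usr/bin/env bash
# Sketch of get_meminfo as inferred from the xtrace records above
# (setup/common.sh). Reconstructed, not verbatim.
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1
    local node=${2:-}    # optional NUMA node; empty means system-wide
    local var val
    local mem_f mem
    mem_f=/proc/meminfo
    # Per-node meminfo lives under /sys; fall back to /proc when absent.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip that prefix.
    mem=("${mem[@]#Node +([0-9]) }")
    while IFS=': ' read -r var val _; do
        # Same literal comparison the trace shows at common.sh@32.
        if [[ $var == "$get" ]]; then
            echo "$val"   # e.g. 18432 for AnonHugePages on this box
            return 0
        fi
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo HugePages_Surp     # -> 0 in this run
get_meminfo HugePages_Free 0   # node0 variant (hypothetical usage)
```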
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:11.784 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:11.785 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6007800 kB' 'MemAvailable: 9528780 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2421108 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 140692 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 123244 kB' 'Mapped: 38676 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167572 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69580 kB' 'KernelStack: 4896 kB' 'PageTables: 2748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 383092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23688 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:11.785 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:14:11.786 nr_hugepages=1024
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:14:11.786 resv_hugepages=0
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:14:11.786 surplus_hugepages=0
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:14:11.786 anon_hugepages=18432
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=18432
00:14:11.786 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
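With anon, surp, and resv collected, setup/hugepages.sh asserts the pool accounting at @107-@109: the expected page count (1024 here) must equal nr_hugepages plus surplus plus reserved pages, and, since this is the no_shrink_alloc case, nr_hugepages alone. A minimal sketch of that check, assuming the get_meminfo helper above; the wrapper name and the source of `expected` are illustrative, only the two (( )) tests mirror the trace:

```bash
# Sketch of the assertions traced at setup/hugepages.sh@107-@109.
# verify_hugepage_accounting is a hypothetical wrapper, not SPDK source.
verify_hugepage_accounting() {
    local expected=1024                        # expected pool size this run
    local nr_hugepages surp resv anon
    nr_hugepages=$(< /proc/sys/vm/nr_hugepages)
    surp=$(get_meminfo HugePages_Surp)         # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)         # 0 in this run
    anon=$(get_meminfo AnonHugePages)          # 18432 kB (THP, tracked separately)
    echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv" \
         "surplus_hugepages=$surp anon_hugepages=$anon"
    # Pool must account exactly, and must not have shrunk or grown:
    (( expected == nr_hugepages + surp + resv )) || return 1
    (( expected == nr_hugepages )) || return 1
}
```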
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6007800 kB' 'MemAvailable: 9528780 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2421172 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 140756 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 123320 kB' 'Mapped: 38676 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167548 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69556 kB' 'KernelStack: 4932 kB' 'PageTables: 2636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 383092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23704 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
00:14:12.047 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
-- # IFS=': ' 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=1 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- 
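[editor's note] The loop condensed above is setup/common.sh's get_meminfo helper pulling one field out of a meminfo file. A minimal sketch of that helper, reconstructed from the xtrace alone (the real source in setup/common.sh may differ in details):

  shopt -s extglob
  get_meminfo() {   # usage: get_meminfo <field> [node], e.g. get_meminfo HugePages_Total 0
      local get=$1 node=${2:-} var val _
      local mem_f=/proc/meminfo mem
      # with a node argument, read the per-node sysfs file instead of the global one
      [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
          && mem_f=/sys/devices/system/node/node$node/meminfo
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node N " prefix; strip it
      local line
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"   # e.g. var=HugePages_Total val=1024
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
  }

Each '[[ field == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]' / 'continue' pair in the trace is one iteration of that loop, which is why a single lookup spans dozens of log lines.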
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=1
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:12.049 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6007548 kB' 'MemUsed: 6239340 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2421064 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 140648 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'FilePages: 3727336 kB' 'Mapped: 38676 kB' 'AnonPages: 123200 kB' 'Shmem: 18516 kB' 'KernelStack: 4912 kB' 'PageTables: 2732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97992 kB' 'Slab: 167524 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69532 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: the same per-field scan runs over the node0 snapshot above, MemTotal through HugePages_Free, until HugePages_Surp matches]
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:14:12.051 node0=1024 expecting 1024
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
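[editor's note] Condensed to its arithmetic, the verification pass that just finished checks the global count first and then each NUMA node. A hedged sketch with this run's values (variable names mirror the setup/hugepages.sh trace, but this is an illustration, not the repo source):

  # global check (hugepages.sh@110): the totals must add up exactly
  nr_hugepages=1024   # what the test configured
  surp=0 resv=0       # HugePages_Surp / reserved pages, read back via get_meminfo
  (( 1024 == nr_hugepages + surp + resv )) && echo "global count OK"

  # per-node check (hugepages.sh@115-130): sysfs value vs. expectation
  nodes_sys=([0]=1024)    # allocated count reported for node0
  nodes_test=([0]=1024)   # what this test expects on node 0
  for node in "${!nodes_test[@]}"; do
      echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
      [[ ${nodes_sys[node]} == "${nodes_test[node]}" ]] && echo "node$node OK"
  done

Run standalone, the loop prints the same "node0=1024 expecting 1024" line that appears in the log above.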
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:14:12.051 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh
00:14:12.313 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev
00:14:12.313 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver
00:14:12.313 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ [always] madvise never != *\[\n\e\v\e\r\]* ]]
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:12.313 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6004524 kB' 'MemAvailable: 9525504 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2421664 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 141248 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 123788 kB' 'Mapped: 38536 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167544 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69552 kB' 'KernelStack: 4928 kB' 'PageTables: 3028 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 383092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23768 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
[xtrace condensed: per-field scan over the /proc/meminfo snapshot above, MemTotal through HardwareCorrupted, until AnonHugePages matches]
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 18432
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=18432
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:12.315 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6004524 kB' 'MemAvailable: 9525504 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429416 kB' 'Inactive: 2421192 kB' 'Active(anon): 1012 kB' 'Inactive(anon): 140776 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 123308 kB' 'Mapped: 38484 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167552 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69560 kB' 'KernelStack: 4832 kB' 'PageTables: 2816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 383092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23672 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
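[editor's note] The INFO line above is the behavior this no_shrink_alloc test exists to pin down: with NRHUGE=512 requested and 1024 pages already allocated, scripts/setup.sh must not shrink the pool. A hedged sketch of that decision, reconstructed from the log output only (illustrative; the real logic in scripts/setup.sh handles page sizes and multiple nodes, and writing nr_hugepages needs root):

  NRHUGE=512    # pages requested by this pass
  nr_path=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
  allocated=$(< "$nr_path")
  if (( allocated >= NRHUGE )); then
      # keep the larger existing allocation instead of shrinking it
      echo "INFO: Requested $NRHUGE hugepages but $allocated already allocated on node0"
  else
      echo "$NRHUGE" > "$nr_path"
  fi

This is why the trace immediately re-reads /proc/meminfo and still expects HugePages_Total: 1024.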
[xtrace condensed: the same per-field scan runs over the snapshot above, comparing each field against HugePages_Surp; the iteration trace continues below]
00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 
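The run of setup/common.sh@31/@32 records above is the xtrace of a key-matching loop: get_meminfo reads a meminfo file one "key: value" pair at a time and skips every key that is not the one requested (here HugePages_Surp). A minimal sketch of that loop, reconstructed from the trace records alone -- not the verbatim SPDK source (the real setup/common.sh also mapfiles the file and strips "Node N " prefixes, as its @28/@29 records show; this sketch simplifies that):

    # Sketch only: rebuilt from the xtrace, not the verbatim setup/common.sh.
    get_meminfo() {
        local get=$1 node=${2:-}    # key to look up, optional NUMA node
        local var val _
        local mem_f=/proc/meminfo
        # A per-node query reads that node's own meminfo file instead
        # (see the "get_meminfo HugePages_Surp 0" call later in this log).
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # one @32 record per skipped key
            echo "$val"                        # the @33 "echo 0" record above
            return 0
        done < "$mem_f"
        return 1
    }

    get_meminfo HugePages_Surp    # prints 0 on this runner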
00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:14:12.316 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:12.317 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6004524 kB' 'MemAvailable: 9525504 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2420780 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 140364 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 122884 kB' 'Mapped: 38428 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167552 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69560 kB' 'KernelStack: 4848 kB' 'PageTables: 2832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 383092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23672 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB'
[xtrace condensed: setup/common.sh@32 tests every key from MemTotal through HugePages_Free (the order shown in the snapshot above) against HugePages_Rsvd and hits the @32 continue on every non-match]
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=18432
anon_hugepages=18432
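The values just echoed (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0) feed the hugepages.sh@107/@109 assertions that follow: the test requested 1024 pages, so kernel accounting must add up to exactly that. A hedged paraphrase of the arithmetic, with the page-size sanity check implied by the snapshot above (2048 kB pages):

    # Values read back from /proc/meminfo by the traced get_meminfo calls.
    nr_hugepages=1024 surp=0 resv=0
    (( 1024 == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch'
    # Cross-check against the snapshot: 1024 pages x 2048 kB/page
    echo $(( 1024 * 2048 ))    # 2097152, matching the 'Hugetlb: 2097152 kB' line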
(( 1024 == nr_hugepages + surp + resv )) 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6004524 kB' 'MemAvailable: 9525504 kB' 'Buffers: 2208 kB' 'Cached: 3725128 kB' 'SwapCached: 0 kB' 'Active: 1429424 kB' 'Inactive: 2421048 kB' 'Active(anon): 1020 kB' 'Inactive(anon): 140632 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'SwapTotal: 0 kB' 'SwapFree: 0 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'AnonPages: 123012 kB' 'Mapped: 38444 kB' 'Shmem: 18516 kB' 'KReclaimable: 97992 kB' 'Slab: 167552 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69560 kB' 'KernelStack: 4880 kB' 'PageTables: 2896 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 5074868 kB' 'Committed_AS: 382872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 23688 kB' 'VmallocChunk: 0 kB' 'Percpu: 4848 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 151404 kB' 'DirectMap2M: 6139904 kB' 'DirectMap1G: 8388608 kB' 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.580 12:16:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.580 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.581 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:14:12.582 12:16:35 
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [repetitive xtrace condensed: identical "IFS=': '" / "read -r var val _" / "continue" records while the scan skips VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree and Unaccepted]
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=1
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:14:12.582 12:16:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
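Context for the records above and below: every one of these lines is bash xtrace from a single helper, get_meminfo in test/setup/common.sh, which loads a meminfo file, strips any per-node prefix, then scans key/value pairs until the requested field matches. A minimal sketch of that idiom, reconstructed from the trace rather than copied from the shipped source (the real helper feeds the scan through printf and a read loop rather than a for loop):

  #!/usr/bin/env bash
  shopt -s extglob   # needed for the +([0-9]) pattern below

  # get_meminfo <field> [node]: print the value of <field>, e.g. 1024 for HugePages_Total.
  get_meminfo() {
      local get=$1 node=$2
      local mem_f=/proc/meminfo line var val _
      # Per-node snapshots live in sysfs and prefix every line with "Node N ".
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")            # drop the "Node 0 " prefix
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"  # "HugePages_Surp: 0" -> var, val
          if [[ $var == "$get" ]]; then
              echo "$val"
              return 0
          fi
      done
      return 1
  }

  get_meminfo HugePages_Surp 0   # prints 0 on this node in this run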
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 12246888 kB' 'MemFree: 6004272 kB' 'MemUsed: 6242616 kB' 'SwapCached: 0 kB' 'Active: 1429408 kB' 'Inactive: 2420940 kB' 'Active(anon): 1004 kB' 'Inactive(anon): 140524 kB' 'Active(file): 1428404 kB' 'Inactive(file): 2280416 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 16 kB' 'Writeback: 0 kB' 'FilePages: 3727336 kB' 'Mapped: 38428 kB' 'AnonPages: 123060 kB' 'Shmem: 18516 kB' 'KernelStack: 4880 kB' 'PageTables: 2892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97992 kB' 'Slab: 167520 kB' 'SReclaimable: 97992 kB' 'SUnreclaim: 69528 kB' 'AnonHugePages: 18432 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:14:12.582 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:14:12.583 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [repetitive xtrace condensed: identical read/compare/continue records while the scan skips MemTotal, MemFree, MemUsed, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total and HugePages_Free]
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
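The sorted_t/sorted_s assignments in the records just below use a compact bash trick: each value is stored as an array index, so that ${!arr[@]} later yields the distinct values in ascending order. A self-contained sketch of the idiom (array contents taken from this run; the harness around it is assumed):

  #!/usr/bin/env bash
  declare -a nodes_test=(1024) nodes_sys=(1024)   # node 0 -> 1024 pages, as traced
  declare -a sorted_t=() sorted_s=()

  for node in "${!nodes_test[@]}"; do
      sorted_t[nodes_test[node]]=1   # the value becomes the index: dedup + sort for free
      sorted_s[nodes_sys[node]]=1
      echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
  done

  echo "distinct counts seen: ${!sorted_t[@]}"   # indexed arrays enumerate in ascending order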
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:14:12.584 node0=1024 expecting 1024
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:14:12.584 
00:14:12.584 real 0m1.053s
00:14:12.584 user 0m0.562s
00:14:12.584 sys 0m0.569s
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:12.584 12:16:36 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x
00:14:12.584 ************************************
00:14:12.584 END TEST no_shrink_alloc
00:14:12.584 ************************************
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:14:12.584 12:16:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:14:12.584 
00:14:12.584 real 0m4.453s
00:14:12.584 user 0m2.167s
00:14:12.584 sys 0m2.468s
00:14:12.584 12:16:36 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:12.584 12:16:36 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:14:12.584 ************************************
00:14:12.584 END TEST hugepages
00:14:12.584 ************************************
00:14:12.584 12:16:36 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /home/vagrant/spdk_repo/spdk/test/setup/driver.sh
00:14:12.584 12:16:36 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:12.584 12:16:36 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:12.584 12:16:36 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:14:12.584 ************************************
00:14:12.584 START TEST driver
00:14:12.584 ************************************
00:14:12.584 12:16:36 setup.sh.driver -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/setup/driver.sh
00:14:12.843 * Looking for test storage...
00:14:12.843 * Found test storage at /home/vagrant/spdk_repo/spdk/test/setup
00:14:12.843 12:16:36 setup.sh.driver -- setup/driver.sh@68 -- # setup reset
00:14:12.843 12:16:36 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:14:12.843 12:16:36 setup.sh.driver -- setup/common.sh@12 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset
00:14:13.102 12:16:36 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:14:13.102 12:16:36 setup.sh.driver -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:13.102 12:16:36 setup.sh.driver -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:13.102 12:16:36 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:14:13.102 ************************************
00:14:13.102 START TEST guess_driver
00:14:13.102 ************************************
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # guess_driver
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 0 > 0 ))
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # [[ '' == Y ]]
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@32 -- # return 1
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@38 -- # uio
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@17 -- # is_driver uio_pci_generic
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod uio_pci_generic
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep uio_pci_generic
00:14:13.102 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends uio_pci_generic
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/5.14.0-362.24.1.el9_3.x86_64/kernel/drivers/uio/uio.ko.xz insmod /lib/modules/5.14.0-362.24.1.el9_3.x86_64/kernel/drivers/uio/uio_pci_generic.ko.xz == *\.\k\o* ]]
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@39 -- # echo uio_pci_generic
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=uio_pci_generic
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ uio_pci_generic == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:14:13.103 Looking for driver=uio_pci_generic
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=uio_pci_generic'
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config
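guess_driver just chose uio_pci_generic because this VM exposes no IOMMU groups. The decision the trace walks through reduces to: prefer vfio when /sys/kernel/iommu_groups is populated (or unsafe no-IOMMU mode is enabled), otherwise fall back to uio_pci_generic if modprobe can resolve it. A condensed sketch of that logic, reconstructed from the trace rather than copied from driver.sh:

  #!/usr/bin/env bash
  shopt -s nullglob   # so an empty iommu_groups dir yields a zero-length array

  pick_driver() {
      local iommu_groups=(/sys/kernel/iommu_groups/*)
      local unsafe_vfio=
      if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
          unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
      fi
      if (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == Y ]]; then
          echo vfio-pci
      elif modprobe --show-depends uio_pci_generic &> /dev/null; then
          # modprobe prints the insmod chain (uio.ko -> uio_pci_generic.ko) without loading it
          echo uio_pci_generic
      else
          echo 'No valid driver found'
      fi
  }

  echo "Looking for driver=$(pick_driver)"   # prints uio_pci_generic on this VM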
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]]
00:14:13.103 12:16:36 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ devices: == \-\> ]]
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ uio_pci_generic == uio_pci_generic ]]
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 ))
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:14:13.669 12:16:37 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset
00:14:14.236 
00:14:14.236 real 0m1.032s
00:14:14.236 user 0m0.356s
00:14:14.236 sys 0m0.661s
00:14:14.236 12:16:37 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:14.236 12:16:37 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x
00:14:14.236 ************************************
00:14:14.236 END TEST guess_driver
00:14:14.236 ************************************
00:14:14.236 
00:14:14.236 real 0m1.637s
00:14:14.236 user 0m0.579s
00:14:14.236 sys 0m1.056s
00:14:14.236 12:16:37 setup.sh.driver -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:14.236 12:16:37 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:14:14.236 ************************************
00:14:14.236 END TEST driver
00:14:14.236 ************************************
00:14:14.236 12:16:37 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /home/vagrant/spdk_repo/spdk/test/setup/devices.sh
00:14:14.236 12:16:37 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:14.236 12:16:37 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:14.236 12:16:37 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:14:14.236 ************************************
00:14:14.236 START TEST devices
00:14:14.236 ************************************
00:14:14.236 12:16:37 setup.sh.devices -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/setup/devices.sh
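The devices suite that starts here first filters out zoned block devices, since the mount tests assume conventional namespaces. The check the next records trace is just a sysfs read; a sketch mirroring the traced is_block_zoned helper (reconstructed, not the shipped source):

  #!/usr/bin/env bash
  # A device is "zoned" if its queue reports anything other than "none".
  is_block_zoned() {
      local device=$1
      [[ -e /sys/block/$device/queue/zoned ]] || return 1
      [[ $(< "/sys/block/$device/queue/zoned") != none ]]
  }

  for nvme in /sys/block/nvme*; do
      if is_block_zoned "${nvme##*/}"; then
          echo "excluding zoned device ${nvme##*/}"
      fi
  done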
00:14:14.495 * Looking for test storage...
00:14:14.495 * Found test storage at /home/vagrant/spdk_repo/spdk/test/setup
00:14:14.495 12:16:37 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT
00:14:14.495 12:16:37 setup.sh.devices -- setup/devices.sh@192 -- # setup reset
00:14:14.495 12:16:37 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]]
00:14:14.495 12:16:37 setup.sh.devices -- setup/common.sh@12 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1668 -- # zoned_devs=()
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1668 -- # local -gA zoned_devs
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1669 -- # local nvme bdf
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme*
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1661 -- # local device=nvme0n1
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:14:14.788 12:16:38 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ none != none ]]
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@196 -- # blocks=()
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=()
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*)
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:00:10.0
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\0\0\:\1\0\.\0* ]]
00:14:14.788 12:16:38 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1
00:14:14.788 12:16:38 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt
00:14:14.788 12:16:38 setup.sh.devices -- scripts/common.sh@387 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py nvme0n1
00:14:15.057 No valid GPT data, bailing
00:14:15.057 12:16:38 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:14:15.057 12:16:38 setup.sh.devices -- scripts/common.sh@391 -- # pt=
00:14:15.057 12:16:38 setup.sh.devices -- scripts/common.sh@392 -- # return 1
00:14:15.057 12:16:38 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1
00:14:15.057 12:16:38 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1
00:14:15.057 12:16:38 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]]
00:14:15.057 12:16:38 setup.sh.devices -- setup/common.sh@80 -- # echo 5368709120
00:14:15.057 12:16:38 setup.sh.devices -- setup/devices.sh@204 -- # (( 5368709120 >= min_disk_size ))
00:14:15.057 12:16:38 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}")
00:14:15.057 12:16:38 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:00:10.0
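Before mounting anything, the suite screened /dev/nvme0n1: spdk-gpt.py and blkid found no partition table ("No valid GPT data, bailing", empty PTTYPE), and the capacity check passed (5368709120 bytes against the 3221225472-byte minimum). A sketch of that screening, with the byte size derived from the sysfs sector count (an assumption on my part; the traced sec_size_to_bytes helper is not shown in full in this log):

  #!/usr/bin/env bash
  min_disk_size=3221225472   # 3 GiB, as in the trace

  block=nvme0n1
  pt=$(blkid -s PTTYPE -o value "/dev/$block")      # empty when no partition table exists
  # The sysfs size file always counts 512-byte units, regardless of logical block size.
  size=$(( $(< "/sys/block/$block/size") * 512 ))

  if [[ -z $pt ]] && (( size >= min_disk_size )); then
      echo "/dev/$block is usable: $size bytes, no existing partition table"
  fi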
00:14:15.057 12:16:38 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 ))
00:14:15.057 12:16:38 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1
00:14:15.057 12:16:38 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount
00:14:15.057 12:16:38 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:15.057 12:16:38 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:15.057 12:16:38 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:14:15.057 ************************************
00:14:15.057 START TEST nvme_mount
00:14:15.057 ************************************
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # nvme_mount
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=()
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 ))
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ ))
00:14:15.057 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:14:15.058 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 4096 ))
00:14:15.058 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:14:15.058 12:16:38 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /home/vagrant/spdk_repo/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1
00:14:15.993 Creating new GPT entries in memory.
00:14:15.993 GPT data structures destroyed! You may now partition the disk using fdisk or
00:14:15.993 other utilities.
00:14:15.993 12:16:39 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 ))
00:14:15.993 12:16:39 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:14:15.993 12:16:39 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:14:15.993 12:16:39 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:14:15.993 12:16:39 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:264191
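The partitioning just traced is worth restating in plain commands. Note the arithmetic: size starts at 1073741824 bytes and (( size /= 4096 )) converts it to sectors; this disk appears to use 4096-byte logical blocks (the later wipefs output shows the GPT header at offset 0x1000, i.e. LBA 1), so the 262144-sector partition spanning 2048..264191 is the full 1 GiB. A condensed sketch of the same sequence:

  #!/usr/bin/env bash
  disk=/dev/nvme0n1
  size=1073741824
  (( size /= 4096 ))                 # bytes -> sectors; 262144 sectors on this 4096-byte-block disk

  sgdisk "$disk" --zap-all           # destroy GPT and MBR data structures
  part_start=2048
  part_end=$(( part_start + size - 1 ))
  # flock serializes partitioning against other writers, as in the traced run
  flock "$disk" sgdisk "$disk" --new=1:$part_start:$part_end   # --new=1:2048:264191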
00:14:16.930 Creating new GPT entries in memory.
00:14:16.930 The operation has completed successfully.
00:14:16.930 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ ))
00:14:16.930 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:14:16.930 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 185582
00:14:16.930 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:16.930 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount size=
00:14:16.930 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:16.930 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]]
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
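The mkfs/mount/verify round-trip that follows is the core of the nvme_mount test: format the partition, mount it under the repo, drop a marker file, then confirm both the mountpoint and the file exist. A compressed sketch, with the paths as in this run:

  #!/usr/bin/env bash
  dev=/dev/nvme0n1p1
  mnt=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount

  mkdir -p "$mnt"
  mkfs.ext4 -qF "$dev"        # -q quiet, -F force: it is a disposable test device
  mount "$dev" "$mnt"
  touch "$mnt/test_nvme"      # marker file proves the filesystem is writable

  mountpoint -q "$mnt" && [[ -e $mnt/test_nvme ]] && echo "nvme_mount verified"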
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:00:10.0 nvme0n1:nvme0n1p1 /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:00:10.0
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme ]]
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # :
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:00:10.0
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:14:17.189 12:16:40 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config
00:14:17.447 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:10.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:17.447 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]]
00:14:17.447 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:14:17.447 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:17.447 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:17.447 12:16:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:17.447 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:17.447 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount ]]
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme ]]
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:14:17.706 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:14:17.706 /dev/nvme0n1: 8 bytes were erased at offset 0x00001000 (gpt): 45 46 49 20 50 41 52 54
00:14:17.706 /dev/nvme0n1: 8 bytes were erased at offset 0x13ffff000 (gpt): 45 46 49 20 50 41 52 54
00:14:17.706 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:14:17.706 /dev/nvme0n1: calling ioctl to re-read partition table: Success
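The wipefs output above is a compact record of what was on the disk: the ext4 superblock magic (53 ef at offset 0x438) on the partition, then the primary GPT header at 0x1000, the backup header near the end of the disk, and the protective MBR signature (55 aa at 0x1fe). The cleanup boils down to:

  #!/usr/bin/env bash
  mnt=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount

  mountpoint -q "$mnt" && umount "$mnt"
  wipefs --all /dev/nvme0n1p1   # erases the ext4 signature on the partition
  wipefs --all /dev/nvme0n1     # erases both GPT headers and the protective MBR,
                                # then asks the kernel to re-read the partition table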
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount 1024M
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount size=1024M
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]]
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:00:10.0 nvme0n1:nvme0n1 /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:00:10.0
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme ]]
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # :
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:00:10.0
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:14:17.706 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:10.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]]
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:17.965 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:18.224 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:14:18.224 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount ]]
00:14:18.224 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:18.224 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme ]]
00:14:18.224 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount/test_nvme
00:14:18.224 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:18.224 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:00:10.0 data@nvme0n1 '' ''
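All of these verify() passes share one mechanism: run setup.sh config with PCI_ALLOWED pinned to the NVMe device and parse its per-device status lines with read -r pci _ _ status. A device that is mounted (or held by device-mapper) reports "Active devices: ..., so not binding PCI dev", which is exactly what the test wants to see. A reconstruction of the consuming side (the output format is inferred from the strings matched in the trace, so treat the details as assumptions):

  #!/usr/bin/env bash
  verify_active() {
      local dev=$1 mounts=$2 pci status found=0
      while read -r pci _ _ status; do
          [[ $pci == "$dev" ]] || continue
          [[ $status == *"Active devices: "*"$mounts"* ]] && found=1
      done < <(PCI_ALLOWED="$dev" /home/vagrant/spdk_repo/spdk/scripts/setup.sh config)
      (( found == 1 ))
  }

  verify_active 0000:00:10.0 data@nvme0n1 && echo "device is busy, as expected"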
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:00:10.0
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]]
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:00:10.0
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:14:18.225 12:16:41 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config
00:14:18.483 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:10.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:18.483 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]]
00:14:18.484 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:14:18.484 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:18.484 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:18.484 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:18.484 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]]
00:14:18.484 12:16:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]]
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:14:18.484 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:14:18.484 
00:14:18.484 real 0m3.595s
00:14:18.484 user 0m0.516s
00:14:18.484 sys 0m0.920s
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # xtrace_disable
00:14:18.484 12:16:42 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x
00:14:18.484 ************************************
00:14:18.484 END TEST nvme_mount
00:14:18.484 ************************************
00:14:18.811 12:16:42 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:14:18.811 12:16:42 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:14:18.811 12:16:42 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable
00:14:18.811 12:16:42 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:14:18.811 ************************************
00:14:18.811 START TEST dm_mount
00:14:18.811 ************************************
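dm_mount, which starts here, repeats the same partition/format/mount cycle but through a device-mapper node built from the two partitions. The trace only shows "dmsetup create nvme_dm_test" (the table arrives on stdin), so the exact table is not visible in this log; a plausible sketch, assuming a linear concatenation of nvme0n1p1 and nvme0n1p2:

  #!/usr/bin/env bash
  p1=/dev/nvme0n1p1
  p2=/dev/nvme0n1p2
  s1=$(blockdev --getsz "$p1")   # sizes in 512-byte sectors, the unit dm tables use
  s2=$(blockdev --getsz "$p2")

  # table format: start length target device offset -- two linear segments back to back
  dmsetup create nvme_dm_test <<EOF
  0 $s1 linear $p1 0
  $s1 $s2 linear $p2 0
  EOF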
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # dm_mount
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=()
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 ))
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 4096 ))
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:14:18.811 12:16:42 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /home/vagrant/spdk_repo/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:14:19.747 Creating new GPT entries in memory.
00:14:19.747 GPT data structures destroyed! You may now partition the disk using fdisk or
00:14:19.747 other utilities.
00:14:19.747 12:16:43 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 ))
00:14:19.747 12:16:43 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:14:19.747 12:16:43 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:14:19.747 12:16:43 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:14:19.747 12:16:43 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:264191
00:14:20.682 Creating new GPT entries in memory.
00:14:20.682 The operation has completed successfully.
00:14:20.682 12:16:44 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:14:20.682 12:16:44 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:14:20.682 12:16:44 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:14:20.682 12:16:44 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:14:20.682 12:16:44 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:264192:526335
00:14:21.616 The operation has completed successfully.
00:14:21.616 12:16:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:14:21.616 12:16:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:14:21.616 12:16:45 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 185971
00:14:21.616 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test
00:14:21.616 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/home/vagrant/spdk_repo/spdk/test/setup/dm_mount
00:14:21.616 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/home/vagrant/spdk_repo/spdk/test/setup/dm_mount/test_dm
00:14:21.616 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5}
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]]
00:14:21.873 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]]
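Two small details in the dm records above deserve a note: device-mapper nodes appear asynchronously, so the test polls /dev/mapper up to five times before resolving the symlink, and it then proves the mapping by checking the holders links the kernel creates under each backing partition. A sketch (the sleep between retries is an assumption; the trace does not show one because the node existed on the first try):

  #!/usr/bin/env bash
  for t in {1..5}; do
      [[ -e /dev/mapper/nvme_dm_test ]] && break
      sleep 1   # assumed back-off; not visible in this run
  done

  dm=$(readlink -f /dev/mapper/nvme_dm_test)   # e.g. /dev/dm-0
  dm=${dm##*/}                                 # -> dm-0

  [[ -e /sys/class/block/nvme0n1p1/holders/$dm ]] &&
      [[ -e /sys/class/block/nvme0n1p2/holders/$dm ]] &&
      echo "nvme_dm_test ($dm) holds both partitions"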
test_file=/home/vagrant/spdk_repo/spdk/test/setup/dm_mount/test_dm 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /home/vagrant/spdk_repo/spdk/test/setup/dm_mount/test_dm ]] 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:00:10.0 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:10.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]] 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:14:21.874 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]] 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]] 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /home/vagrant/spdk_repo/spdk/test/setup/dm_mount ]] 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/dm_mount 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /home/vagrant/spdk_repo/spdk/test/setup/dm_mount/test_dm ]] 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /home/vagrant/spdk_repo/spdk/test/setup/dm_mount/test_dm 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /home/vagrant/spdk_repo/spdk/test/setup/dm_mount 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:00:10.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:14:22.183 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:00:10.0 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:14:22.184 
12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:00:10.0 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:14:22.184 12:16:45 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh config 00:14:22.443 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:10.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]] 00:14:22.443 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:14:22.443 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:14:22.443 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:22.443 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]] 00:14:22.443 12:16:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:22.443 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:03.0 == \0\0\0\0\:\0\0\:\1\0\.\0 ]] 00:14:22.443 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:14:22.443 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:14:22.443 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:14:22.443 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:14:22.443 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:14:22.443 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/dm_mount 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:14:22.702 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:14:22.702 00:14:22.702 real 0m3.989s 00:14:22.702 user 0m0.328s 00:14:22.702 sys 0m0.660s 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:22.702 12:16:46 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:14:22.702 ************************************ 00:14:22.702 END TEST dm_mount 00:14:22.702 ************************************ 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/nvme_mount 
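The dm_mount test traced above reduces to a small device-mapper recipe: partition the drive, stack a dm device on the partitions, put a filesystem on it, and check the holders/ links. A minimal standalone sketch follows; the linear table is an assumption (the test's real table is built inside setup/devices.sh and never echoed into this log), and the mount point /mnt/dm_test is hypothetical:

    # Partition the disk exactly as the trace shows: two 128 MiB partitions.
    disk=/dev/nvme0n1
    sgdisk "$disk" --zap-all
    flock "$disk" sgdisk "$disk" --new=1:2048:264191     # sectors 2048..264191
    flock "$disk" sgdisk "$disk" --new=2:264192:526335   # sectors 264192..526335
    # Assumed: a linear dm table concatenating the two partitions; dmsetup
    # accepts the table on stdin when none is given on the command line.
    s1=$(blockdev --getsz "${disk}p1")
    s2=$(blockdev --getsz "${disk}p2")
    printf '%s\n' "0 $s1 linear ${disk}p1 0" \
                  "$s1 $s2 linear ${disk}p2 0" | dmsetup create nvme_dm_test
    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mkdir -p /mnt/dm_test && mount /dev/mapper/nvme_dm_test /mnt/dm_test
    # Success criterion used by the test: both partitions now report dm-0 as
    # a holder, which is what the /sys/class/block/.../holders checks probe.
    ls /sys/class/block/nvme0n1p1/holders /sys/class/block/nvme0n1p2/holders

Teardown mirrors the cleanup trace: umount the mount point, dmsetup remove --force nvme_dm_test, then wipefs --all on each partition.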
00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:14:22.702 /dev/nvme0n1: 8 bytes were erased at offset 0x00001000 (gpt): 45 46 49 20 50 41 52 54 00:14:22.702 /dev/nvme0n1: 8 bytes were erased at offset 0x13ffff000 (gpt): 45 46 49 20 50 41 52 54 00:14:22.702 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:14:22.702 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /home/vagrant/spdk_repo/spdk/test/setup/dm_mount 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:14:22.702 12:16:46 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:14:22.702 00:14:22.702 real 0m8.418s 00:14:22.702 user 0m1.193s 00:14:22.702 sys 0m2.054s 00:14:22.702 12:16:46 setup.sh.devices -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:22.702 12:16:46 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:14:22.702 ************************************ 00:14:22.702 END TEST devices 00:14:22.702 ************************************ 00:14:22.702 00:14:22.702 real 0m17.956s 00:14:22.702 user 0m5.273s 00:14:22.702 sys 0m7.775s 00:14:22.702 12:16:46 setup.sh -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:22.702 12:16:46 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:14:22.702 ************************************ 00:14:22.702 END TEST setup.sh 00:14:22.702 ************************************ 00:14:22.961 12:16:46 -- spdk/autotest.sh@128 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh status 00:14:23.221 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:14:23.221 Hugepages 00:14:23.221 node hugesize free / total 00:14:23.221 node0 1048576kB 0 / 0 00:14:23.221 node0 2048kB 2048 / 2048 00:14:23.221 00:14:23.221 Type BDF Vendor Device NUMA Driver Device Block devices 00:14:23.480 virtio 0000:00:03.0 1af4 1001 unknown virtio-pci - vda 00:14:23.480 NVMe 0000:00:10.0 1b36 0010 0 nvme nvme0 nvme0n1 00:14:23.480 12:16:46 -- spdk/autotest.sh@130 -- # uname -s 00:14:23.480 12:16:46 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:14:23.480 12:16:46 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:14:23.480 12:16:46 -- common/autotest_common.sh@1530 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:14:23.738 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:14:23.996 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:14:23.996 12:16:47 -- common/autotest_common.sh@1531 -- # sleep 1 00:14:24.931 12:16:48 -- common/autotest_common.sh@1532 -- # bdfs=() 00:14:24.931 12:16:48 -- common/autotest_common.sh@1532 -- # local bdfs 00:14:24.931 12:16:48 -- common/autotest_common.sh@1533 -- # bdfs=($(get_nvme_bdfs)) 
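The wipefs output just above is worth decoding: the 8 bytes erased at 0x00001000 and 0x13ffff000 are the ASCII GPT signature "EFI PART" (primary header at LBA 1 and backup header at the last LBA of this 4 KiB-sector drive), and the 2 bytes at 0x000001fe are the 55 aa boot signature of the protective MBR. A quick sanity check:

    # "EFI PART" really is the byte string wipefs reported erasing:
    printf 'EFI PART' | od -An -tx1    # -> 45 46 49 20 50 41 52 54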
00:14:24.931 12:16:48 -- common/autotest_common.sh@1533 -- # get_nvme_bdfs 00:14:24.931 12:16:48 -- common/autotest_common.sh@1512 -- # bdfs=() 00:14:24.931 12:16:48 -- common/autotest_common.sh@1512 -- # local bdfs 00:14:24.931 12:16:48 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:14:24.932 12:16:48 -- common/autotest_common.sh@1513 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:14:24.932 12:16:48 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr' 00:14:25.188 12:16:48 -- common/autotest_common.sh@1514 -- # (( 1 == 0 )) 00:14:25.188 12:16:48 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:00:10.0 00:14:25.188 12:16:48 -- common/autotest_common.sh@1535 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:14:25.471 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:14:25.471 Waiting for block devices as requested 00:14:25.471 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:14:25.471 12:16:49 -- common/autotest_common.sh@1537 -- # for bdf in "${bdfs[@]}" 00:14:25.471 12:16:49 -- common/autotest_common.sh@1538 -- # get_nvme_ctrlr_from_bdf 0000:00:10.0 00:14:25.471 12:16:49 -- common/autotest_common.sh@1501 -- # readlink -f /sys/class/nvme/nvme0 00:14:25.471 12:16:49 -- common/autotest_common.sh@1501 -- # grep 0000:00:10.0/nvme/nvme 00:14:25.739 12:16:49 -- common/autotest_common.sh@1501 -- # bdf_sysfs_path=/sys/devices/pci0000:00/0000:00:10.0/nvme/nvme0 00:14:25.739 12:16:49 -- common/autotest_common.sh@1502 -- # [[ -z /sys/devices/pci0000:00/0000:00:10.0/nvme/nvme0 ]] 00:14:25.739 12:16:49 -- common/autotest_common.sh@1506 -- # basename /sys/devices/pci0000:00/0000:00:10.0/nvme/nvme0 00:14:25.739 12:16:49 -- common/autotest_common.sh@1506 -- # printf '%s\n' nvme0 00:14:25.739 12:16:49 -- common/autotest_common.sh@1538 -- # nvme_ctrlr=/dev/nvme0 00:14:25.739 12:16:49 -- common/autotest_common.sh@1539 -- # [[ -z /dev/nvme0 ]] 00:14:25.739 12:16:49 -- common/autotest_common.sh@1544 -- # nvme id-ctrl /dev/nvme0 00:14:25.739 12:16:49 -- common/autotest_common.sh@1544 -- # grep oacs 00:14:25.739 12:16:49 -- common/autotest_common.sh@1544 -- # cut -d: -f2 00:14:25.739 12:16:49 -- common/autotest_common.sh@1544 -- # oacs=' 0x12a' 00:14:25.739 12:16:49 -- common/autotest_common.sh@1545 -- # oacs_ns_manage=8 00:14:25.739 12:16:49 -- common/autotest_common.sh@1547 -- # [[ 8 -ne 0 ]] 00:14:25.739 12:16:49 -- common/autotest_common.sh@1553 -- # nvme id-ctrl /dev/nvme0 00:14:25.739 12:16:49 -- common/autotest_common.sh@1553 -- # cut -d: -f2 00:14:25.739 12:16:49 -- common/autotest_common.sh@1553 -- # grep unvmcap 00:14:25.739 12:16:49 -- common/autotest_common.sh@1553 -- # unvmcap=' 0' 00:14:25.739 12:16:49 -- common/autotest_common.sh@1554 -- # [[ 0 -eq 0 ]] 00:14:25.739 12:16:49 -- common/autotest_common.sh@1556 -- # continue 00:14:25.739 12:16:49 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:14:25.739 12:16:49 -- common/autotest_common.sh@729 -- # xtrace_disable 00:14:25.739 12:16:49 -- common/autotest_common.sh@10 -- # set +x 00:14:25.739 12:16:49 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:14:25.739 12:16:49 -- common/autotest_common.sh@723 -- # xtrace_disable 00:14:25.739 12:16:49 -- common/autotest_common.sh@10 -- # set +x 00:14:25.739 12:16:49 -- spdk/autotest.sh@139 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:14:25.997 0000:00:03.0 (1af4 1001): Active devices: 
mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:14:26.255 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:14:26.255 12:16:49 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:14:26.255 12:16:49 -- common/autotest_common.sh@729 -- # xtrace_disable 00:14:26.255 12:16:49 -- common/autotest_common.sh@10 -- # set +x 00:14:26.255 12:16:49 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:14:26.255 12:16:49 -- common/autotest_common.sh@1590 -- # mapfile -t bdfs 00:14:26.255 12:16:49 -- common/autotest_common.sh@1590 -- # get_nvme_bdfs_by_id 0x0a54 00:14:26.255 12:16:49 -- common/autotest_common.sh@1576 -- # bdfs=() 00:14:26.255 12:16:49 -- common/autotest_common.sh@1576 -- # local bdfs 00:14:26.255 12:16:49 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs 00:14:26.255 12:16:49 -- common/autotest_common.sh@1512 -- # bdfs=() 00:14:26.255 12:16:49 -- common/autotest_common.sh@1512 -- # local bdfs 00:14:26.255 12:16:49 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:14:26.255 12:16:49 -- common/autotest_common.sh@1513 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:14:26.255 12:16:49 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr' 00:14:26.255 12:16:49 -- common/autotest_common.sh@1514 -- # (( 1 == 0 )) 00:14:26.255 12:16:49 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:00:10.0 00:14:26.255 12:16:49 -- common/autotest_common.sh@1578 -- # for bdf in $(get_nvme_bdfs) 00:14:26.255 12:16:49 -- common/autotest_common.sh@1579 -- # cat /sys/bus/pci/devices/0000:00:10.0/device 00:14:26.255 12:16:49 -- common/autotest_common.sh@1579 -- # device=0x0010 00:14:26.255 12:16:49 -- common/autotest_common.sh@1580 -- # [[ 0x0010 == \0\x\0\a\5\4 ]] 00:14:26.255 12:16:49 -- common/autotest_common.sh@1585 -- # printf '%s\n' 00:14:26.255 12:16:49 -- common/autotest_common.sh@1591 -- # [[ -z '' ]] 00:14:26.255 12:16:49 -- common/autotest_common.sh@1592 -- # return 0 00:14:26.255 12:16:49 -- spdk/autotest.sh@150 -- # '[' 1 -eq 1 ']' 00:14:26.255 12:16:49 -- spdk/autotest.sh@151 -- # run_test unittest /home/vagrant/spdk_repo/spdk/test/unit/unittest.sh 00:14:26.255 12:16:49 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:14:26.255 12:16:49 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:26.255 12:16:49 -- common/autotest_common.sh@10 -- # set +x 00:14:26.255 ************************************ 00:14:26.255 START TEST unittest 00:14:26.255 ************************************ 00:14:26.255 12:16:49 unittest -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/unittest.sh 00:14:26.255 +++ dirname /home/vagrant/spdk_repo/spdk/test/unit/unittest.sh 00:14:26.255 ++ readlink -f /home/vagrant/spdk_repo/spdk/test/unit 00:14:26.255 + testdir=/home/vagrant/spdk_repo/spdk/test/unit 00:14:26.255 +++ dirname /home/vagrant/spdk_repo/spdk/test/unit/unittest.sh 00:14:26.255 ++ readlink -f /home/vagrant/spdk_repo/spdk/test/unit/../.. 
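opal_revert_cleanup above walks every NVMe controller and compares its PCI device ID against 0x0a54 (commonly an Intel datacenter NVMe part) before attempting an Opal revert; on this QEMU VM the emulated controller reports 0x0010, so the list stays empty and the function returns 0. The filter reduces to a few lines (a sketch, reading the same sysfs attribute the trace cats):

    # Keep only controllers whose PCI device ID is 0x0a54, the ID the
    # cleanup routine targets; everything else is skipped, as seen above.
    for bdf in "${bdfs[@]}"; do
        [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && printf '%s\n' "$bdf"
    done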
00:14:26.255 + rootdir=/home/vagrant/spdk_repo/spdk 00:14:26.255 + source /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh 00:14:26.255 ++ rpc_py=rpc_cmd 00:14:26.255 ++ set -e 00:14:26.255 ++ shopt -s nullglob 00:14:26.255 ++ shopt -s extglob 00:14:26.255 ++ shopt -s inherit_errexit 00:14:26.255 ++ '[' -z /home/vagrant/spdk_repo/spdk/../output ']' 00:14:26.255 ++ [[ -e /home/vagrant/spdk_repo/spdk/test/common/build_config.sh ]] 00:14:26.255 ++ source /home/vagrant/spdk_repo/spdk/test/common/build_config.sh 00:14:26.255 +++ CONFIG_WPDK_DIR= 00:14:26.255 +++ CONFIG_ASAN=y 00:14:26.255 +++ CONFIG_VBDEV_COMPRESS=n 00:14:26.255 +++ CONFIG_HAVE_EXECINFO_H=y 00:14:26.255 +++ CONFIG_USDT=n 00:14:26.255 +++ CONFIG_CUSTOMOCF=n 00:14:26.255 +++ CONFIG_PREFIX=/usr/local 00:14:26.255 +++ CONFIG_RBD=n 00:14:26.255 +++ CONFIG_LIBDIR= 00:14:26.255 +++ CONFIG_IDXD=y 00:14:26.255 +++ CONFIG_NVME_CUSE=y 00:14:26.255 +++ CONFIG_SMA=n 00:14:26.255 +++ CONFIG_VTUNE=n 00:14:26.255 +++ CONFIG_TSAN=n 00:14:26.255 +++ CONFIG_RDMA_SEND_WITH_INVAL=y 00:14:26.255 +++ CONFIG_VFIO_USER_DIR= 00:14:26.255 +++ CONFIG_PGO_CAPTURE=n 00:14:26.255 +++ CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:14:26.255 +++ CONFIG_ENV=/home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:14:26.256 +++ CONFIG_LTO=n 00:14:26.256 +++ CONFIG_ISCSI_INITIATOR=y 00:14:26.256 +++ CONFIG_CET=n 00:14:26.256 +++ CONFIG_VBDEV_COMPRESS_MLX5=n 00:14:26.256 +++ CONFIG_OCF_PATH= 00:14:26.256 +++ CONFIG_RDMA_SET_TOS=y 00:14:26.256 +++ CONFIG_HAVE_ARC4RANDOM=n 00:14:26.256 +++ CONFIG_HAVE_LIBARCHIVE=n 00:14:26.256 +++ CONFIG_UBLK=n 00:14:26.256 +++ CONFIG_ISAL_CRYPTO=y 00:14:26.256 +++ CONFIG_OPENSSL_PATH= 00:14:26.256 +++ CONFIG_OCF=n 00:14:26.256 +++ CONFIG_FUSE=n 00:14:26.256 +++ CONFIG_VTUNE_DIR= 00:14:26.256 +++ CONFIG_FUZZER_LIB= 00:14:26.256 +++ CONFIG_FUZZER=n 00:14:26.256 +++ CONFIG_DPDK_DIR=/home/vagrant/spdk_repo/dpdk/build 00:14:26.256 +++ CONFIG_CRYPTO=n 00:14:26.256 +++ CONFIG_PGO_USE=n 00:14:26.256 +++ CONFIG_VHOST=y 00:14:26.256 +++ CONFIG_DAOS=n 00:14:26.256 +++ CONFIG_DPDK_INC_DIR=//home/vagrant/spdk_repo/dpdk/build/include 00:14:26.256 +++ CONFIG_DAOS_DIR= 00:14:26.256 +++ CONFIG_UNIT_TESTS=y 00:14:26.256 +++ CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:14:26.256 +++ CONFIG_VIRTIO=y 00:14:26.256 +++ CONFIG_DPDK_UADK=n 00:14:26.256 +++ CONFIG_COVERAGE=y 00:14:26.256 +++ CONFIG_RDMA=y 00:14:26.256 +++ CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:14:26.256 +++ CONFIG_URING_PATH= 00:14:26.256 +++ CONFIG_XNVME=n 00:14:26.256 +++ CONFIG_VFIO_USER=n 00:14:26.256 +++ CONFIG_ARCH=native 00:14:26.256 +++ CONFIG_HAVE_EVP_MAC=y 00:14:26.256 +++ CONFIG_URING_ZNS=n 00:14:26.256 +++ CONFIG_WERROR=y 00:14:26.256 +++ CONFIG_HAVE_LIBBSD=n 00:14:26.256 +++ CONFIG_UBSAN=n 00:14:26.256 +++ CONFIG_IPSEC_MB_DIR= 00:14:26.256 +++ CONFIG_GOLANG=n 00:14:26.256 +++ CONFIG_ISAL=y 00:14:26.256 +++ CONFIG_IDXD_KERNEL=n 00:14:26.256 +++ CONFIG_DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:14:26.256 +++ CONFIG_RDMA_PROV=verbs 00:14:26.256 +++ CONFIG_APPS=y 00:14:26.256 +++ CONFIG_SHARED=n 00:14:26.256 +++ CONFIG_HAVE_KEYUTILS=y 00:14:26.256 +++ CONFIG_FC_PATH= 00:14:26.256 +++ CONFIG_DPDK_PKG_CONFIG=n 00:14:26.256 +++ CONFIG_FC=n 00:14:26.256 +++ CONFIG_AVAHI=n 00:14:26.256 +++ CONFIG_FIO_PLUGIN=y 00:14:26.256 +++ CONFIG_RAID5F=n 00:14:26.256 +++ CONFIG_EXAMPLES=y 00:14:26.256 +++ CONFIG_TESTS=y 00:14:26.256 +++ CONFIG_CRYPTO_MLX5=n 00:14:26.256 +++ CONFIG_MAX_LCORES= 00:14:26.256 +++ CONFIG_IPSEC_MB=n 00:14:26.256 +++ CONFIG_PGO_DIR= 00:14:26.256 +++ CONFIG_DEBUG=y 00:14:26.256 +++ 
CONFIG_DPDK_COMPRESSDEV=n 00:14:26.256 +++ CONFIG_CROSS_PREFIX= 00:14:26.256 +++ CONFIG_URING=n 00:14:26.256 ++ source /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:14:26.256 +++++ dirname /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:14:26.516 ++++ readlink -f /home/vagrant/spdk_repo/spdk/test/common 00:14:26.516 +++ _root=/home/vagrant/spdk_repo/spdk/test/common 00:14:26.516 +++ _root=/home/vagrant/spdk_repo/spdk 00:14:26.516 +++ _app_dir=/home/vagrant/spdk_repo/spdk/build/bin 00:14:26.516 +++ _test_app_dir=/home/vagrant/spdk_repo/spdk/test/app 00:14:26.516 +++ _examples_dir=/home/vagrant/spdk_repo/spdk/build/examples 00:14:26.516 +++ VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:14:26.516 +++ ISCSI_APP=("$_app_dir/iscsi_tgt") 00:14:26.516 +++ NVMF_APP=("$_app_dir/nvmf_tgt") 00:14:26.516 +++ VHOST_APP=("$_app_dir/vhost") 00:14:26.516 +++ DD_APP=("$_app_dir/spdk_dd") 00:14:26.516 +++ SPDK_APP=("$_app_dir/spdk_tgt") 00:14:26.516 +++ [[ -e /home/vagrant/spdk_repo/spdk/include/spdk/config.h ]] 00:14:26.516 +++ [[ #ifndef SPDK_CONFIG_H 00:14:26.516 #define SPDK_CONFIG_H 00:14:26.516 #define SPDK_CONFIG_APPS 1 00:14:26.516 #define SPDK_CONFIG_ARCH native 00:14:26.516 #define SPDK_CONFIG_ASAN 1 00:14:26.516 #undef SPDK_CONFIG_AVAHI 00:14:26.516 #undef SPDK_CONFIG_CET 00:14:26.516 #define SPDK_CONFIG_COVERAGE 1 00:14:26.516 #define SPDK_CONFIG_CROSS_PREFIX 00:14:26.516 #undef SPDK_CONFIG_CRYPTO 00:14:26.516 #undef SPDK_CONFIG_CRYPTO_MLX5 00:14:26.516 #undef SPDK_CONFIG_CUSTOMOCF 00:14:26.516 #undef SPDK_CONFIG_DAOS 00:14:26.516 #define SPDK_CONFIG_DAOS_DIR 00:14:26.516 #define SPDK_CONFIG_DEBUG 1 00:14:26.516 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:14:26.516 #define SPDK_CONFIG_DPDK_DIR /home/vagrant/spdk_repo/dpdk/build 00:14:26.516 #define SPDK_CONFIG_DPDK_INC_DIR //home/vagrant/spdk_repo/dpdk/build/include 00:14:26.516 #define SPDK_CONFIG_DPDK_LIB_DIR /home/vagrant/spdk_repo/dpdk/build/lib 00:14:26.516 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:14:26.516 #undef SPDK_CONFIG_DPDK_UADK 00:14:26.516 #define SPDK_CONFIG_ENV /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:14:26.516 #define SPDK_CONFIG_EXAMPLES 1 00:14:26.516 #undef SPDK_CONFIG_FC 00:14:26.516 #define SPDK_CONFIG_FC_PATH 00:14:26.516 #define SPDK_CONFIG_FIO_PLUGIN 1 00:14:26.516 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:14:26.516 #undef SPDK_CONFIG_FUSE 00:14:26.516 #undef SPDK_CONFIG_FUZZER 00:14:26.516 #define SPDK_CONFIG_FUZZER_LIB 00:14:26.516 #undef SPDK_CONFIG_GOLANG 00:14:26.516 #undef SPDK_CONFIG_HAVE_ARC4RANDOM 00:14:26.516 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:14:26.516 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:14:26.516 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:14:26.516 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:14:26.516 #undef SPDK_CONFIG_HAVE_LIBBSD 00:14:26.516 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:14:26.516 #define SPDK_CONFIG_IDXD 1 00:14:26.516 #undef SPDK_CONFIG_IDXD_KERNEL 00:14:26.516 #undef SPDK_CONFIG_IPSEC_MB 00:14:26.516 #define SPDK_CONFIG_IPSEC_MB_DIR 00:14:26.516 #define SPDK_CONFIG_ISAL 1 00:14:26.516 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:14:26.516 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:14:26.516 #define SPDK_CONFIG_LIBDIR 00:14:26.516 #undef SPDK_CONFIG_LTO 00:14:26.516 #define SPDK_CONFIG_MAX_LCORES 00:14:26.516 #define SPDK_CONFIG_NVME_CUSE 1 00:14:26.516 #undef SPDK_CONFIG_OCF 00:14:26.516 #define SPDK_CONFIG_OCF_PATH 00:14:26.516 #define SPDK_CONFIG_OPENSSL_PATH 00:14:26.516 #undef SPDK_CONFIG_PGO_CAPTURE 00:14:26.516 #define SPDK_CONFIG_PGO_DIR 
00:14:26.516 #undef SPDK_CONFIG_PGO_USE 00:14:26.516 #define SPDK_CONFIG_PREFIX /usr/local 00:14:26.516 #undef SPDK_CONFIG_RAID5F 00:14:26.516 #undef SPDK_CONFIG_RBD 00:14:26.516 #define SPDK_CONFIG_RDMA 1 00:14:26.516 #define SPDK_CONFIG_RDMA_PROV verbs 00:14:26.516 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:14:26.516 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:14:26.516 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:14:26.516 #undef SPDK_CONFIG_SHARED 00:14:26.516 #undef SPDK_CONFIG_SMA 00:14:26.516 #define SPDK_CONFIG_TESTS 1 00:14:26.516 #undef SPDK_CONFIG_TSAN 00:14:26.516 #undef SPDK_CONFIG_UBLK 00:14:26.516 #undef SPDK_CONFIG_UBSAN 00:14:26.516 #define SPDK_CONFIG_UNIT_TESTS 1 00:14:26.516 #undef SPDK_CONFIG_URING 00:14:26.516 #define SPDK_CONFIG_URING_PATH 00:14:26.516 #undef SPDK_CONFIG_URING_ZNS 00:14:26.516 #undef SPDK_CONFIG_USDT 00:14:26.516 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:14:26.516 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:14:26.516 #undef SPDK_CONFIG_VFIO_USER 00:14:26.517 #define SPDK_CONFIG_VFIO_USER_DIR 00:14:26.517 #define SPDK_CONFIG_VHOST 1 00:14:26.517 #define SPDK_CONFIG_VIRTIO 1 00:14:26.517 #undef SPDK_CONFIG_VTUNE 00:14:26.517 #define SPDK_CONFIG_VTUNE_DIR 00:14:26.517 #define SPDK_CONFIG_WERROR 1 00:14:26.517 #define SPDK_CONFIG_WPDK_DIR 00:14:26.517 #undef SPDK_CONFIG_XNVME 00:14:26.517 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:14:26.517 +++ (( SPDK_AUTOTEST_DEBUG_APPS )) 00:14:26.517 ++ source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:14:26.517 +++ [[ -e /bin/wpdk_common.sh ]] 00:14:26.517 +++ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:14:26.517 +++ source /etc/opt/spdk-pkgdep/paths/export.sh 00:14:26.517 ++++ PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:14:26.517 ++++ PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:14:26.517 ++++ PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:14:26.517 ++++ export PATH 00:14:26.517 ++++ echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:14:26.517 ++ source /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:14:26.517 +++++ dirname /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:14:26.517 ++++ readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:14:26.517 +++ _pmdir=/home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:14:26.517 ++++ readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm/../../../ 00:14:26.517 +++ _pmrootdir=/home/vagrant/spdk_repo/spdk 00:14:26.517 +++ TEST_TAG=N/A 00:14:26.517 +++ TEST_TAG_FILE=/home/vagrant/spdk_repo/spdk/.run_test_name 00:14:26.517 +++ PM_OUTPUTDIR=/home/vagrant/spdk_repo/spdk/../output/power 00:14:26.517 ++++ 
uname -s 00:14:26.517 +++ PM_OS=Linux 00:14:26.517 +++ MONITOR_RESOURCES_SUDO=() 00:14:26.517 +++ declare -A MONITOR_RESOURCES_SUDO 00:14:26.517 +++ MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:14:26.517 +++ MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:14:26.517 +++ MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:14:26.517 +++ MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:14:26.517 +++ SUDO[0]= 00:14:26.517 +++ SUDO[1]='sudo -E' 00:14:26.517 +++ MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:14:26.517 +++ [[ Linux == FreeBSD ]] 00:14:26.517 +++ [[ Linux == Linux ]] 00:14:26.517 +++ [[ QEMU != QEMU ]] 00:14:26.517 +++ [[ ! -d /home/vagrant/spdk_repo/spdk/../output/power ]] 00:14:26.517 ++ : 1 00:14:26.517 ++ export RUN_NIGHTLY 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_AUTOTEST_DEBUG_APPS 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_RUN_VALGRIND 00:14:26.517 ++ : 1 00:14:26.517 ++ export SPDK_RUN_FUNCTIONAL_TEST 00:14:26.517 ++ : 1 00:14:26.517 ++ export SPDK_TEST_UNITTEST 00:14:26.517 ++ : 00:14:26.517 ++ export SPDK_TEST_AUTOBUILD 00:14:26.517 ++ : 1 00:14:26.517 ++ export SPDK_TEST_RELEASE_BUILD 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_ISAL 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_ISCSI 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_ISCSI_INITIATOR 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVME 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVME_PMR 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVME_BP 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVME_CLI 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVME_CUSE 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVME_FDP 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVMF 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_VFIOUSER 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_VFIOUSER_QEMU 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_FUZZER 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_FUZZER_SHORT 00:14:26.517 ++ : rdma 00:14:26.517 ++ export SPDK_TEST_NVMF_TRANSPORT 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_RBD 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_VHOST 00:14:26.517 ++ : 1 00:14:26.517 ++ export SPDK_TEST_BLOCKDEV 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_IOAT 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_BLOBFS 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_VHOST_INIT 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_LVOL 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_VBDEV_COMPRESS 00:14:26.517 ++ : 1 00:14:26.517 ++ export SPDK_RUN_ASAN 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_RUN_UBSAN 00:14:26.517 ++ : /home/vagrant/spdk_repo/dpdk/build 00:14:26.517 ++ export SPDK_RUN_EXTERNAL_DPDK 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_RUN_NON_ROOT 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_CRYPTO 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_FTL 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_OCF 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_VMD 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_OPAL 00:14:26.517 ++ : v22.11.4 00:14:26.517 ++ export SPDK_TEST_NATIVE_DPDK 00:14:26.517 ++ : true 00:14:26.517 ++ export SPDK_AUTOTEST_X 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_RAID5 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_URING 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_USDT 00:14:26.517 ++ : 1 00:14:26.517 ++ export SPDK_TEST_USE_IGB_UIO 00:14:26.517 ++ : 0 00:14:26.517 ++ export 
SPDK_TEST_SCHEDULER 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_SCANBUILD 00:14:26.517 ++ : 00:14:26.517 ++ export SPDK_TEST_NVMF_NICS 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_SMA 00:14:26.517 ++ : 1 00:14:26.517 ++ export SPDK_TEST_DAOS 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_XNVME 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_ACCEL_DSA 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_ACCEL_IAA 00:14:26.517 ++ : 00:14:26.517 ++ export SPDK_TEST_FUZZER_TARGET 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_TEST_NVMF_MDNS 00:14:26.517 ++ : 0 00:14:26.517 ++ export SPDK_JSONRPC_GO_CLIENT 00:14:26.517 ++ export SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:14:26.517 ++ SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:14:26.517 ++ export DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:14:26.517 ++ DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:14:26.517 ++ export VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:14:26.517 ++ VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:14:26.517 ++ export LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:14:26.517 ++ LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:14:26.517 ++ export PCI_BLOCK_SYNC_ON_RESET=yes 00:14:26.517 ++ PCI_BLOCK_SYNC_ON_RESET=yes 00:14:26.517 ++ export PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:14:26.517 ++ PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:14:26.517 ++ export PYTHONDONTWRITEBYTECODE=1 00:14:26.517 ++ PYTHONDONTWRITEBYTECODE=1 00:14:26.517 ++ export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:14:26.518 ++ ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:14:26.518 ++ export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:14:26.518 ++ UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:14:26.518 ++ asan_suppression_file=/var/tmp/asan_suppression_file 00:14:26.518 ++ rm -rf /var/tmp/asan_suppression_file 00:14:26.518 ++ cat 00:14:26.518 ++ echo leak:libfuse3.so 00:14:26.518 ++ export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:14:26.518 ++ LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:14:26.518 ++ export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:14:26.518 ++ DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:14:26.518 ++ '[' -z /var/spdk/dependencies ']' 00:14:26.518 ++ export DEPENDENCY_DIR 00:14:26.518 ++ export 
SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:14:26.518 ++ SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:14:26.518 ++ export SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:14:26.518 ++ SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:14:26.518 ++ export QEMU_BIN= 00:14:26.518 ++ QEMU_BIN= 00:14:26.518 ++ export 'VFIO_QEMU_BIN=/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:14:26.518 ++ VFIO_QEMU_BIN='/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:14:26.518 ++ export AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:14:26.518 ++ AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:14:26.518 ++ export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:14:26.518 ++ UNBIND_ENTIRE_IOMMU_GROUP=yes 00:14:26.518 ++ '[' 0 -eq 0 ']' 00:14:26.518 ++ export valgrind= 00:14:26.518 ++ valgrind= 00:14:26.518 +++ uname -s 00:14:26.518 ++ '[' Linux = Linux ']' 00:14:26.518 ++ HUGEMEM=4096 00:14:26.518 ++ export CLEAR_HUGE=yes 00:14:26.518 ++ CLEAR_HUGE=yes 00:14:26.518 ++ [[ 0 -eq 1 ]] 00:14:26.518 ++ [[ 0 -eq 1 ]] 00:14:26.518 ++ MAKE=make 00:14:26.518 +++ nproc 00:14:26.518 ++ MAKEFLAGS=-j10 00:14:26.518 ++ export HUGEMEM=4096 00:14:26.518 ++ HUGEMEM=4096 00:14:26.518 ++ NO_HUGE=() 00:14:26.518 ++ TEST_MODE= 00:14:26.518 ++ [[ -z '' ]] 00:14:26.518 ++ PYTHONPATH+=:/home/vagrant/spdk_repo/spdk/test/rpc_plugins 00:14:26.518 ++ exec 00:14:26.518 ++ PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins 00:14:26.518 ++ /home/vagrant/spdk_repo/spdk/scripts/rpc.py --server 00:14:26.518 ++ set_test_storage 2147483648 00:14:26.518 ++ [[ -v testdir ]] 00:14:26.518 ++ local requested_size=2147483648 00:14:26.518 ++ local mount target_dir 00:14:26.518 ++ local -A mounts fss sizes avails uses 00:14:26.518 ++ local source fs size avail mount use 00:14:26.518 ++ local storage_fallback storage_candidates 00:14:26.518 +++ mktemp -udt spdk.XXXXXX 00:14:26.518 ++ storage_fallback=/tmp/spdk.A6Kub1 00:14:26.518 ++ storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:14:26.518 ++ [[ -n '' ]] 00:14:26.518 ++ [[ -n '' ]] 00:14:26.518 ++ mkdir -p /home/vagrant/spdk_repo/spdk/test/unit /tmp/spdk.A6Kub1/tests/unit /tmp/spdk.A6Kub1 00:14:26.518 ++ requested_size=2214592512 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 +++ df -T 00:14:26.518 +++ grep -v Filesystem 00:14:26.518 ++ mounts["$mount"]=devtmpfs 00:14:26.518 ++ fss["$mount"]=devtmpfs 00:14:26.518 ++ avails["$mount"]=4194304 00:14:26.518 ++ sizes["$mount"]=4194304 00:14:26.518 ++ uses["$mount"]=0 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 ++ mounts["$mount"]=tmpfs 00:14:26.518 ++ fss["$mount"]=tmpfs 00:14:26.518 ++ avails["$mount"]=6270406656 00:14:26.518 ++ sizes["$mount"]=6270406656 00:14:26.518 ++ uses["$mount"]=0 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 ++ mounts["$mount"]=tmpfs 00:14:26.518 ++ fss["$mount"]=tmpfs 00:14:26.518 ++ avails["$mount"]=2490781696 00:14:26.518 ++ sizes["$mount"]=2508165120 00:14:26.518 ++ uses["$mount"]=17383424 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 ++ mounts["$mount"]=/dev/vda5 00:14:26.518 ++ fss["$mount"]=xfs 00:14:26.518 ++ avails["$mount"]=12237733888 00:14:26.518 ++ sizes["$mount"]=20303577088 00:14:26.518 ++ uses["$mount"]=8065843200 00:14:26.518 ++ read -r source fs size use avail _ mount 
00:14:26.518 ++ mounts["$mount"]=/dev/vda2 00:14:26.518 ++ fss["$mount"]=xfs 00:14:26.518 ++ avails["$mount"]=896184320 00:14:26.518 ++ sizes["$mount"]=1042161664 00:14:26.518 ++ uses["$mount"]=145977344 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 ++ mounts["$mount"]=/dev/vda1 00:14:26.518 ++ fss["$mount"]=vfat 00:14:26.518 ++ avails["$mount"]=97312768 00:14:26.518 ++ sizes["$mount"]=104607744 00:14:26.518 ++ uses["$mount"]=7294976 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 ++ mounts["$mount"]=tmpfs 00:14:26.518 ++ fss["$mount"]=tmpfs 00:14:26.518 ++ avails["$mount"]=1254076416 00:14:26.518 ++ sizes["$mount"]=1254080512 00:14:26.518 ++ uses["$mount"]=4096 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 ++ mounts["$mount"]=:/mnt/jenkins_nvme/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt/output 00:14:26.518 ++ fss["$mount"]=fuse.sshfs 00:14:26.518 ++ avails["$mount"]=91973607424 00:14:26.518 ++ sizes["$mount"]=105088212992 00:14:26.518 ++ uses["$mount"]=7729172480 00:14:26.518 ++ read -r source fs size use avail _ mount 00:14:26.518 ++ printf '* Looking for test storage...\n' 00:14:26.518 * Looking for test storage... 00:14:26.518 ++ local target_space new_size 00:14:26.518 ++ for target_dir in "${storage_candidates[@]}" 00:14:26.518 +++ df /home/vagrant/spdk_repo/spdk/test/unit 00:14:26.518 +++ awk '$1 !~ /Filesystem/{print $6}' 00:14:26.518 ++ mount=/ 00:14:26.518 ++ target_space=12237733888 00:14:26.518 ++ (( target_space == 0 || target_space < requested_size )) 00:14:26.518 ++ (( target_space >= requested_size )) 00:14:26.518 ++ [[ xfs == tmpfs ]] 00:14:26.518 ++ [[ xfs == ramfs ]] 00:14:26.518 ++ [[ / == / ]] 00:14:26.518 ++ new_size=10280435712 00:14:26.518 ++ (( new_size * 100 / sizes[/] > 95 )) 00:14:26.518 ++ export SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/unit 00:14:26.518 ++ SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/unit 00:14:26.518 ++ printf '* Found test storage at %s\n' /home/vagrant/spdk_repo/spdk/test/unit 00:14:26.518 * Found test storage at /home/vagrant/spdk_repo/spdk/test/unit 00:14:26.518 ++ return 0 00:14:26.518 ++ set -o errtrace 00:14:26.518 ++ shopt -s extdebug 00:14:26.518 ++ trap 'trap - ERR; print_backtrace >&2' ERR 00:14:26.518 ++ PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@1686 -- # true 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@1688 -- # xtrace_fd 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@25 -- # [[ -n '' ]] 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@29 -- # exec 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@31 -- # xtrace_restore 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:14:26.518 12:16:49 unittest -- common/autotest_common.sh@18 -- # set -x 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@17 -- # cd /home/vagrant/spdk_repo/spdk 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@153 -- # '[' 0 -eq 1 ']' 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@160 -- # '[' -z x ']' 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@167 -- # '[' 0 -eq 1 ']' 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@180 -- # grep CC_TYPE /home/vagrant/spdk_repo/spdk/mk/cc.mk 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@180 -- # CC_TYPE=CC_TYPE=gcc 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@181 -- # hash lcov 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@181 -- # grep -q '#define SPDK_CONFIG_COVERAGE 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@181 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@182 -- # cov_avail=yes 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@186 -- # '[' yes = yes ']' 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@188 -- # [[ -z /home/vagrant/spdk_repo/spdk/../output ]] 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@191 -- # UT_COVERAGE=/home/vagrant/spdk_repo/spdk/../output/ut_coverage 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@193 -- # mkdir -p /home/vagrant/spdk_repo/spdk/../output/ut_coverage 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@201 -- # export 'LCOV_OPTS= 00:14:26.518 --rc lcov_branch_coverage=1 00:14:26.518 --rc lcov_function_coverage=1 00:14:26.518 --rc genhtml_branch_coverage=1 00:14:26.518 --rc genhtml_function_coverage=1 00:14:26.518 --rc genhtml_legend=1 00:14:26.518 --rc geninfo_all_blocks=1 00:14:26.518 ' 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@201 -- # LCOV_OPTS=' 00:14:26.518 --rc lcov_branch_coverage=1 00:14:26.518 --rc lcov_function_coverage=1 00:14:26.518 --rc genhtml_branch_coverage=1 00:14:26.518 --rc genhtml_function_coverage=1 00:14:26.518 --rc genhtml_legend=1 00:14:26.518 --rc geninfo_all_blocks=1 00:14:26.518 ' 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@202 -- # export 'LCOV=lcov 00:14:26.518 --rc lcov_branch_coverage=1 00:14:26.518 --rc lcov_function_coverage=1 00:14:26.518 --rc genhtml_branch_coverage=1 00:14:26.518 --rc genhtml_function_coverage=1 00:14:26.518 --rc genhtml_legend=1 00:14:26.518 --rc geninfo_all_blocks=1 00:14:26.518 --no-external' 00:14:26.518 12:16:49 unittest -- unit/unittest.sh@202 -- # LCOV='lcov 00:14:26.518 --rc lcov_branch_coverage=1 00:14:26.518 --rc lcov_function_coverage=1 00:14:26.518 --rc genhtml_branch_coverage=1 00:14:26.518 --rc genhtml_function_coverage=1 00:14:26.518 --rc genhtml_legend=1 00:14:26.518 --rc geninfo_all_blocks=1 00:14:26.518 --no-external' 00:14:26.519 12:16:49 unittest -- unit/unittest.sh@204 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -d . 
-t Baseline -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_base.info 00:14:38.724 /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:14:38.724 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_stubs.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc32.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc32.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/accel_module.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_zone.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/assert.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/assert.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/barrier.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/barrier.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/base64.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/base64.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/dma.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/dma.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bdev_module.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_array.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/bit_pool.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob_bdev.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blob.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc16.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc16.gcno 00:14:48.696 
/home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/blobfs_bdev.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/conf.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/conf.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/config.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/config.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/dif.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/dif.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/cpuset.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc64.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/crc64.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/endian.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/endian.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/env.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/env.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/env_dpdk.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/event.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/event.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/reduce.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/reduce.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd.gcno:no functions found 00:14:48.696 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd.gcno 00:14:48.696 /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/fd_group.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/file.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/file.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ftl.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ftl.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for 
/home/vagrant/spdk_repo/spdk/test/cpp_headers/gpt_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/hexlify.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/rpc.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/rpc.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/queue.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/queue.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/histogram_data.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/idxd_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/init.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/init.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ioat_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/iscsi_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/json.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/json.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/jsonrpc.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/opal_spec.gcno 00:14:48.697 
/home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/keyring_module.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/likely.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/likely.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/log.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/log.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/lvol.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/lvol.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/memory.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/memory.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/mmio.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/mmio.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/sock.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/sock.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/stdinc.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nbd.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nbd.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/notify.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/notify.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/string.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/string.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_intel.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/pci_ids.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did 
not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvme_zns.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/pipe.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/pipe.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_cmd.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/nvmf_transport.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/scheduler.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/scsi_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/thread.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/thread.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/trace_parser.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/tree.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/tree.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/ublk.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/ublk.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/util.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/util.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/uuid.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/uuid.gcno 00:14:48.697 
/home/vagrant/spdk_repo/spdk/test/cpp_headers/version.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/version.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_pci.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vfio_user_spec.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vhost.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vhost.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/vmd.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/vmd.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/xor.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/xor.gcno 00:14:48.697 /home/vagrant/spdk_repo/spdk/test/cpp_headers/zipf.gcno:no functions found 00:14:48.697 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/test/cpp_headers/zipf.gcno 00:15:20.777 12:17:39 unittest -- unit/unittest.sh@208 -- # uname -m 00:15:20.777 12:17:39 unittest -- unit/unittest.sh@208 -- # '[' x86_64 = aarch64 ']' 00:15:20.777 12:17:39 unittest -- unit/unittest.sh@212 -- # run_test unittest_pci_event /home/vagrant/spdk_repo/spdk/test/unit/lib/env_dpdk/pci_event.c/pci_event_ut 00:15:20.777 12:17:39 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:20.777 12:17:39 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:20.777 12:17:39 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:20.777 ************************************ 00:15:20.777 START TEST unittest_pci_event 00:15:20.777 ************************************ 00:15:20.777 12:17:39 unittest.unittest_pci_event -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/env_dpdk/pci_event.c/pci_event_ut 00:15:20.777 00:15:20.777 00:15:20.777 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.777 http://cunit.sourceforge.net/ 00:15:20.777 00:15:20.777 00:15:20.777 Suite: pci_event 00:15:20.777 Test: test_pci_parse_event ...[2024-06-07 12:17:39.134405] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/pci_event.c: 162:parse_subsystem_event: *ERROR*: Invalid format for PCI device BDF: 0000 00:15:20.777 passed 00:15:20.777 00:15:20.777 [2024-06-07 12:17:39.134789] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/pci_event.c: 185:parse_subsystem_event: *ERROR*: Invalid format for PCI device BDF: 000000 00:15:20.777 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.777 suites 1 1 n/a 0 0 00:15:20.777 tests 1 1 1 0 0 00:15:20.777 asserts 15 15 15 0 n/a 00:15:20.777 00:15:20.777 Elapsed time = 0.001 seconds 00:15:20.777 00:15:20.777 real 0m0.040s 00:15:20.777 user 0m0.021s 00:15:20.777 sys 0m0.012s 00:15:20.777 12:17:39 unittest.unittest_pci_event -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:20.777 ************************************ 00:15:20.777 END TEST unittest_pci_event 00:15:20.777 ************************************ 00:15:20.777 12:17:39 
unittest.unittest_pci_event -- common/autotest_common.sh@10 -- # set +x 00:15:20.777 12:17:39 unittest -- unit/unittest.sh@213 -- # run_test unittest_include /home/vagrant/spdk_repo/spdk/test/unit/include/spdk/histogram_data.h/histogram_ut 00:15:20.777 12:17:39 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:20.777 12:17:39 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:20.777 12:17:39 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:20.777 ************************************ 00:15:20.777 START TEST unittest_include 00:15:20.777 ************************************ 00:15:20.777 12:17:39 unittest.unittest_include -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/include/spdk/histogram_data.h/histogram_ut 00:15:20.777 00:15:20.777 00:15:20.777 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.777 http://cunit.sourceforge.net/ 00:15:20.777 00:15:20.777 00:15:20.777 Suite: histogram 00:15:20.777 Test: histogram_test ...passed 00:15:20.777 Test: histogram_merge ...passed 00:15:20.777 00:15:20.777 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.777 suites 1 1 n/a 0 0 00:15:20.778 tests 2 2 2 0 0 00:15:20.778 asserts 50 50 50 0 n/a 00:15:20.778 00:15:20.778 Elapsed time = 0.003 seconds 00:15:20.778 00:15:20.778 real 0m0.032s 00:15:20.778 user 0m0.015s 00:15:20.778 sys 0m0.017s 00:15:20.778 12:17:39 unittest.unittest_include -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:20.778 12:17:39 unittest.unittest_include -- common/autotest_common.sh@10 -- # set +x 00:15:20.778 ************************************ 00:15:20.778 END TEST unittest_include 00:15:20.778 ************************************ 00:15:20.778 12:17:39 unittest -- unit/unittest.sh@214 -- # run_test unittest_bdev unittest_bdev 00:15:20.778 12:17:39 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:20.778 12:17:39 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:20.778 12:17:39 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:20.778 ************************************ 00:15:20.778 START TEST unittest_bdev 00:15:20.778 ************************************ 00:15:20.778 12:17:39 unittest.unittest_bdev -- common/autotest_common.sh@1124 -- # unittest_bdev 00:15:20.778 12:17:39 unittest.unittest_bdev -- unit/unittest.sh@20 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/bdev.c/bdev_ut 00:15:20.778 00:15:20.778 00:15:20.778 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.778 http://cunit.sourceforge.net/ 00:15:20.778 00:15:20.778 00:15:20.778 Suite: bdev 00:15:20.778 Test: bytes_to_blocks_test ...passed 00:15:20.778 Test: num_blocks_test ...passed 00:15:20.778 Test: io_valid_test ...passed 00:15:20.778 Test: open_write_test ...[2024-06-07 12:17:39.436030] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8035:bdev_open: *ERROR*: bdev bdev1 already claimed: type exclusive_write by module bdev_ut 00:15:20.778 [2024-06-07 12:17:39.436430] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8035:bdev_open: *ERROR*: bdev bdev4 already claimed: type exclusive_write by module bdev_ut 00:15:20.778 [2024-06-07 12:17:39.436561] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8035:bdev_open: *ERROR*: bdev bdev5 already claimed: type exclusive_write by module bdev_ut 00:15:20.778 passed 00:15:20.778 Test: claim_test ...passed 00:15:20.778 Test: alias_add_del_test ...[2024-06-07 12:17:39.583061] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:4580:bdev_name_add: *ERROR*: Bdev name 
bdev0 already exists 00:15:20.778 [2024-06-07 12:17:39.583287] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:4610:spdk_bdev_alias_add: *ERROR*: Empty alias passed 00:15:20.778 [2024-06-07 12:17:39.583368] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:4580:bdev_name_add: *ERROR*: Bdev name proper alias 0 already exists 00:15:20.778 passed 00:15:20.778 Test: get_device_stat_test ...passed 00:15:20.778 Test: bdev_io_types_test ...passed 00:15:20.778 Test: bdev_io_wait_test ...passed 00:15:20.778 Test: bdev_io_spans_split_test ...passed 00:15:20.778 Test: bdev_io_boundary_split_test ...passed 00:15:20.778 Test: bdev_io_max_size_and_segment_split_test ...[2024-06-07 12:17:39.832681] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:3208:_bdev_rw_split: *ERROR*: The first child io was less than a block size 00:15:20.778 passed 00:15:20.778 Test: bdev_io_mix_split_test ...passed 00:15:20.778 Test: bdev_io_split_with_io_wait ...passed 00:15:20.778 Test: bdev_io_write_unit_split_test ...[2024-06-07 12:17:39.990036] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:2759:bdev_io_do_submit: *ERROR*: IO num_blocks 31 does not match the write_unit_size 32 00:15:20.778 [2024-06-07 12:17:39.990149] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:2759:bdev_io_do_submit: *ERROR*: IO num_blocks 31 does not match the write_unit_size 32 00:15:20.778 [2024-06-07 12:17:39.990181] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:2759:bdev_io_do_submit: *ERROR*: IO num_blocks 1 does not match the write_unit_size 32 00:15:20.778 [2024-06-07 12:17:39.990207] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:2759:bdev_io_do_submit: *ERROR*: IO num_blocks 32 does not match the write_unit_size 64 00:15:20.778 passed 00:15:20.778 Test: bdev_io_alignment_with_boundary ...passed 00:15:20.778 Test: bdev_io_alignment ...passed 00:15:20.778 Test: bdev_histograms ...passed 00:15:20.778 Test: bdev_write_zeroes ...passed 00:15:20.778 Test: bdev_compare_and_write ...passed 00:15:20.778 Test: bdev_compare ...passed 00:15:20.778 Test: bdev_compare_emulated ...passed 00:15:20.778 Test: bdev_zcopy_write ...passed 00:15:20.778 Test: bdev_zcopy_read ...passed 00:15:20.778 Test: bdev_open_while_hotremove ...passed 00:15:20.778 Test: bdev_close_while_hotremove ...passed 00:15:20.778 Test: bdev_open_ext_test ...passed 00:15:20.778 Test: bdev_open_ext_unregister ...[2024-06-07 12:17:40.604896] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8141:spdk_bdev_open_ext: *ERROR*: Missing event callback function 00:15:20.778 passed 00:15:20.778 Test: bdev_set_io_timeout ...[2024-06-07 12:17:40.605058] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8141:spdk_bdev_open_ext: *ERROR*: Missing event callback function 00:15:20.778 passed 00:15:20.778 Test: bdev_set_qd_sampling ...passed 00:15:20.778 Test: lba_range_overlap ...passed 00:15:20.778 Test: lock_lba_range_check_ranges ...passed 00:15:20.778 Test: lock_lba_range_with_io_outstanding ...passed 00:15:20.778 Test: lock_lba_range_overlapped ...passed 00:15:20.778 Test: bdev_quiesce ...[2024-06-07 12:17:40.887495] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:10064:_spdk_bdev_quiesce: *ERROR*: The range to unquiesce was not found. 
00:15:20.778 passed 00:15:20.778 Test: bdev_io_abort ...passed 00:15:20.778 Test: bdev_unmap ...passed 00:15:20.778 Test: bdev_write_zeroes_split_test ...passed 00:15:20.778 Test: bdev_set_options_test ...[2024-06-07 12:17:41.068361] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c: 502:spdk_bdev_set_opts: *ERROR*: opts_size inside opts cannot be zero value 00:15:20.778 passed 00:15:20.778 Test: bdev_get_memory_domains ...passed 00:15:20.778 Test: bdev_io_ext ...passed 00:15:20.778 Test: bdev_io_ext_no_opts ...passed 00:15:20.778 Test: bdev_io_ext_invalid_opts ...passed 00:15:20.778 Test: bdev_io_ext_split ...passed 00:15:20.778 Test: bdev_io_ext_bounce_buffer ...passed 00:15:20.778 Test: bdev_register_uuid_alias ...[2024-06-07 12:17:41.352596] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:4580:bdev_name_add: *ERROR*: Bdev name 0785afc3-882c-4c18-a1d8-71b62bf992af already exists 00:15:20.778 [2024-06-07 12:17:41.352677] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:7696:bdev_register: *ERROR*: Unable to add uuid:0785afc3-882c-4c18-a1d8-71b62bf992af alias for bdev bdev0 00:15:20.778 passed 00:15:20.778 Test: bdev_unregister_by_name ...[2024-06-07 12:17:41.380988] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:7931:spdk_bdev_unregister_by_name: *ERROR*: Failed to open bdev with name: bdev1 00:15:20.778 [2024-06-07 12:17:41.381072] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:7939:spdk_bdev_unregister_by_name: *ERROR*: Bdev bdev was not registered by the specified module. 00:15:20.778 passed 00:15:20.778 Test: for_each_bdev_test ...passed 00:15:20.778 Test: bdev_seek_test ...passed 00:15:20.778 Test: bdev_copy ...passed 00:15:20.778 Test: bdev_copy_split_test ...passed 00:15:20.778 Test: examine_locks ...passed 00:15:20.778 Test: claim_v2_rwo ...[2024-06-07 12:17:41.538277] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8035:bdev_open: *ERROR*: bdev bdev0 already claimed: type read_many_write_one by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538366] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8665:claim_verify_rwo: *ERROR*: bdev bdev0 already claimed: type read_many_write_one by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538383] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type read_many_write_one by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538450] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type read_many_write_one by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538467] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8502:spdk_bdev_module_claim_bdev: *ERROR*: bdev bdev0 already claimed: type read_many_write_one by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538514] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8660:claim_verify_rwo: *ERROR*: bdev0: key option not supported with read-write-once claims 00:15:20.778 passed 00:15:20.778 Test: claim_v2_rom ...[2024-06-07 12:17:41.538604] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8035:bdev_open: *ERROR*: bdev bdev0 already claimed: type read_many_write_none by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538645] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type read_many_write_none by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538663] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type read_many_write_none by module 
bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538685] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8502:spdk_bdev_module_claim_bdev: *ERROR*: bdev bdev0 already claimed: type read_many_write_none by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538735] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8703:claim_verify_rom: *ERROR*: bdev0: key option not supported with read-only-may claims 00:15:20.778 passed 00:15:20.778 Test: claim_v2_rwm ...passed 00:15:20.778 Test: claim_v2_existing_writer ...[2024-06-07 12:17:41.538772] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8698:claim_verify_rom: *ERROR*: bdev0: Cannot obtain read-only-many claim with writable descriptor 00:15:20.778 [2024-06-07 12:17:41.538849] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8733:claim_verify_rwm: *ERROR*: bdev0: shared_claim_key option required with read-write-may claims 00:15:20.778 [2024-06-07 12:17:41.538896] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8035:bdev_open: *ERROR*: bdev bdev0 already claimed: type read_many_write_many by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538918] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type read_many_write_many by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538940] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type read_many_write_many by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538956] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8502:spdk_bdev_module_claim_bdev: *ERROR*: bdev bdev0 already claimed: type read_many_write_many by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.538983] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8753:claim_verify_rwm: *ERROR*: bdev bdev0 already claimed with another key: type read_many_write_many by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.539009] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8733:claim_verify_rwm: *ERROR*: bdev0: shared_claim_key option required with read-write-may claims 00:15:20.778 passed 00:15:20.778 Test: claim_v2_existing_v1 ...[2024-06-07 12:17:41.539096] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8698:claim_verify_rom: *ERROR*: bdev0: Cannot obtain read-only-many claim with writable descriptor 00:15:20.778 [2024-06-07 12:17:41.539119] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8698:claim_verify_rom: *ERROR*: bdev0: Cannot obtain read-only-many claim with writable descriptor 00:15:20.778 [2024-06-07 12:17:41.539206] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type exclusive_write by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.539248] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type exclusive_write by module bdev_ut 00:15:20.778 [2024-06-07 12:17:41.539264] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type exclusive_write by module bdev_ut 00:15:20.778 passed 00:15:20.778 Test: claim_v1_existing_v2 ...[2024-06-07 12:17:41.539378] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8502:spdk_bdev_module_claim_bdev: *ERROR*: bdev bdev0 already claimed: type read_many_write_one by module bdev_ut 00:15:20.779 passed 00:15:20.779 Test: examine_claimed ...passed 00:15:20.779 00:15:20.779 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.779 suites 1 1 n/a 0 0 00:15:20.779 tests 59 59 59 0 0 00:15:20.779 asserts 
4599 4599 4599 0 n/a 00:15:20.779 00:15:20.779 Elapsed time = 2.199 seconds 00:15:20.779 [2024-06-07 12:17:41.539420] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8502:spdk_bdev_module_claim_bdev: *ERROR*: bdev bdev0 already claimed: type read_many_write_many by module bdev_ut 00:15:20.779 [2024-06-07 12:17:41.539461] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8502:spdk_bdev_module_claim_bdev: *ERROR*: bdev bdev0 already claimed: type read_many_write_none by module bdev_ut 00:15:20.779 [2024-06-07 12:17:41.539640] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8830:spdk_bdev_module_claim_bdev_desc: *ERROR*: bdev bdev0 already claimed: type read_many_write_one by module vbdev_ut_examine1 00:15:20.779 12:17:41 unittest.unittest_bdev -- unit/unittest.sh@21 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/nvme/bdev_nvme.c/bdev_nvme_ut 00:15:20.779 00:15:20.779 00:15:20.779 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.779 http://cunit.sourceforge.net/ 00:15:20.779 00:15:20.779 00:15:20.779 Suite: nvme 00:15:20.779 Test: test_create_ctrlr ...passed 00:15:20.779 Test: test_reset_ctrlr ...[2024-06-07 12:17:41.594105] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_race_between_reset_and_destruct_ctrlr ...passed 00:15:20.779 Test: test_failover_ctrlr ...passed 00:15:20.779 Test: test_race_between_failover_and_add_secondary_trid ...[2024-06-07 12:17:41.596500] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.596740] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.596958] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_pending_reset ...[2024-06-07 12:17:41.598437] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.598661] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_attach_ctrlr ...[2024-06-07 12:17:41.599797] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:4308:nvme_bdev_create: *ERROR*: spdk_bdev_register() failed 00:15:20.779 passed 00:15:20.779 Test: test_aer_cb ...passed 00:15:20.779 Test: test_submit_nvme_cmd ...passed 00:15:20.779 Test: test_add_remove_trid ...passed 00:15:20.779 Test: test_abort ...[2024-06-07 12:17:41.602950] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:7447:bdev_nvme_comparev_and_writev_done: *ERROR*: Unexpected write success after compare failure. 00:15:20.779 passed 00:15:20.779 Test: test_get_io_qpair ...passed 00:15:20.779 Test: test_bdev_unregister ...passed 00:15:20.779 Test: test_compare_ns ...passed 00:15:20.779 Test: test_init_ana_log_page ...passed 00:15:20.779 Test: test_get_memory_domains ...passed 00:15:20.779 Test: test_reconnect_qpair ...[2024-06-07 12:17:41.605380] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 
00:15:20.779 passed 00:15:20.779 Test: test_create_bdev_ctrlr ...[2024-06-07 12:17:41.605886] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:5373:bdev_nvme_check_multipath: *ERROR*: cntlid 18 are duplicated. 00:15:20.779 passed 00:15:20.779 Test: test_add_multi_ns_to_bdev ...[2024-06-07 12:17:41.607105] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:4564:nvme_bdev_add_ns: *ERROR*: Namespaces are not identical. 00:15:20.779 passed 00:15:20.779 Test: test_add_multi_io_paths_to_nbdev_ch ...passed 00:15:20.779 Test: test_admin_path ...passed 00:15:20.779 Test: test_reset_bdev_ctrlr ...passed 00:15:20.779 Test: test_find_io_path ...passed 00:15:20.779 Test: test_retry_io_if_ana_state_is_updating ...passed 00:15:20.779 Test: test_retry_io_for_io_path_error ...passed 00:15:20.779 Test: test_retry_io_count ...passed 00:15:20.779 Test: test_concurrent_read_ana_log_page ...passed 00:15:20.779 Test: test_retry_io_for_ana_error ...passed 00:15:20.779 Test: test_check_io_error_resiliency_params ...[2024-06-07 12:17:41.613313] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6067:bdev_nvme_check_io_error_resiliency_params: *ERROR*: ctrlr_loss_timeout_sec can't be less than -1. 00:15:20.779 [2024-06-07 12:17:41.613405] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6071:bdev_nvme_check_io_error_resiliency_params: *ERROR*: reconnect_delay_sec can't be 0 if ctrlr_loss_timeout_sec is not 0. 00:15:20.779 [2024-06-07 12:17:41.613445] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6080:bdev_nvme_check_io_error_resiliency_params: *ERROR*: reconnect_delay_sec can't be 0 if ctrlr_loss_timeout_sec is not 0. 00:15:20.779 [2024-06-07 12:17:41.613483] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6083:bdev_nvme_check_io_error_resiliency_params: *ERROR*: reconnect_delay_sec can't be more than ctrlr_loss_timeout_sec. 00:15:20.779 [2024-06-07 12:17:41.613517] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6095:bdev_nvme_check_io_error_resiliency_params: *ERROR*: Both reconnect_delay_sec and fast_io_fail_timeout_sec must be 0 if ctrlr_loss_timeout_sec is 0. 00:15:20.779 [2024-06-07 12:17:41.613559] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6095:bdev_nvme_check_io_error_resiliency_params: *ERROR*: Both reconnect_delay_sec and fast_io_fail_timeout_sec must be 0 if ctrlr_loss_timeout_sec is 0. 00:15:20.779 [2024-06-07 12:17:41.613586] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6075:bdev_nvme_check_io_error_resiliency_params: *ERROR*: reconnect_delay_sec can't be more than fast_io-fail_timeout_sec. 00:15:20.779 [2024-06-07 12:17:41.613651] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6090:bdev_nvme_check_io_error_resiliency_params: *ERROR*: fast_io_fail_timeout_sec can't be more than ctrlr_loss_timeout_sec. 00:15:20.779 passed 00:15:20.779 Test: test_retry_io_if_ctrlr_is_resetting ...[2024-06-07 12:17:41.613692] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:6087:bdev_nvme_check_io_error_resiliency_params: *ERROR*: reconnect_delay_sec can't be more than fast_io_fail_timeout_sec. 00:15:20.779 passed 00:15:20.779 Test: test_reconnect_ctrlr ...[2024-06-07 12:17:41.614388] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.614505] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 
00:15:20.779 [2024-06-07 12:17:41.614699] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.614795] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.614895] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_retry_failover_ctrlr ...[2024-06-07 12:17:41.615207] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_fail_path ...[2024-06-07 12:17:41.615761] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.615914] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_nvme_ns_cmp ...passed 00:15:20.779 Test: test_ana_transition ...passed 00:15:20.779 Test: test_set_preferred_path ...[2024-06-07 12:17:41.616060] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.616149] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 [2024-06-07 12:17:41.616267] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_find_next_io_path ...passed 00:15:20.779 Test: test_find_io_path_min_qd ...passed 00:15:20.779 Test: test_disable_auto_failback ...[2024-06-07 12:17:41.617627] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_set_multipath_policy ...passed 00:15:20.779 Test: test_uuid_generation ...passed 00:15:20.779 Test: test_retry_io_to_same_path ...passed 00:15:20.779 Test: test_race_between_reset_and_disconnected ...passed 00:15:20.779 Test: test_ctrlr_op_rpc ...passed 00:15:20.779 Test: test_bdev_ctrlr_op_rpc ...passed 00:15:20.779 Test: test_disable_enable_ctrlr ...[2024-06-07 12:17:41.620676] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:15:20.779 passed 00:15:20.779 Test: test_delete_ctrlr_done ...[2024-06-07 12:17:41.620828] /home/vagrant/spdk_repo/spdk/module/bdev/nvme/bdev_nvme.c:2062:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 
00:15:20.779 passed 00:15:20.779 Test: test_ns_remove_during_reset ...passed 00:15:20.779 00:15:20.779 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.779 suites 1 1 n/a 0 0 00:15:20.779 tests 48 48 48 0 0 00:15:20.779 asserts 3565 3565 3565 0 n/a 00:15:20.779 00:15:20.779 Elapsed time = 0.029 seconds 00:15:20.779 12:17:41 unittest.unittest_bdev -- unit/unittest.sh@22 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/raid/bdev_raid.c/bdev_raid_ut 00:15:20.779 00:15:20.779 00:15:20.779 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.779 http://cunit.sourceforge.net/ 00:15:20.779 00:15:20.779 Test Options 00:15:20.779 blocklen = 4096, strip_size = 64, max_io_size = 1024, g_max_base_drives = 32, g_max_raids = 2 00:15:20.779 00:15:20.779 Suite: raid 00:15:20.779 Test: test_create_raid ...passed 00:15:20.779 Test: test_create_raid_superblock ...passed 00:15:20.779 Test: test_delete_raid ...passed 00:15:20.779 Test: test_create_raid_invalid_args ...[2024-06-07 12:17:41.663023] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid.c:1481:_raid_bdev_create: *ERROR*: Unsupported raid level '-1' 00:15:20.779 [2024-06-07 12:17:41.663560] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid.c:1475:_raid_bdev_create: *ERROR*: Invalid strip size 1231 00:15:20.779 [2024-06-07 12:17:41.664047] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid.c:1465:_raid_bdev_create: *ERROR*: Duplicate raid bdev name found: raid1 00:15:20.779 [2024-06-07 12:17:41.664325] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid.c:3193:raid_bdev_configure_base_bdev: *ERROR*: Unable to claim this bdev as it is already claimed 00:15:20.779 [2024-06-07 12:17:41.664444] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid.c:3369:raid_bdev_add_base_bdev: *ERROR*: base bdev 'Nvme0n1' configure failed: (null) 00:15:20.779 [2024-06-07 12:17:41.665313] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid.c:3193:raid_bdev_configure_base_bdev: *ERROR*: Unable to claim this bdev as it is already claimed 00:15:20.780 [2024-06-07 12:17:41.665370] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid.c:3369:raid_bdev_add_base_bdev: *ERROR*: base bdev 'Nvme0n1' configure failed: (null) 00:15:20.780 passed 00:15:20.780 Test: test_delete_raid_invalid_args ...passed 00:15:20.780 Test: test_io_channel ...passed 00:15:20.780 Test: test_reset_io ...passed 00:15:20.780 Test: test_multi_raid ...passed 00:15:20.780 Test: test_io_type_supported ...passed 00:15:20.780 Test: test_raid_json_dump_info ...passed 00:15:20.780 Test: test_context_size ...passed 00:15:20.780 Test: test_raid_level_conversions ...passed 00:15:20.780 Test: test_raid_io_split ...passed 00:15:20.780 Test: test_raid_process ...passed 00:15:20.780 00:15:20.780 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.780 suites 1 1 n/a 0 0 00:15:20.780 tests 14 14 14 0 0 00:15:20.780 asserts 6183 6183 6183 0 n/a 00:15:20.780 00:15:20.780 Elapsed time = 0.023 seconds 00:15:20.780 12:17:41 unittest.unittest_bdev -- unit/unittest.sh@23 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/raid/bdev_raid_sb.c/bdev_raid_sb_ut 00:15:20.780 00:15:20.780 00:15:20.780 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.780 http://cunit.sourceforge.net/ 00:15:20.780 00:15:20.780 00:15:20.780 Suite: raid_sb 00:15:20.780 Test: test_raid_bdev_write_superblock ...passed 00:15:20.780 Test: test_raid_bdev_load_base_bdev_superblock ...passed 00:15:20.780 Test: test_raid_bdev_parse_superblock ...[2024-06-07 12:17:41.721364] 
/home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid_sb.c: 165:raid_bdev_parse_superblock: *ERROR*: Not supported superblock major version 9999 on bdev test_bdev 00:15:20.780 passed 00:15:20.780 Suite: raid_sb_md 00:15:20.780 Test: test_raid_bdev_write_superblock ...passed 00:15:20.780 Test: test_raid_bdev_load_base_bdev_superblock ...passed 00:15:20.780 Test: test_raid_bdev_parse_superblock ...passed[2024-06-07 12:17:41.721844] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid_sb.c: 165:raid_bdev_parse_superblock: *ERROR*: Not supported superblock major version 9999 on bdev test_bdev 00:15:20.780 00:15:20.780 Suite: raid_sb_md_interleaved 00:15:20.780 Test: test_raid_bdev_write_superblock ...passed 00:15:20.780 Test: test_raid_bdev_load_base_bdev_superblock ...passed 00:15:20.780 Test: test_raid_bdev_parse_superblock ...passed 00:15:20.780 00:15:20.780 [2024-06-07 12:17:41.722140] /home/vagrant/spdk_repo/spdk/module/bdev/raid/bdev_raid_sb.c: 165:raid_bdev_parse_superblock: *ERROR*: Not supported superblock major version 9999 on bdev test_bdev 00:15:20.780 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.780 suites 3 3 n/a 0 0 00:15:20.780 tests 9 9 9 0 0 00:15:20.780 asserts 139 139 139 0 n/a 00:15:20.780 00:15:20.780 Elapsed time = 0.002 seconds 00:15:20.780 12:17:41 unittest.unittest_bdev -- unit/unittest.sh@24 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/raid/concat.c/concat_ut 00:15:20.780 00:15:20.780 00:15:20.780 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.780 http://cunit.sourceforge.net/ 00:15:20.780 00:15:20.780 00:15:20.780 Suite: concat 00:15:20.780 Test: test_concat_start ...passed 00:15:20.780 Test: test_concat_rw ...passed 00:15:20.780 Test: test_concat_null_payload ...passed 00:15:20.780 00:15:20.780 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.780 suites 1 1 n/a 0 0 00:15:20.780 tests 3 3 3 0 0 00:15:20.780 asserts 8460 8460 8460 0 n/a 00:15:20.780 00:15:20.780 Elapsed time = 0.006 seconds 00:15:20.780 12:17:41 unittest.unittest_bdev -- unit/unittest.sh@25 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/raid/raid0.c/raid0_ut 00:15:20.780 00:15:20.780 00:15:20.780 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.780 http://cunit.sourceforge.net/ 00:15:20.780 00:15:20.780 00:15:20.780 Suite: raid0 00:15:20.780 Test: test_write_io ...passed 00:15:20.780 Test: test_read_io ...passed 00:15:20.780 Test: test_unmap_io ...passed 00:15:20.780 Test: test_io_failure ...passed 00:15:20.780 Suite: raid0_dif 00:15:20.780 Test: test_write_io ...passed 00:15:20.780 Test: test_read_io ...passed 00:15:20.780 Test: test_unmap_io ...passed 00:15:20.780 Test: test_io_failure ...passed 00:15:20.780 00:15:20.780 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.780 suites 2 2 n/a 0 0 00:15:20.780 tests 8 8 8 0 0 00:15:20.780 asserts 368291 368291 368291 0 n/a 00:15:20.780 00:15:20.780 Elapsed time = 0.118 seconds 00:15:20.780 12:17:41 unittest.unittest_bdev -- unit/unittest.sh@26 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/raid/raid1.c/raid1_ut 00:15:20.780 00:15:20.780 00:15:20.780 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.780 http://cunit.sourceforge.net/ 00:15:20.780 00:15:20.780 00:15:20.780 Suite: raid1 00:15:20.780 Test: test_raid1_start ...passed 00:15:20.780 Test: test_raid1_read_balancing ...passed 00:15:20.780 Test: test_raid1_write_error ...passed 00:15:20.780 Test: test_raid1_read_error ...passed 00:15:20.780 00:15:20.780 Run Summary: Type Total Ran Passed 
Failed Inactive 00:15:20.780 suites 1 1 n/a 0 0 00:15:20.780 tests 4 4 4 0 0 00:15:20.780 asserts 4374 4374 4374 0 n/a 00:15:20.780 00:15:20.780 Elapsed time = 0.004 seconds 00:15:20.780 12:17:41 unittest.unittest_bdev -- unit/unittest.sh@27 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/bdev_zone.c/bdev_zone_ut 00:15:20.780 00:15:20.780 00:15:20.780 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.780 http://cunit.sourceforge.net/ 00:15:20.780 00:15:20.780 00:15:20.780 Suite: zone 00:15:20.780 Test: test_zone_get_operation ...passed 00:15:20.780 Test: test_bdev_zone_get_info ...passed 00:15:20.780 Test: test_bdev_zone_management ...passed 00:15:20.780 Test: test_bdev_zone_append ...passed 00:15:20.780 Test: test_bdev_zone_append_with_md ...passed 00:15:20.780 Test: test_bdev_zone_appendv ...passed 00:15:20.780 Test: test_bdev_zone_appendv_with_md ...passed 00:15:20.780 Test: test_bdev_io_get_append_location ...passed 00:15:20.780 00:15:20.780 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.780 suites 1 1 n/a 0 0 00:15:20.780 tests 8 8 8 0 0 00:15:20.780 asserts 94 94 94 0 n/a 00:15:20.780 00:15:20.780 Elapsed time = 0.000 seconds 00:15:20.780 12:17:42 unittest.unittest_bdev -- unit/unittest.sh@28 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/gpt/gpt.c/gpt_ut 00:15:20.780 00:15:20.780 00:15:20.780 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.780 http://cunit.sourceforge.net/ 00:15:20.780 00:15:20.780 00:15:20.780 Suite: gpt_parse 00:15:20.780 Test: test_parse_mbr_and_primary ...[2024-06-07 12:17:42.051734] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 259:gpt_parse_mbr: *ERROR*: Gpt and the related buffer should not be NULL 00:15:20.780 [2024-06-07 12:17:42.052261] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 259:gpt_parse_mbr: *ERROR*: Gpt and the related buffer should not be NULL 00:15:20.780 [2024-06-07 12:17:42.052359] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 165:gpt_read_header: *ERROR*: head_size=1633771873 00:15:20.780 [2024-06-07 12:17:42.052498] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 279:gpt_parse_partition_table: *ERROR*: Failed to read gpt header 00:15:20.780 [2024-06-07 12:17:42.052567] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 88:gpt_read_partitions: *ERROR*: Num_partition_entries=1633771873 which exceeds max=128 00:15:20.780 [2024-06-07 12:17:42.052703] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 285:gpt_parse_partition_table: *ERROR*: Failed to read gpt partitions 00:15:20.780 passed 00:15:20.780 Test: test_parse_secondary ...[2024-06-07 12:17:42.053058] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 165:gpt_read_header: *ERROR*: head_size=1633771873 00:15:20.780 [2024-06-07 12:17:42.053140] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 279:gpt_parse_partition_table: *ERROR*: Failed to read gpt header 00:15:20.780 [2024-06-07 12:17:42.053195] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 88:gpt_read_partitions: *ERROR*: Num_partition_entries=1633771873 which exceeds max=128 00:15:20.780 [2024-06-07 12:17:42.053280] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 285:gpt_parse_partition_table: *ERROR*: Failed to read gpt partitions 00:15:20.780 passed 00:15:20.780 Test: test_check_mbr ...[2024-06-07 12:17:42.053624] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 259:gpt_parse_mbr: *ERROR*: Gpt and the related buffer should not be NULL 00:15:20.780 [2024-06-07 12:17:42.053696] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 
259:gpt_parse_mbr: *ERROR*: Gpt and the related buffer should not be NULL 00:15:20.780 passed 00:15:20.780 Test: test_read_header ...[2024-06-07 12:17:42.053793] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 165:gpt_read_header: *ERROR*: head_size=600 00:15:20.780 [2024-06-07 12:17:42.053964] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 177:gpt_read_header: *ERROR*: head crc32 does not match, provided=584158336, calculated=3316781438 00:15:20.780 [2024-06-07 12:17:42.054122] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 184:gpt_read_header: *ERROR*: signature did not match 00:15:20.780 [2024-06-07 12:17:42.054194] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 191:gpt_read_header: *ERROR*: head my_lba(7016996765293437281) != expected(1) 00:15:20.780 [2024-06-07 12:17:42.054285] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 135:gpt_lba_range_check: *ERROR*: Head's usable_lba_end(7016996765293437281) > lba_end(0) 00:15:20.780 passed 00:15:20.780 Test: test_read_partitions ...[2024-06-07 12:17:42.054357] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 197:gpt_read_header: *ERROR*: lba range check error 00:15:20.780 [2024-06-07 12:17:42.054447] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 88:gpt_read_partitions: *ERROR*: Num_partition_entries=256 which exceeds max=128 00:15:20.780 [2024-06-07 12:17:42.054552] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 95:gpt_read_partitions: *ERROR*: Partition_entry_size(0) != expected(80) 00:15:20.780 [2024-06-07 12:17:42.054619] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 59:gpt_get_partitions_buf: *ERROR*: Buffer size is not enough 00:15:20.780 [2024-06-07 12:17:42.054671] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 105:gpt_read_partitions: *ERROR*: Failed to get gpt partitions buf 00:15:20.780 [2024-06-07 12:17:42.054859] /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.c: 113:gpt_read_partitions: *ERROR*: GPT partition entry array crc32 did not match 00:15:20.780 passed 00:15:20.780 00:15:20.780 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.780 suites 1 1 n/a 0 0 00:15:20.780 tests 5 5 5 0 0 00:15:20.781 asserts 33 33 33 0 n/a 00:15:20.781 00:15:20.781 Elapsed time = 0.003 seconds 00:15:20.781 12:17:42 unittest.unittest_bdev -- unit/unittest.sh@29 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/part.c/part_ut 00:15:20.781 00:15:20.781 00:15:20.781 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.781 http://cunit.sourceforge.net/ 00:15:20.781 00:15:20.781 00:15:20.781 Suite: bdev_part 00:15:20.781 Test: part_test ...[2024-06-07 12:17:42.096936] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:4580:bdev_name_add: *ERROR*: Bdev name test1 already exists 00:15:20.781 passed 00:15:20.781 Test: part_free_test ...passed 00:15:20.781 Test: part_get_io_channel_test ...passed 00:15:20.781 Test: part_construct_ext ...passed 00:15:20.781 00:15:20.781 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.781 suites 1 1 n/a 0 0 00:15:20.781 tests 4 4 4 0 0 00:15:20.781 asserts 48 48 48 0 n/a 00:15:20.781 00:15:20.781 Elapsed time = 0.073 seconds 00:15:20.781 12:17:42 unittest.unittest_bdev -- unit/unittest.sh@30 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/scsi_nvme.c/scsi_nvme_ut 00:15:20.781 00:15:20.781 00:15:20.781 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.781 http://cunit.sourceforge.net/ 00:15:20.781 00:15:20.781 00:15:20.781 Suite: scsi_nvme_suite 00:15:20.781 Test: scsi_nvme_translate_test ...passed 00:15:20.781 00:15:20.781 Run 
Summary: Type Total Ran Passed Failed Inactive 00:15:20.781 suites 1 1 n/a 0 0 00:15:20.781 tests 1 1 1 0 0 00:15:20.781 asserts 104 104 104 0 n/a 00:15:20.781 00:15:20.781 Elapsed time = 0.000 seconds 00:15:20.781 12:17:42 unittest.unittest_bdev -- unit/unittest.sh@31 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/vbdev_lvol.c/vbdev_lvol_ut 00:15:20.781 00:15:20.781 00:15:20.781 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.781 http://cunit.sourceforge.net/ 00:15:20.781 00:15:20.781 00:15:20.781 Suite: lvol 00:15:20.781 Test: ut_lvs_init ...[2024-06-07 12:17:42.245296] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c: 180:_vbdev_lvs_create_cb: *ERROR*: Cannot create lvol store bdev 00:15:20.781 [2024-06-07 12:17:42.245761] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c: 264:vbdev_lvs_create: *ERROR*: Cannot create blobstore device 00:15:20.781 passed 00:15:20.781 Test: ut_lvol_init ...passed 00:15:20.781 Test: ut_lvol_snapshot ...passed 00:15:20.781 Test: ut_lvol_clone ...passed 00:15:20.781 Test: ut_lvs_destroy ...passed 00:15:20.781 Test: ut_lvs_unload ...passed 00:15:20.781 Test: ut_lvol_resize ...[2024-06-07 12:17:42.247508] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1394:vbdev_lvol_resize: *ERROR*: lvol does not exist 00:15:20.781 passed 00:15:20.781 Test: ut_lvol_set_read_only ...passed 00:15:20.781 Test: ut_lvol_hotremove ...passed 00:15:20.781 Test: ut_vbdev_lvol_get_io_channel ...passed 00:15:20.781 Test: ut_vbdev_lvol_io_type_supported ...passed 00:15:20.781 Test: ut_lvol_read_write ...passed 00:15:20.781 Test: ut_vbdev_lvol_submit_request ...passed 00:15:20.781 Test: ut_lvol_examine_config ...passed 00:15:20.781 Test: ut_lvol_examine_disk ...[2024-06-07 12:17:42.248413] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1536:_vbdev_lvs_examine_finish: *ERROR*: Error opening lvol UNIT_TEST_UUID 00:15:20.781 passed 00:15:20.781 Test: ut_lvol_rename ...[2024-06-07 12:17:42.249521] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c: 105:_vbdev_lvol_change_bdev_alias: *ERROR*: cannot add alias 'lvs/new_lvol_name' 00:15:20.781 [2024-06-07 12:17:42.249660] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1344:vbdev_lvol_rename: *ERROR*: renaming lvol to 'new_lvol_name' does not succeed 00:15:20.781 passed 00:15:20.781 Test: ut_bdev_finish ...passed 00:15:20.781 Test: ut_lvs_rename ...passed 00:15:20.781 Test: ut_lvol_seek ...passed 00:15:20.781 Test: ut_esnap_dev_create ...[2024-06-07 12:17:42.250553] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1879:vbdev_lvol_esnap_dev_create: *ERROR*: lvol : NULL esnap ID 00:15:20.781 [2024-06-07 12:17:42.250669] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1885:vbdev_lvol_esnap_dev_create: *ERROR*: lvol : Invalid esnap ID length (36) 00:15:20.781 [2024-06-07 12:17:42.250728] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1890:vbdev_lvol_esnap_dev_create: *ERROR*: lvol : Invalid esnap ID: not a UUID 00:15:20.781 [2024-06-07 12:17:42.250791] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1911:vbdev_lvol_esnap_dev_create: *ERROR*: lvol : unable to claim esnap bdev 'a27fd8fe-d4b9-431e-a044-271016228ce4': -1 00:15:20.781 passed 00:15:20.781 Test: ut_lvol_esnap_clone_bad_args ...[2024-06-07 12:17:42.251008] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1280:vbdev_lvol_create_bdev_clone: *ERROR*: lvol store not specified 00:15:20.781 [2024-06-07 12:17:42.251070] 
/home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1287:vbdev_lvol_create_bdev_clone: *ERROR*: bdev '255f4236-9427-42d0-a9d1-aa17f37dd8db' could not be opened: error -19 00:15:20.781 passed 00:15:20.781 Test: ut_lvol_shallow_copy ...[2024-06-07 12:17:42.251446] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1977:vbdev_lvol_shallow_copy: *ERROR*: lvol must not be NULL 00:15:20.781 [2024-06-07 12:17:42.251521] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:1982:vbdev_lvol_shallow_copy: *ERROR*: lvol lvol_sc, bdev name must not be NULL 00:15:20.781 passed 00:15:20.781 Test: ut_lvol_set_external_parent ...[2024-06-07 12:17:42.251675] /home/vagrant/spdk_repo/spdk/module/bdev/lvol/vbdev_lvol.c:2037:vbdev_lvol_set_external_parent: *ERROR*: bdev '255f4236-9427-42d0-a9d1-aa17f37dd8db' could not be opened: error -19 00:15:20.781 passed 00:15:20.781 00:15:20.781 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.781 suites 1 1 n/a 0 0 00:15:20.781 tests 23 23 23 0 0 00:15:20.781 asserts 798 798 798 0 n/a 00:15:20.781 00:15:20.781 Elapsed time = 0.007 seconds 00:15:20.781 12:17:42 unittest.unittest_bdev -- unit/unittest.sh@32 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/vbdev_zone_block.c/vbdev_zone_block_ut 00:15:20.781 00:15:20.781 00:15:20.781 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.781 http://cunit.sourceforge.net/ 00:15:20.781 00:15:20.781 00:15:20.781 Suite: zone_block 00:15:20.781 Test: test_zone_block_create ...passed 00:15:20.781 Test: test_zone_block_create_invalid ...[2024-06-07 12:17:42.305354] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 624:zone_block_insert_name: *ERROR*: base bdev Nvme0n1 already claimed 00:15:20.781 [2024-06-07 12:17:42.305657] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block_rpc.c: 58:rpc_zone_block_create: *ERROR*: Failed to create block zoned vbdev: File exists[2024-06-07 12:17:42.305865] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 721:zone_block_register: *ERROR*: Base bdev zone_dev1 is already a zoned bdev 00:15:20.781 [2024-06-07 12:17:42.305954] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block_rpc.c: 58:rpc_zone_block_create: *ERROR*: Failed to create block zoned vbdev: File exists[2024-06-07 12:17:42.306214] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 860:vbdev_zone_block_create: *ERROR*: Zone capacity can't be 0 00:15:20.781 [2024-06-07 12:17:42.306290] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block_rpc.c: 58:rpc_zone_block_create: *ERROR*: Failed to create block zoned vbdev: Invalid argument[2024-06-07 12:17:42.306397] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 865:vbdev_zone_block_create: *ERROR*: Optimal open zones can't be 0 00:15:20.781 passed 00:15:20.781 Test: test_get_zone_info ...[2024-06-07 12:17:42.306466] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block_rpc.c: 58:rpc_zone_block_create: *ERROR*: Failed to create block zoned vbdev: Invalid argument[2024-06-07 12:17:42.307018] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.307117] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 
00:15:20.781 [2024-06-07 12:17:42.307189] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 passed 00:15:20.781 Test: test_supported_io_types ...passed 00:15:20.781 Test: test_reset_zone ...[2024-06-07 12:17:42.308028] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.308104] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 passed 00:15:20.781 Test: test_open_zone ...[2024-06-07 12:17:42.308598] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.309389] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.309495] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 passed 00:15:20.781 Test: test_zone_write ...[2024-06-07 12:17:42.310034] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 391:zone_block_write: *ERROR*: Trying to write to zone in invalid state 2 00:15:20.781 [2024-06-07 12:17:42.310114] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.310184] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 378:zone_block_write: *ERROR*: Trying to write to invalid zone (lba 0x5000) 00:15:20.781 [2024-06-07 12:17:42.310254] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.317191] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 401:zone_block_write: *ERROR*: Trying to write to zone with invalid address (lba 0x407, wp 0x405) 00:15:20.781 [2024-06-07 12:17:42.317285] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.317388] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 401:zone_block_write: *ERROR*: Trying to write to zone with invalid address (lba 0x400, wp 0x405) 00:15:20.781 [2024-06-07 12:17:42.317425] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.323979] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 410:zone_block_write: *ERROR*: Write exceeds zone capacity (lba 0x3f0, len 0x20, wp 0x3f0) 00:15:20.781 [2024-06-07 12:17:42.324052] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 
00:15:20.781 passed 00:15:20.781 Test: test_zone_read ...[2024-06-07 12:17:42.324564] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 465:zone_block_read: *ERROR*: Read exceeds zone capacity (lba 0x4ff8, len 0x10) 00:15:20.781 [2024-06-07 12:17:42.324615] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.781 [2024-06-07 12:17:42.324697] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 460:zone_block_read: *ERROR*: Trying to read from invalid zone (lba 0x5000) 00:15:20.782 [2024-06-07 12:17:42.324735] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.325204] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 465:zone_block_read: *ERROR*: Read exceeds zone capacity (lba 0x3f8, len 0x10) 00:15:20.782 passed 00:15:20.782 Test: test_close_zone ...[2024-06-07 12:17:42.325276] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.325627] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.325702] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.325986] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.326048] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 passed 00:15:20.782 Test: test_finish_zone ...[2024-06-07 12:17:42.326685] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.326739] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 passed 00:15:20.782 Test: test_append_zone ...[2024-06-07 12:17:42.327135] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 391:zone_block_write: *ERROR*: Trying to write to zone in invalid state 2 00:15:20.782 [2024-06-07 12:17:42.327178] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.327272] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 378:zone_block_write: *ERROR*: Trying to write to invalid zone (lba 0x5000) 00:15:20.782 [2024-06-07 12:17:42.327316] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 00:15:20.782 [2024-06-07 12:17:42.341355] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 410:zone_block_write: *ERROR*: Write exceeds zone capacity (lba 0x3f0, len 0x20, wp 0x3f0) 00:15:20.782 [2024-06-07 12:17:42.341470] /home/vagrant/spdk_repo/spdk/module/bdev/zone_block/vbdev_zone_block.c: 510:zone_block_submit_request: *ERROR*: ERROR on bdev_io submission! 
00:15:20.782 passed 00:15:20.782 00:15:20.782 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.782 suites 1 1 n/a 0 0 00:15:20.782 tests 11 11 11 0 0 00:15:20.782 asserts 3437 3437 3437 0 n/a 00:15:20.782 00:15:20.782 Elapsed time = 0.037 seconds 00:15:20.782 12:17:42 unittest.unittest_bdev -- unit/unittest.sh@33 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/bdev/mt/bdev.c/bdev_ut 00:15:20.782 00:15:20.782 00:15:20.782 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.782 http://cunit.sourceforge.net/ 00:15:20.782 00:15:20.782 00:15:20.782 Suite: bdev 00:15:20.782 Test: basic ...[2024-06-07 12:17:42.464950] thread.c:2369:spdk_get_io_channel: *ERROR*: could not create io_channel for io_device bdev_ut_bdev (0x530241): Operation not permitted (rc=-1) 00:15:20.782 [2024-06-07 12:17:42.465421] thread.c:2369:spdk_get_io_channel: *ERROR*: could not create io_channel for io_device 0x6130000003c0 (0x530200): Operation not permitted (rc=-1) 00:15:20.782 [2024-06-07 12:17:42.465489] thread.c:2369:spdk_get_io_channel: *ERROR*: could not create io_channel for io_device bdev_ut_bdev (0x530241): Operation not permitted (rc=-1) 00:15:20.782 passed 00:15:20.782 Test: unregister_and_close ...passed 00:15:20.782 Test: unregister_and_close_different_threads ...passed 00:15:20.782 Test: basic_qos ...passed 00:15:20.782 Test: put_channel_during_reset ...passed 00:15:20.782 Test: aborted_reset ...passed 00:15:20.782 Test: aborted_reset_no_outstanding_io ...passed 00:15:20.782 Test: io_during_reset ...passed 00:15:20.782 Test: reset_completions ...passed 00:15:20.782 Test: io_during_qos_queue ...passed 00:15:20.782 Test: io_during_qos_reset ...passed 00:15:20.782 Test: enomem ...passed 00:15:20.782 Test: enomem_multi_bdev ...passed 00:15:20.782 Test: enomem_multi_bdev_unregister ...passed 00:15:20.782 Test: enomem_multi_io_target ...passed 00:15:20.782 Test: qos_dynamic_enable ...passed 00:15:20.782 Test: bdev_histograms_mt ...passed 00:15:20.782 Test: bdev_set_io_timeout_mt ...[2024-06-07 12:17:43.632472] thread.c: 471:spdk_thread_lib_fini: *ERROR*: io_device 0x6130000003c0 not unregistered 00:15:20.782 passed 00:15:20.782 Test: lock_lba_range_then_submit_io ...[2024-06-07 12:17:43.655295] thread.c:2173:spdk_io_device_register: *ERROR*: io_device 0x5301c0 already registered (old:0x6130000003c0 new:0x613000000c80) 00:15:20.782 passed 00:15:20.782 Test: unregister_during_reset ...passed 00:15:20.782 Test: event_notify_and_close ...passed 00:15:20.782 Test: unregister_and_qos_poller ...passed 00:15:20.782 Suite: bdev_wrong_thread 00:15:20.782 Test: spdk_bdev_register_wt ...[2024-06-07 12:17:43.870385] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c:8459:spdk_bdev_register: *ERROR*: Cannot register bdev wt_bdev on thread 0x618000001480 (0x618000001480) 00:15:20.782 passed 00:15:20.782 Test: spdk_bdev_examine_wt ...[2024-06-07 12:17:43.870719] /home/vagrant/spdk_repo/spdk/lib/bdev/bdev.c: 810:spdk_bdev_examine: *ERROR*: Cannot examine bdev ut_bdev_wt on thread 0x618000001480 (0x618000001480) 00:15:20.782 passed 00:15:20.782 00:15:20.782 Run Summary: Type Total Ran Passed Failed Inactive 00:15:20.782 suites 2 2 n/a 0 0 00:15:20.782 tests 24 24 24 0 0 00:15:20.782 asserts 621 621 621 0 n/a 00:15:20.782 00:15:20.782 Elapsed time = 1.438 seconds 00:15:20.782 00:15:20.782 real 0m4.585s 00:15:20.782 user 0m1.867s 00:15:20.782 sys 0m2.696s 00:15:20.782 12:17:43 unittest.unittest_bdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:20.782 12:17:43 unittest.unittest_bdev -- 
common/autotest_common.sh@10 -- # set +x 00:15:20.782 ************************************ 00:15:20.782 END TEST unittest_bdev 00:15:20.782 ************************************ 00:15:20.782 12:17:43 unittest -- unit/unittest.sh@215 -- # grep -q '#define SPDK_CONFIG_CRYPTO 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:20.782 12:17:43 unittest -- unit/unittest.sh@220 -- # grep -q '#define SPDK_CONFIG_VBDEV_COMPRESS 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:20.782 12:17:43 unittest -- unit/unittest.sh@225 -- # grep -q '#define SPDK_CONFIG_DPDK_COMPRESSDEV 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:20.782 12:17:43 unittest -- unit/unittest.sh@229 -- # grep -q '#define SPDK_CONFIG_RAID5F 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:20.782 12:17:43 unittest -- unit/unittest.sh@233 -- # run_test unittest_blob_blobfs unittest_blob 00:15:20.782 12:17:43 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:20.782 12:17:43 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:20.782 12:17:43 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:20.782 ************************************ 00:15:20.782 START TEST unittest_blob_blobfs 00:15:20.782 ************************************ 00:15:20.782 12:17:43 unittest.unittest_blob_blobfs -- common/autotest_common.sh@1124 -- # unittest_blob 00:15:20.782 12:17:43 unittest.unittest_blob_blobfs -- unit/unittest.sh@39 -- # [[ -e /home/vagrant/spdk_repo/spdk/test/unit/lib/blob/blob.c/blob_ut ]] 00:15:20.782 12:17:43 unittest.unittest_blob_blobfs -- unit/unittest.sh@40 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/blob/blob.c/blob_ut 00:15:20.782 00:15:20.782 00:15:20.782 CUnit - A unit testing framework for C - Version 2.1-3 00:15:20.782 http://cunit.sourceforge.net/ 00:15:20.782 00:15:20.782 00:15:20.782 Suite: blob_nocopy_noextent 00:15:20.782 Test: blob_init ...[2024-06-07 12:17:44.019812] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5490:spdk_bs_init: *ERROR*: unsupported dev block length of 500 00:15:20.782 passed 00:15:20.782 Test: blob_thin_provision ...passed 00:15:20.782 Test: blob_read_only ...passed 00:15:20.782 Test: bs_load ...[2024-06-07 12:17:44.125888] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c: 965:blob_parse: *ERROR*: Blobid (0x0) doesn't match what's in metadata (0x100000000) 00:15:20.782 passed 00:15:20.782 Test: bs_load_custom_cluster_size ...passed 00:15:20.782 Test: bs_load_after_failed_grow ...passed 00:15:20.782 Test: bs_cluster_sz ...[2024-06-07 12:17:44.169137] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3824:bs_opts_verify: *ERROR*: Blobstore options cannot be set to 0 00:15:20.782 [2024-06-07 12:17:44.169481] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5621:spdk_bs_init: *ERROR*: Blobstore metadata cannot use more clusters than is available, please decrease number of pages reserved for metadata or increase cluster size. 
00:15:20.782 [2024-06-07 12:17:44.169643] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3883:bs_alloc: *ERROR*: Cluster size 4095 is smaller than page size 4096 00:15:20.782 passed 00:15:20.783 Test: bs_resize_md ...passed 00:15:20.783 Test: bs_destroy ...passed 00:15:20.783 Test: bs_type ...passed 00:15:20.783 Test: bs_super_block ...passed 00:15:20.783 Test: bs_test_recover_cluster_count ...passed 00:15:20.783 Test: bs_grow_live ...passed 00:15:20.783 Test: bs_grow_live_no_space ...passed 00:15:20.783 Test: bs_test_grow ...passed 00:15:20.783 Test: blob_serialize_test ...passed 00:15:20.783 Test: super_block_crc ...passed 00:15:20.783 Test: blob_thin_prov_write_count_io ...passed 00:15:21.044 Test: blob_thin_prov_unmap_cluster ...passed 00:15:21.044 Test: bs_load_iter_test ...passed 00:15:21.044 Test: blob_relations ...[2024-06-07 12:17:44.459903] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:21.044 [2024-06-07 12:17:44.460029] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.044 [2024-06-07 12:17:44.460751] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:21.044 [2024-06-07 12:17:44.460823] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.044 passed 00:15:21.044 Test: blob_relations2 ...[2024-06-07 12:17:44.483045] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:21.044 [2024-06-07 12:17:44.483161] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.044 [2024-06-07 12:17:44.483202] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:21.044 [2024-06-07 12:17:44.483241] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.044 [2024-06-07 12:17:44.484362] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:21.044 [2024-06-07 12:17:44.484418] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.044 [2024-06-07 12:17:44.484736] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:21.044 [2024-06-07 12:17:44.484786] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.044 passed 00:15:21.044 Test: blob_relations3 ...passed 00:15:21.302 Test: blobstore_clean_power_failure ...passed 00:15:21.302 Test: blob_delete_snapshot_power_failure ...[2024-06-07 12:17:44.749921] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 1 read failed for blobid 0x100000001: -5 00:15:21.302 [2024-06-07 12:17:44.769797] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:21.302 [2024-06-07 12:17:44.769917] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:21.302 [2024-06-07 12:17:44.769966] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.302 [2024-06-07 12:17:44.789437] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 1 read failed for blobid 0x100000001: -5 00:15:21.302 [2024-06-07 12:17:44.789548] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1466:blob_load_snapshot_cpl: *ERROR*: Snapshot fail 00:15:21.302 [2024-06-07 12:17:44.789578] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:21.303 [2024-06-07 12:17:44.789634] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.303 [2024-06-07 12:17:44.809084] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8227:delete_snapshot_sync_snapshot_xattr_cpl: *ERROR*: Failed to sync MD with xattr on blob 00:15:21.303 [2024-06-07 12:17:44.809216] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.303 [2024-06-07 12:17:44.828894] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8096:delete_snapshot_sync_clone_cpl: *ERROR*: Failed to sync MD on clone 00:15:21.303 [2024-06-07 12:17:44.829007] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.303 [2024-06-07 12:17:44.848720] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8040:delete_snapshot_sync_snapshot_cpl: *ERROR*: Failed to sync MD on blob 00:15:21.303 [2024-06-07 12:17:44.848834] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:21.303 passed 00:15:21.303 Test: blob_create_snapshot_power_failure ...[2024-06-07 12:17:44.907794] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:21.560 [2024-06-07 12:17:44.947096] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 1 read failed for blobid 0x100000001: -5 00:15:21.560 [2024-06-07 12:17:44.967417] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6446:bs_clone_snapshot_origblob_cleanup: *ERROR*: Cleanup error -5 00:15:21.560 passed 00:15:21.560 Test: blob_io_unit ...passed 00:15:21.560 Test: blob_io_unit_compatibility ...passed 00:15:21.560 Test: blob_ext_md_pages ...passed 00:15:21.560 Test: blob_esnap_io_4096_4096 ...passed 00:15:21.560 Test: blob_esnap_io_512_512 ...passed 00:15:21.818 Test: blob_esnap_io_4096_512 ...passed 00:15:21.818 Test: blob_esnap_io_512_4096 ...passed 00:15:21.818 Test: blob_esnap_clone_resize ...passed 00:15:21.818 Suite: blob_bs_nocopy_noextent 00:15:21.818 Test: blob_open ...passed 00:15:21.818 Test: blob_create ...[2024-06-07 12:17:45.377463] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -28, size in clusters/size: 65 (clusters) 00:15:21.818 passed 00:15:22.076 Test: blob_create_loop ...passed 00:15:22.076 Test: blob_create_fail ...[2024-06-07 12:17:45.512443] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:22.076 passed 00:15:22.076 Test: blob_create_internal ...passed 00:15:22.076 Test: blob_create_zero_extent ...passed 00:15:22.076 Test: blob_snapshot ...passed 00:15:22.334 Test: blob_clone ...passed 00:15:22.334 Test: blob_inflate 
...[2024-06-07 12:17:45.813699] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7109:bs_inflate_blob_open_cpl: *ERROR*: Cannot decouple parent of blob with no parent. 00:15:22.334 passed 00:15:22.334 Test: blob_delete ...passed 00:15:22.334 Test: blob_resize_test ...[2024-06-07 12:17:45.925085] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7845:bs_resize_unfreeze_cpl: *ERROR*: Unfreeze failed, ctx->rc=-28 00:15:22.334 passed 00:15:22.592 Test: blob_resize_thin_test ...passed 00:15:22.592 Test: channel_ops ...passed 00:15:22.592 Test: blob_super ...passed 00:15:22.592 Test: blob_rw_verify_iov ...passed 00:15:22.592 Test: blob_unmap ...passed 00:15:22.849 Test: blob_iter ...passed 00:15:22.849 Test: blob_parse_md ...passed 00:15:22.849 Test: bs_load_pending_removal ...passed 00:15:22.849 Test: bs_unload ...[2024-06-07 12:17:46.423523] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5878:spdk_bs_unload: *ERROR*: Blobstore still has open blobs 00:15:22.849 passed 00:15:23.107 Test: bs_usable_clusters ...passed 00:15:23.107 Test: blob_crc ...[2024-06-07 12:17:46.535355] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:23.107 [2024-06-07 12:17:46.535520] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:23.107 passed 00:15:23.107 Test: blob_flags ...passed 00:15:23.107 Test: bs_version ...passed 00:15:23.107 Test: blob_set_xattrs_test ...[2024-06-07 12:17:46.702812] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:23.107 [2024-06-07 12:17:46.702952] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:23.107 passed 00:15:23.365 Test: blob_thin_prov_alloc ...passed 00:15:23.365 Test: blob_insert_cluster_msg_test ...passed 00:15:23.365 Test: blob_thin_prov_rw ...passed 00:15:23.365 Test: blob_thin_prov_rle ...passed 00:15:23.624 Test: blob_thin_prov_rw_iov ...passed 00:15:23.624 Test: blob_snapshot_rw ...passed 00:15:23.624 Test: blob_snapshot_rw_iov ...passed 00:15:23.882 Test: blob_inflate_rw ...passed 00:15:23.882 Test: blob_snapshot_freeze_io ...passed 00:15:24.142 Test: blob_operation_split_rw ...passed 00:15:24.142 Test: blob_operation_split_rw_iov ...passed 00:15:24.142 Test: blob_simultaneous_operations ...[2024-06-07 12:17:47.704618] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:24.142 [2024-06-07 12:17:47.704752] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:24.142 [2024-06-07 12:17:47.706520] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:24.142 [2024-06-07 12:17:47.706617] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:24.142 [2024-06-07 12:17:47.723723] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:24.142 [2024-06-07 12:17:47.723836] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:24.142 [2024-06-07 12:17:47.723959] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:24.142 [2024-06-07 12:17:47.724005] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:24.142 passed 00:15:24.400 Test: blob_persist_test ...passed 00:15:24.400 Test: blob_decouple_snapshot ...passed 00:15:24.400 Test: blob_seek_io_unit ...passed 00:15:24.658 Test: blob_nested_freezes ...passed 00:15:24.658 Test: blob_clone_resize ...passed 00:15:24.658 Test: blob_shallow_copy ...[2024-06-07 12:17:48.186431] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7332:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, blob must be read only 00:15:24.658 [2024-06-07 12:17:48.186884] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7342:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device must have at least blob size 00:15:24.658 [2024-06-07 12:17:48.187247] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7350:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device block size is not compatible with blobstore block size 00:15:24.658 passed 00:15:24.658 Suite: blob_blob_nocopy_noextent 00:15:24.658 Test: blob_write ...passed 00:15:24.916 Test: blob_read ...passed 00:15:24.916 Test: blob_rw_verify ...passed 00:15:24.916 Test: blob_rw_verify_iov_nomem ...passed 00:15:24.916 Test: blob_rw_iov_read_only ...passed 00:15:25.174 Test: blob_xattr ...passed 00:15:25.174 Test: blob_dirty_shutdown ...passed 00:15:25.174 Test: blob_is_degraded ...passed 00:15:25.174 Suite: blob_esnap_bs_nocopy_noextent 00:15:25.174 Test: blob_esnap_create ...passed 00:15:25.174 Test: blob_esnap_thread_add_remove ...passed 00:15:25.431 Test: blob_esnap_clone_snapshot ...passed 00:15:25.431 Test: blob_esnap_clone_inflate ...passed 00:15:25.431 Test: blob_esnap_clone_decouple ...passed 00:15:25.431 Test: blob_esnap_clone_reload ...passed 00:15:25.689 Test: blob_esnap_hotplug ...passed 00:15:25.689 Test: blob_set_parent ...[2024-06-07 12:17:49.127852] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7613:spdk_bs_blob_set_parent: *ERROR*: snapshot id not valid 00:15:25.689 [2024-06-07 12:17:49.127990] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7619:spdk_bs_blob_set_parent: *ERROR*: blob id and snapshot id cannot be the same 00:15:25.689 [2024-06-07 12:17:49.128132] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7548:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob is not a snapshot 00:15:25.689 [2024-06-07 12:17:49.128183] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7555:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob has a number of clusters different from child's ones 00:15:25.689 [2024-06-07 12:17:49.128712] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7594:bs_set_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:25.689 passed 00:15:25.689 Test: blob_set_external_parent ...[2024-06-07 12:17:49.186053] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7787:spdk_bs_blob_set_external_parent: *ERROR*: blob id and external snapshot id cannot be the same 00:15:25.689 [2024-06-07 12:17:49.186203] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7795:spdk_bs_blob_set_external_parent: *ERROR*: Esnap device size 61440 is not an integer multiple of cluster size 16384 00:15:25.689 [2024-06-07 12:17:49.186260] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7748:bs_set_external_parent_blob_open_cpl: *ERROR*: 
external snapshot is already the parent of blob 00:15:25.690 [2024-06-07 12:17:49.186670] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7754:bs_set_external_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:25.690 passed 00:15:25.690 Suite: blob_nocopy_extent 00:15:25.690 Test: blob_init ...[2024-06-07 12:17:49.206058] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5490:spdk_bs_init: *ERROR*: unsupported dev block length of 500 00:15:25.690 passed 00:15:25.690 Test: blob_thin_provision ...passed 00:15:25.690 Test: blob_read_only ...passed 00:15:25.690 Test: bs_load ...[2024-06-07 12:17:49.283681] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c: 965:blob_parse: *ERROR*: Blobid (0x0) doesn't match what's in metadata (0x100000000) 00:15:25.690 passed 00:15:25.690 Test: bs_load_custom_cluster_size ...passed 00:15:25.690 Test: bs_load_after_failed_grow ...passed 00:15:25.690 Test: bs_cluster_sz ...[2024-06-07 12:17:49.325246] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3824:bs_opts_verify: *ERROR*: Blobstore options cannot be set to 0 00:15:25.690 [2024-06-07 12:17:49.325542] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5621:spdk_bs_init: *ERROR*: Blobstore metadata cannot use more clusters than is available, please decrease number of pages reserved for metadata or increase cluster size. 00:15:25.690 [2024-06-07 12:17:49.325603] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3883:bs_alloc: *ERROR*: Cluster size 4095 is smaller than page size 4096 00:15:25.948 passed 00:15:25.948 Test: bs_resize_md ...passed 00:15:25.948 Test: bs_destroy ...passed 00:15:25.948 Test: bs_type ...passed 00:15:25.948 Test: bs_super_block ...passed 00:15:25.948 Test: bs_test_recover_cluster_count ...passed 00:15:25.948 Test: bs_grow_live ...passed 00:15:25.948 Test: bs_grow_live_no_space ...passed 00:15:25.948 Test: bs_test_grow ...passed 00:15:25.948 Test: blob_serialize_test ...passed 00:15:25.948 Test: super_block_crc ...passed 00:15:25.948 Test: blob_thin_prov_write_count_io ...passed 00:15:25.948 Test: blob_thin_prov_unmap_cluster ...passed 00:15:25.948 Test: bs_load_iter_test ...passed 00:15:26.207 Test: blob_relations ...[2024-06-07 12:17:49.611759] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:26.207 [2024-06-07 12:17:49.611924] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.207 [2024-06-07 12:17:49.612836] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:26.207 [2024-06-07 12:17:49.612899] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.207 passed 00:15:26.207 Test: blob_relations2 ...[2024-06-07 12:17:49.634089] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:26.207 [2024-06-07 12:17:49.634188] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.207 [2024-06-07 12:17:49.634218] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:26.207 [2024-06-07 12:17:49.634276] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.207 [2024-06-07 
12:17:49.635564] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:26.207 [2024-06-07 12:17:49.635644] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.207 [2024-06-07 12:17:49.635998] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:26.207 [2024-06-07 12:17:49.636051] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.207 passed 00:15:26.207 Test: blob_relations3 ...passed 00:15:26.466 Test: blobstore_clean_power_failure ...passed 00:15:26.466 Test: blob_delete_snapshot_power_failure ...[2024-06-07 12:17:49.903119] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 2 read failed for blobid 0x100000002: -5 00:15:26.466 [2024-06-07 12:17:49.923249] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1579:blob_load_cpl_extents_cpl: *ERROR*: Extent page read failed: -5 00:15:26.466 [2024-06-07 12:17:49.943526] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:26.466 [2024-06-07 12:17:49.943641] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:26.466 [2024-06-07 12:17:49.943687] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.466 [2024-06-07 12:17:49.963841] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 2 read failed for blobid 0x100000002: -5 00:15:26.466 [2024-06-07 12:17:49.963974] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1466:blob_load_snapshot_cpl: *ERROR*: Snapshot fail 00:15:26.466 [2024-06-07 12:17:49.964024] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:26.466 [2024-06-07 12:17:49.964075] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.466 [2024-06-07 12:17:49.984340] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1579:blob_load_cpl_extents_cpl: *ERROR*: Extent page read failed: -5 00:15:26.466 [2024-06-07 12:17:49.984463] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1466:blob_load_snapshot_cpl: *ERROR*: Snapshot fail 00:15:26.466 [2024-06-07 12:17:49.984513] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:26.466 [2024-06-07 12:17:49.984566] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.466 [2024-06-07 12:17:50.005072] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8227:delete_snapshot_sync_snapshot_xattr_cpl: *ERROR*: Failed to sync MD with xattr on blob 00:15:26.466 [2024-06-07 12:17:50.005247] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.466 [2024-06-07 12:17:50.026119] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8096:delete_snapshot_sync_clone_cpl: *ERROR*: Failed to sync MD on clone 00:15:26.466 [2024-06-07 12:17:50.026305] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.466 [2024-06-07 12:17:50.046938] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8040:delete_snapshot_sync_snapshot_cpl: *ERROR*: Failed to sync MD on blob 00:15:26.466 [2024-06-07 12:17:50.047086] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:26.466 passed 00:15:26.466 Test: blob_create_snapshot_power_failure ...[2024-06-07 12:17:50.107184] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:26.724 [2024-06-07 12:17:50.127161] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1579:blob_load_cpl_extents_cpl: *ERROR*: Extent page read failed: -5 00:15:26.724 [2024-06-07 12:17:50.166573] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 2 read failed for blobid 0x100000002: -5 00:15:26.724 [2024-06-07 12:17:50.186988] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6446:bs_clone_snapshot_origblob_cleanup: *ERROR*: Cleanup error -5 00:15:26.724 passed 00:15:26.724 Test: blob_io_unit ...passed 00:15:26.724 Test: blob_io_unit_compatibility ...passed 00:15:26.724 Test: blob_ext_md_pages ...passed 00:15:26.724 Test: blob_esnap_io_4096_4096 ...passed 00:15:26.984 Test: blob_esnap_io_512_512 ...passed 00:15:26.984 Test: blob_esnap_io_4096_512 ...passed 00:15:26.984 Test: blob_esnap_io_512_4096 ...passed 00:15:26.984 Test: blob_esnap_clone_resize ...passed 00:15:26.984 Suite: blob_bs_nocopy_extent 00:15:26.984 Test: blob_open ...passed 00:15:26.984 Test: blob_create ...[2024-06-07 12:17:50.605185] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -28, size in clusters/size: 65 (clusters) 00:15:26.984 passed 00:15:27.243 Test: blob_create_loop ...passed 00:15:27.243 Test: blob_create_fail ...[2024-06-07 12:17:50.748649] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:27.243 passed 00:15:27.243 Test: blob_create_internal ...passed 00:15:27.243 Test: blob_create_zero_extent ...passed 00:15:27.502 Test: blob_snapshot ...passed 00:15:27.502 Test: blob_clone ...passed 00:15:27.502 Test: blob_inflate ...[2024-06-07 12:17:51.058209] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7109:bs_inflate_blob_open_cpl: *ERROR*: Cannot decouple parent of blob with no parent. 
00:15:27.502 passed 00:15:27.502 Test: blob_delete ...passed 00:15:27.784 Test: blob_resize_test ...[2024-06-07 12:17:51.179918] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7845:bs_resize_unfreeze_cpl: *ERROR*: Unfreeze failed, ctx->rc=-28 00:15:27.784 passed 00:15:27.784 Test: blob_resize_thin_test ...passed 00:15:27.784 Test: channel_ops ...passed 00:15:27.784 Test: blob_super ...passed 00:15:28.043 Test: blob_rw_verify_iov ...passed 00:15:28.043 Test: blob_unmap ...passed 00:15:28.043 Test: blob_iter ...passed 00:15:28.043 Test: blob_parse_md ...passed 00:15:28.043 Test: bs_load_pending_removal ...passed 00:15:28.302 Test: bs_unload ...[2024-06-07 12:17:51.704026] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5878:spdk_bs_unload: *ERROR*: Blobstore still has open blobs 00:15:28.302 passed 00:15:28.302 Test: bs_usable_clusters ...passed 00:15:28.302 Test: blob_crc ...[2024-06-07 12:17:51.818604] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:28.302 [2024-06-07 12:17:51.818741] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:28.302 passed 00:15:28.302 Test: blob_flags ...passed 00:15:28.560 Test: bs_version ...passed 00:15:28.560 Test: blob_set_xattrs_test ...[2024-06-07 12:17:51.993364] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:28.560 [2024-06-07 12:17:51.993499] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:28.560 passed 00:15:28.560 Test: blob_thin_prov_alloc ...passed 00:15:28.561 Test: blob_insert_cluster_msg_test ...passed 00:15:28.819 Test: blob_thin_prov_rw ...passed 00:15:28.819 Test: blob_thin_prov_rle ...passed 00:15:28.819 Test: blob_thin_prov_rw_iov ...passed 00:15:28.819 Test: blob_snapshot_rw ...passed 00:15:28.819 Test: blob_snapshot_rw_iov ...passed 00:15:29.077 Test: blob_inflate_rw ...passed 00:15:29.336 Test: blob_snapshot_freeze_io ...passed 00:15:29.336 Test: blob_operation_split_rw ...passed 00:15:29.336 Test: blob_operation_split_rw_iov ...passed 00:15:29.595 Test: blob_simultaneous_operations ...[2024-06-07 12:17:53.016281] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:29.595 [2024-06-07 12:17:53.016397] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:29.595 [2024-06-07 12:17:53.017710] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:29.595 [2024-06-07 12:17:53.017768] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:29.595 [2024-06-07 12:17:53.031881] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:29.595 [2024-06-07 12:17:53.031938] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:29.595 [2024-06-07 12:17:53.032034] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:29.595 [2024-06-07 12:17:53.032054] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:29.595 passed 00:15:29.595 Test: blob_persist_test ...passed 00:15:29.595 Test: blob_decouple_snapshot ...passed 00:15:29.853 Test: blob_seek_io_unit ...passed 00:15:29.853 Test: blob_nested_freezes ...passed 00:15:29.853 Test: blob_clone_resize ...passed 00:15:29.853 Test: blob_shallow_copy ...[2024-06-07 12:17:53.479986] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7332:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, blob must be read only 00:15:29.853 [2024-06-07 12:17:53.480404] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7342:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device must have at least blob size 00:15:29.853 [2024-06-07 12:17:53.480658] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7350:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device block size is not compatible with blobstore block size 00:15:30.112 passed 00:15:30.112 Suite: blob_blob_nocopy_extent 00:15:30.112 Test: blob_write ...passed 00:15:30.112 Test: blob_read ...passed 00:15:30.112 Test: blob_rw_verify ...passed 00:15:30.112 Test: blob_rw_verify_iov_nomem ...passed 00:15:30.370 Test: blob_rw_iov_read_only ...passed 00:15:30.370 Test: blob_xattr ...passed 00:15:30.370 Test: blob_dirty_shutdown ...passed 00:15:30.370 Test: blob_is_degraded ...passed 00:15:30.370 Suite: blob_esnap_bs_nocopy_extent 00:15:30.628 Test: blob_esnap_create ...passed 00:15:30.628 Test: blob_esnap_thread_add_remove ...passed 00:15:30.628 Test: blob_esnap_clone_snapshot ...passed 00:15:30.628 Test: blob_esnap_clone_inflate ...passed 00:15:30.628 Test: blob_esnap_clone_decouple ...passed 00:15:30.887 Test: blob_esnap_clone_reload ...passed 00:15:30.887 Test: blob_esnap_hotplug ...passed 00:15:30.887 Test: blob_set_parent ...[2024-06-07 12:17:54.422392] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7613:spdk_bs_blob_set_parent: *ERROR*: snapshot id not valid 00:15:30.887 [2024-06-07 12:17:54.422558] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7619:spdk_bs_blob_set_parent: *ERROR*: blob id and snapshot id cannot be the same 00:15:30.887 [2024-06-07 12:17:54.422676] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7548:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob is not a snapshot 00:15:30.887 [2024-06-07 12:17:54.422721] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7555:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob has a number of clusters different from child's ones 00:15:30.887 [2024-06-07 12:17:54.423067] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7594:bs_set_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:30.887 passed 00:15:30.887 Test: blob_set_external_parent ...[2024-06-07 12:17:54.479969] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7787:spdk_bs_blob_set_external_parent: *ERROR*: blob id and external snapshot id cannot be the same 00:15:30.887 [2024-06-07 12:17:54.480080] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7795:spdk_bs_blob_set_external_parent: *ERROR*: Esnap device size 61440 is not an integer multiple of cluster size 16384 00:15:30.887 [2024-06-07 12:17:54.480106] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7748:bs_set_external_parent_blob_open_cpl: *ERROR*: external snapshot is already the parent of blob 00:15:30.887 [2024-06-07 12:17:54.480383] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7754:bs_set_external_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:30.887 passed 00:15:30.887 Suite: blob_copy_noextent 00:15:30.887 Test: blob_init ...[2024-06-07 12:17:54.499789] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5490:spdk_bs_init: *ERROR*: unsupported dev block length of 500 00:15:30.887 passed 00:15:31.146 Test: blob_thin_provision ...passed 00:15:31.146 Test: blob_read_only ...passed 00:15:31.146 Test: bs_load ...[2024-06-07 12:17:54.576804] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c: 965:blob_parse: *ERROR*: Blobid (0x0) doesn't match what's in metadata (0x100000000) 00:15:31.146 passed 00:15:31.146 Test: bs_load_custom_cluster_size ...passed 00:15:31.146 Test: bs_load_after_failed_grow ...passed 00:15:31.146 Test: bs_cluster_sz ...[2024-06-07 12:17:54.615920] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3824:bs_opts_verify: *ERROR*: Blobstore options cannot be set to 0 00:15:31.146 [2024-06-07 12:17:54.616089] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5621:spdk_bs_init: *ERROR*: Blobstore metadata cannot use more clusters than is available, please decrease number of pages reserved for metadata or increase cluster size. 00:15:31.146 [2024-06-07 12:17:54.616128] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3883:bs_alloc: *ERROR*: Cluster size 4095 is smaller than page size 4096 00:15:31.146 passed 00:15:31.146 Test: bs_resize_md ...passed 00:15:31.146 Test: bs_destroy ...passed 00:15:31.146 Test: bs_type ...passed 00:15:31.146 Test: bs_super_block ...passed 00:15:31.146 Test: bs_test_recover_cluster_count ...passed 00:15:31.146 Test: bs_grow_live ...passed 00:15:31.146 Test: bs_grow_live_no_space ...passed 00:15:31.146 Test: bs_test_grow ...passed 00:15:31.146 Test: blob_serialize_test ...passed 00:15:31.146 Test: super_block_crc ...passed 00:15:31.405 Test: blob_thin_prov_write_count_io ...passed 00:15:31.405 Test: blob_thin_prov_unmap_cluster ...passed 00:15:31.405 Test: bs_load_iter_test ...passed 00:15:31.405 Test: blob_relations ...[2024-06-07 12:17:54.895125] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:31.405 [2024-06-07 12:17:54.895261] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.405 [2024-06-07 12:17:54.895677] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:31.405 [2024-06-07 12:17:54.895711] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.405 passed 00:15:31.405 Test: blob_relations2 ...[2024-06-07 12:17:54.917188] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:31.405 [2024-06-07 12:17:54.917307] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.405 [2024-06-07 12:17:54.917335] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:31.405 [2024-06-07 12:17:54.917353] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.405 [2024-06-07 12:17:54.918121] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: 
Cannot remove snapshot with more than one clone 00:15:31.405 [2024-06-07 12:17:54.918169] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.405 [2024-06-07 12:17:54.918420] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:31.405 [2024-06-07 12:17:54.918472] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.405 passed 00:15:31.405 Test: blob_relations3 ...passed 00:15:31.663 Test: blobstore_clean_power_failure ...passed 00:15:31.663 Test: blob_delete_snapshot_power_failure ...[2024-06-07 12:17:55.187697] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 1 read failed for blobid 0x100000001: -5 00:15:31.663 [2024-06-07 12:17:55.206867] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:31.663 [2024-06-07 12:17:55.206978] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:31.663 [2024-06-07 12:17:55.207004] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.663 [2024-06-07 12:17:55.226069] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 1 read failed for blobid 0x100000001: -5 00:15:31.663 [2024-06-07 12:17:55.226166] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1466:blob_load_snapshot_cpl: *ERROR*: Snapshot fail 00:15:31.664 [2024-06-07 12:17:55.226191] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:31.664 [2024-06-07 12:17:55.226233] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.664 [2024-06-07 12:17:55.245222] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8227:delete_snapshot_sync_snapshot_xattr_cpl: *ERROR*: Failed to sync MD with xattr on blob 00:15:31.664 [2024-06-07 12:17:55.245359] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.664 [2024-06-07 12:17:55.264443] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8096:delete_snapshot_sync_clone_cpl: *ERROR*: Failed to sync MD on clone 00:15:31.664 [2024-06-07 12:17:55.264570] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.664 [2024-06-07 12:17:55.283828] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8040:delete_snapshot_sync_snapshot_cpl: *ERROR*: Failed to sync MD on blob 00:15:31.664 [2024-06-07 12:17:55.283942] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:31.923 passed 00:15:31.923 Test: blob_create_snapshot_power_failure ...[2024-06-07 12:17:55.342403] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:31.923 [2024-06-07 12:17:55.380781] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 1 read failed for blobid 0x100000001: -5 00:15:31.923 [2024-06-07 12:17:55.400129] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6446:bs_clone_snapshot_origblob_cleanup: *ERROR*: Cleanup error -5 00:15:31.923 passed 
00:15:31.923 Test: blob_io_unit ...passed 00:15:31.923 Test: blob_io_unit_compatibility ...passed 00:15:31.923 Test: blob_ext_md_pages ...passed 00:15:31.923 Test: blob_esnap_io_4096_4096 ...passed 00:15:32.185 Test: blob_esnap_io_512_512 ...passed 00:15:32.185 Test: blob_esnap_io_4096_512 ...passed 00:15:32.185 Test: blob_esnap_io_512_4096 ...passed 00:15:32.185 Test: blob_esnap_clone_resize ...passed 00:15:32.185 Suite: blob_bs_copy_noextent 00:15:32.185 Test: blob_open ...passed 00:15:32.185 Test: blob_create ...[2024-06-07 12:17:55.810988] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -28, size in clusters/size: 65 (clusters) 00:15:32.446 passed 00:15:32.446 Test: blob_create_loop ...passed 00:15:32.446 Test: blob_create_fail ...[2024-06-07 12:17:55.942791] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:32.446 passed 00:15:32.446 Test: blob_create_internal ...passed 00:15:32.446 Test: blob_create_zero_extent ...passed 00:15:32.704 Test: blob_snapshot ...passed 00:15:32.704 Test: blob_clone ...passed 00:15:32.704 Test: blob_inflate ...[2024-06-07 12:17:56.229440] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7109:bs_inflate_blob_open_cpl: *ERROR*: Cannot decouple parent of blob with no parent. 00:15:32.704 passed 00:15:32.704 Test: blob_delete ...passed 00:15:32.704 Test: blob_resize_test ...[2024-06-07 12:17:56.338105] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7845:bs_resize_unfreeze_cpl: *ERROR*: Unfreeze failed, ctx->rc=-28 00:15:32.962 passed 00:15:32.962 Test: blob_resize_thin_test ...passed 00:15:32.962 Test: channel_ops ...passed 00:15:32.962 Test: blob_super ...passed 00:15:32.962 Test: blob_rw_verify_iov ...passed 00:15:33.221 Test: blob_unmap ...passed 00:15:33.221 Test: blob_iter ...passed 00:15:33.221 Test: blob_parse_md ...passed 00:15:33.221 Test: bs_load_pending_removal ...passed 00:15:33.221 Test: bs_unload ...[2024-06-07 12:17:56.855026] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5878:spdk_bs_unload: *ERROR*: Blobstore still has open blobs 00:15:33.480 passed 00:15:33.480 Test: bs_usable_clusters ...passed 00:15:33.480 Test: blob_crc ...[2024-06-07 12:17:56.972503] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:33.480 [2024-06-07 12:17:56.972653] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:33.480 passed 00:15:33.480 Test: blob_flags ...passed 00:15:33.480 Test: bs_version ...passed 00:15:33.739 Test: blob_set_xattrs_test ...[2024-06-07 12:17:57.144460] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:33.739 [2024-06-07 12:17:57.144621] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:33.739 passed 00:15:33.739 Test: blob_thin_prov_alloc ...passed 00:15:33.739 Test: blob_insert_cluster_msg_test ...passed 00:15:33.739 Test: blob_thin_prov_rw ...passed 00:15:33.997 Test: blob_thin_prov_rle ...passed 00:15:33.997 Test: blob_thin_prov_rw_iov ...passed 00:15:33.997 Test: blob_snapshot_rw ...passed 00:15:33.997 Test: blob_snapshot_rw_iov ...passed 00:15:34.256 Test: 
blob_inflate_rw ...passed 00:15:34.256 Test: blob_snapshot_freeze_io ...passed 00:15:34.514 Test: blob_operation_split_rw ...passed 00:15:34.514 Test: blob_operation_split_rw_iov ...passed 00:15:34.514 Test: blob_simultaneous_operations ...[2024-06-07 12:17:58.142410] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:34.514 [2024-06-07 12:17:58.142495] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:34.514 [2024-06-07 12:17:58.142919] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:34.514 [2024-06-07 12:17:58.142956] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:34.514 [2024-06-07 12:17:58.145645] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:34.514 [2024-06-07 12:17:58.145688] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:34.514 [2024-06-07 12:17:58.145756] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:34.514 [2024-06-07 12:17:58.145773] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:34.773 passed 00:15:34.773 Test: blob_persist_test ...passed 00:15:34.773 Test: blob_decouple_snapshot ...passed 00:15:34.773 Test: blob_seek_io_unit ...passed 00:15:34.773 Test: blob_nested_freezes ...passed 00:15:35.032 Test: blob_clone_resize ...passed 00:15:35.032 Test: blob_shallow_copy ...[2024-06-07 12:17:58.519599] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7332:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, blob must be read only 00:15:35.032 [2024-06-07 12:17:58.519982] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7342:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device must have at least blob size 00:15:35.032 [2024-06-07 12:17:58.520234] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7350:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device block size is not compatible with blobstore block size 00:15:35.032 passed 00:15:35.032 Suite: blob_blob_copy_noextent 00:15:35.032 Test: blob_write ...passed 00:15:35.032 Test: blob_read ...passed 00:15:35.290 Test: blob_rw_verify ...passed 00:15:35.290 Test: blob_rw_verify_iov_nomem ...passed 00:15:35.290 Test: blob_rw_iov_read_only ...passed 00:15:35.290 Test: blob_xattr ...passed 00:15:35.548 Test: blob_dirty_shutdown ...passed 00:15:35.548 Test: blob_is_degraded ...passed 00:15:35.548 Suite: blob_esnap_bs_copy_noextent 00:15:35.548 Test: blob_esnap_create ...passed 00:15:35.548 Test: blob_esnap_thread_add_remove ...passed 00:15:35.548 Test: blob_esnap_clone_snapshot ...passed 00:15:35.807 Test: blob_esnap_clone_inflate ...passed 00:15:35.807 Test: blob_esnap_clone_decouple ...passed 00:15:35.807 Test: blob_esnap_clone_reload ...passed 00:15:35.807 Test: blob_esnap_hotplug ...passed 00:15:35.807 Test: blob_set_parent ...[2024-06-07 12:17:59.420017] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7613:spdk_bs_blob_set_parent: *ERROR*: snapshot id not valid 00:15:35.807 [2024-06-07 12:17:59.420157] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7619:spdk_bs_blob_set_parent: *ERROR*: blob id and snapshot id cannot be the same 00:15:35.807 [2024-06-07 12:17:59.420241] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7548:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob is not a snapshot 00:15:35.807 [2024-06-07 12:17:59.420266] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7555:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob has a number of clusters different from child's ones 00:15:35.807 [2024-06-07 12:17:59.420515] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7594:bs_set_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:35.807 passed 00:15:36.066 Test: blob_set_external_parent ...[2024-06-07 12:17:59.476675] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7787:spdk_bs_blob_set_external_parent: *ERROR*: blob id and external snapshot id cannot be the same 00:15:36.066 [2024-06-07 12:17:59.476775] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7795:spdk_bs_blob_set_external_parent: *ERROR*: Esnap device size 61440 is not an integer multiple of cluster size 16384 00:15:36.066 [2024-06-07 12:17:59.476794] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7748:bs_set_external_parent_blob_open_cpl: *ERROR*: external snapshot is already the parent of blob 00:15:36.066 [2024-06-07 12:17:59.476998] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7754:bs_set_external_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:36.066 passed 00:15:36.066 Suite: blob_copy_extent 00:15:36.066 Test: blob_init ...[2024-06-07 12:17:59.495476] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5490:spdk_bs_init: *ERROR*: unsupported dev block length of 500 00:15:36.066 passed 00:15:36.066 Test: blob_thin_provision ...passed 00:15:36.066 Test: blob_read_only ...passed 00:15:36.066 Test: bs_load ...[2024-06-07 12:17:59.569015] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c: 965:blob_parse: *ERROR*: Blobid (0x0) doesn't match what's in metadata (0x100000000) 00:15:36.066 passed 00:15:36.066 Test: bs_load_custom_cluster_size ...passed 00:15:36.066 Test: bs_load_after_failed_grow ...passed 00:15:36.067 Test: bs_cluster_sz ...[2024-06-07 12:17:59.606746] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3824:bs_opts_verify: *ERROR*: Blobstore options cannot be set to 0 00:15:36.067 [2024-06-07 12:17:59.606911] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5621:spdk_bs_init: *ERROR*: Blobstore metadata cannot use more clusters than is available, please decrease number of pages reserved for metadata or increase cluster size. 
00:15:36.067 [2024-06-07 12:17:59.606944] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:3883:bs_alloc: *ERROR*: Cluster size 4095 is smaller than page size 4096 00:15:36.067 passed 00:15:36.067 Test: bs_resize_md ...passed 00:15:36.067 Test: bs_destroy ...passed 00:15:36.067 Test: bs_type ...passed 00:15:36.325 Test: bs_super_block ...passed 00:15:36.325 Test: bs_test_recover_cluster_count ...passed 00:15:36.325 Test: bs_grow_live ...passed 00:15:36.325 Test: bs_grow_live_no_space ...passed 00:15:36.325 Test: bs_test_grow ...passed 00:15:36.325 Test: blob_serialize_test ...passed 00:15:36.325 Test: super_block_crc ...passed 00:15:36.325 Test: blob_thin_prov_write_count_io ...passed 00:15:36.325 Test: blob_thin_prov_unmap_cluster ...passed 00:15:36.325 Test: bs_load_iter_test ...passed 00:15:36.325 Test: blob_relations ...[2024-06-07 12:17:59.879587] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:36.325 [2024-06-07 12:17:59.879697] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.325 [2024-06-07 12:17:59.880097] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:36.325 [2024-06-07 12:17:59.880121] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.325 passed 00:15:36.325 Test: blob_relations2 ...[2024-06-07 12:17:59.899873] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:36.325 [2024-06-07 12:17:59.899959] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.325 [2024-06-07 12:17:59.899981] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:36.325 [2024-06-07 12:17:59.899997] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.325 [2024-06-07 12:17:59.900612] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:36.325 [2024-06-07 12:17:59.900646] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.325 [2024-06-07 12:17:59.900845] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8386:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot with more than one clone 00:15:36.325 [2024-06-07 12:17:59.900865] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.325 passed 00:15:36.325 Test: blob_relations3 ...passed 00:15:36.583 Test: blobstore_clean_power_failure ...passed 00:15:36.583 Test: blob_delete_snapshot_power_failure ...[2024-06-07 12:18:00.158972] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 2 read failed for blobid 0x100000002: -5 00:15:36.583 [2024-06-07 12:18:00.177819] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1579:blob_load_cpl_extents_cpl: *ERROR*: Extent page read failed: -5 00:15:36.583 [2024-06-07 12:18:00.196519] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:36.583 [2024-06-07 12:18:00.196620] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:36.583 [2024-06-07 12:18:00.196643] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.583 [2024-06-07 12:18:00.215612] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 2 read failed for blobid 0x100000002: -5 00:15:36.583 [2024-06-07 12:18:00.215683] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1466:blob_load_snapshot_cpl: *ERROR*: Snapshot fail 00:15:36.583 [2024-06-07 12:18:00.215707] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:36.583 [2024-06-07 12:18:00.215730] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.841 [2024-06-07 12:18:00.234955] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1579:blob_load_cpl_extents_cpl: *ERROR*: Extent page read failed: -5 00:15:36.841 [2024-06-07 12:18:00.238094] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1466:blob_load_snapshot_cpl: *ERROR*: Snapshot fail 00:15:36.841 [2024-06-07 12:18:00.238141] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8300:delete_snapshot_open_clone_cb: *ERROR*: Failed to open clone 00:15:36.841 [2024-06-07 12:18:00.238169] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.841 [2024-06-07 12:18:00.257433] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8227:delete_snapshot_sync_snapshot_xattr_cpl: *ERROR*: Failed to sync MD with xattr on blob 00:15:36.841 [2024-06-07 12:18:00.257521] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.841 [2024-06-07 12:18:00.276576] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8096:delete_snapshot_sync_clone_cpl: *ERROR*: Failed to sync MD on clone 00:15:36.841 [2024-06-07 12:18:00.276679] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.841 [2024-06-07 12:18:00.295756] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8040:delete_snapshot_sync_snapshot_cpl: *ERROR*: Failed to sync MD on blob 00:15:36.841 [2024-06-07 12:18:00.295840] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:36.841 passed 00:15:36.841 Test: blob_create_snapshot_power_failure ...[2024-06-07 12:18:00.351982] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 0 read failed for blobid 0x100000000: -5 00:15:36.841 [2024-06-07 12:18:00.370656] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1579:blob_load_cpl_extents_cpl: *ERROR*: Extent page read failed: -5 00:15:36.841 [2024-06-07 12:18:00.407619] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1669:blob_load_cpl: *ERROR*: Metadata page 2 read failed for blobid 0x100000002: -5 00:15:36.841 [2024-06-07 12:18:00.426311] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6446:bs_clone_snapshot_origblob_cleanup: *ERROR*: Cleanup error -5 00:15:36.841 passed 00:15:37.100 Test: blob_io_unit ...passed 00:15:37.100 Test: blob_io_unit_compatibility ...passed 00:15:37.100 Test: blob_ext_md_pages ...passed 00:15:37.100 Test: blob_esnap_io_4096_4096 ...passed 00:15:37.100 Test: blob_esnap_io_512_512 ...passed 00:15:37.100 Test: blob_esnap_io_4096_512 ...passed 00:15:37.100 Test: 
blob_esnap_io_512_4096 ...passed 00:15:37.100 Test: blob_esnap_clone_resize ...passed 00:15:37.100 Suite: blob_bs_copy_extent 00:15:37.359 Test: blob_open ...passed 00:15:37.359 Test: blob_create ...[2024-06-07 12:18:00.831146] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -28, size in clusters/size: 65 (clusters) 00:15:37.359 passed 00:15:37.359 Test: blob_create_loop ...passed 00:15:37.359 Test: blob_create_fail ...[2024-06-07 12:18:00.964522] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:37.359 passed 00:15:37.617 Test: blob_create_internal ...passed 00:15:37.617 Test: blob_create_zero_extent ...passed 00:15:37.617 Test: blob_snapshot ...passed 00:15:37.617 Test: blob_clone ...passed 00:15:37.617 Test: blob_inflate ...[2024-06-07 12:18:01.246029] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7109:bs_inflate_blob_open_cpl: *ERROR*: Cannot decouple parent of blob with no parent. 00:15:37.617 passed 00:15:37.875 Test: blob_delete ...passed 00:15:37.875 Test: blob_resize_test ...[2024-06-07 12:18:01.353734] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7845:bs_resize_unfreeze_cpl: *ERROR*: Unfreeze failed, ctx->rc=-28 00:15:37.875 passed 00:15:37.875 Test: blob_resize_thin_test ...passed 00:15:37.875 Test: channel_ops ...passed 00:15:38.133 Test: blob_super ...passed 00:15:38.133 Test: blob_rw_verify_iov ...passed 00:15:38.133 Test: blob_unmap ...passed 00:15:38.133 Test: blob_iter ...passed 00:15:38.133 Test: blob_parse_md ...passed 00:15:38.391 Test: bs_load_pending_removal ...passed 00:15:38.391 Test: bs_unload ...[2024-06-07 12:18:01.857505] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:5878:spdk_bs_unload: *ERROR*: Blobstore still has open blobs 00:15:38.391 passed 00:15:38.391 Test: bs_usable_clusters ...passed 00:15:38.391 Test: blob_crc ...[2024-06-07 12:18:01.967926] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:38.391 [2024-06-07 12:18:01.968066] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:1678:blob_load_cpl: *ERROR*: Metadata page 0 crc mismatch for blobid 0x100000000 00:15:38.391 passed 00:15:38.650 Test: blob_flags ...passed 00:15:38.650 Test: bs_version ...passed 00:15:38.650 Test: blob_set_xattrs_test ...[2024-06-07 12:18:02.137396] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:38.650 [2024-06-07 12:18:02.137528] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:6327:bs_create_blob: *ERROR*: Failed to create blob: Unknown error -22, size in clusters/size: 0 (clusters) 00:15:38.650 passed 00:15:38.650 Test: blob_thin_prov_alloc ...passed 00:15:38.925 Test: blob_insert_cluster_msg_test ...passed 00:15:38.925 Test: blob_thin_prov_rw ...passed 00:15:38.925 Test: blob_thin_prov_rle ...passed 00:15:38.925 Test: blob_thin_prov_rw_iov ...passed 00:15:38.925 Test: blob_snapshot_rw ...passed 00:15:39.182 Test: blob_snapshot_rw_iov ...passed 00:15:39.182 Test: blob_inflate_rw ...passed 00:15:39.182 Test: blob_snapshot_freeze_io ...passed 00:15:39.440 Test: blob_operation_split_rw ...passed 00:15:39.698 Test: blob_operation_split_rw_iov ...passed 00:15:39.698 Test: blob_simultaneous_operations ...[2024-06-07 12:18:03.124835] 
/home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:39.698 [2024-06-07 12:18:03.124949] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:39.698 [2024-06-07 12:18:03.125425] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:39.698 [2024-06-07 12:18:03.125468] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:39.698 [2024-06-07 12:18:03.128282] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:39.698 [2024-06-07 12:18:03.128327] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:39.698 [2024-06-07 12:18:03.128409] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8413:bs_is_blob_deletable: *ERROR*: Cannot remove snapshot because it is open 00:15:39.698 [2024-06-07 12:18:03.128428] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:8353:bs_delete_blob_finish: *ERROR*: Failed to remove blob 00:15:39.698 passed 00:15:39.698 Test: blob_persist_test ...passed 00:15:39.698 Test: blob_decouple_snapshot ...passed 00:15:39.698 Test: blob_seek_io_unit ...passed 00:15:39.956 Test: blob_nested_freezes ...passed 00:15:39.956 Test: blob_clone_resize ...passed 00:15:39.956 Test: blob_shallow_copy ...[2024-06-07 12:18:03.505397] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7332:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, blob must be read only 00:15:39.956 [2024-06-07 12:18:03.505745] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7342:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device must have at least blob size 00:15:39.956 [2024-06-07 12:18:03.506038] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7350:bs_shallow_copy_blob_open_cpl: *ERROR*: blob 0x100000000 shallow copy, external device block size is not compatible with blobstore block size 00:15:39.956 passed 00:15:39.956 Suite: blob_blob_copy_extent 00:15:39.956 Test: blob_write ...passed 00:15:40.214 Test: blob_read ...passed 00:15:40.214 Test: blob_rw_verify ...passed 00:15:40.214 Test: blob_rw_verify_iov_nomem ...passed 00:15:40.214 Test: blob_rw_iov_read_only ...passed 00:15:40.473 Test: blob_xattr ...passed 00:15:40.473 Test: blob_dirty_shutdown ...passed 00:15:40.473 Test: blob_is_degraded ...passed 00:15:40.473 Suite: blob_esnap_bs_copy_extent 00:15:40.473 Test: blob_esnap_create ...passed 00:15:40.473 Test: blob_esnap_thread_add_remove ...passed 00:15:40.732 Test: blob_esnap_clone_snapshot ...passed 00:15:40.732 Test: blob_esnap_clone_inflate ...passed 00:15:40.732 Test: blob_esnap_clone_decouple ...passed 00:15:40.732 Test: blob_esnap_clone_reload ...passed 00:15:40.990 Test: blob_esnap_hotplug ...passed 00:15:40.990 Test: blob_set_parent ...[2024-06-07 12:18:04.430343] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7613:spdk_bs_blob_set_parent: *ERROR*: snapshot id not valid 00:15:40.990 [2024-06-07 12:18:04.430436] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7619:spdk_bs_blob_set_parent: *ERROR*: blob id and snapshot id cannot be the same 00:15:40.990 [2024-06-07 12:18:04.430541] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7548:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob is not a snapshot 00:15:40.990 
[2024-06-07 12:18:04.430585] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7555:bs_set_parent_snapshot_open_cpl: *ERROR*: parent blob has a number of clusters different from child's ones 00:15:40.990 [2024-06-07 12:18:04.430961] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7594:bs_set_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:40.990 passed 00:15:40.990 Test: blob_set_external_parent ...[2024-06-07 12:18:04.488648] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7787:spdk_bs_blob_set_external_parent: *ERROR*: blob id and external snapshot id cannot be the same 00:15:40.990 [2024-06-07 12:18:04.488804] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7795:spdk_bs_blob_set_external_parent: *ERROR*: Esnap device size 61440 is not an integer multiple of cluster size 16384 00:15:40.990 [2024-06-07 12:18:04.488831] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7748:bs_set_external_parent_blob_open_cpl: *ERROR*: external snapshot is already the parent of blob 00:15:40.990 [2024-06-07 12:18:04.489153] /home/vagrant/spdk_repo/spdk/lib/blob/blobstore.c:7754:bs_set_external_parent_blob_open_cpl: *ERROR*: blob is not thin-provisioned 00:15:40.990 passed 00:15:40.990 00:15:40.990 Run Summary: Type Total Ran Passed Failed Inactive 00:15:40.990 suites 16 16 n/a 0 0 00:15:40.990 tests 376 376 376 0 0 00:15:40.990 asserts 143965 143965 143965 0 n/a 00:15:40.990 00:15:40.990 Elapsed time = 20.402 seconds 00:15:40.990 12:18:04 unittest.unittest_blob_blobfs -- unit/unittest.sh@42 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/blob/blob_bdev.c/blob_bdev_ut 00:15:40.990 00:15:40.990 00:15:40.990 CUnit - A unit testing framework for C - Version 2.1-3 00:15:40.990 http://cunit.sourceforge.net/ 00:15:40.990 00:15:40.990 00:15:40.990 Suite: blob_bdev 00:15:40.990 Test: create_bs_dev ...passed 00:15:40.990 Test: create_bs_dev_ro ...[2024-06-07 12:18:04.620184] /home/vagrant/spdk_repo/spdk/module/blob/bdev/blob_bdev.c: 529:spdk_bdev_create_bs_dev: *ERROR*: bdev name 'nope': unsupported options 00:15:40.990 passed 00:15:40.990 Test: create_bs_dev_rw ...passed 00:15:40.990 Test: claim_bs_dev ...[2024-06-07 12:18:04.620840] /home/vagrant/spdk_repo/spdk/module/blob/bdev/blob_bdev.c: 340:spdk_bs_bdev_claim: *ERROR*: could not claim bs dev 00:15:40.990 passed 00:15:40.990 Test: claim_bs_dev_ro ...passed 00:15:40.990 Test: deferred_destroy_refs ...passed 00:15:40.990 Test: deferred_destroy_channels ...passed 00:15:40.990 Test: deferred_destroy_threads ...passed 00:15:40.990 00:15:40.990 Run Summary: Type Total Ran Passed Failed Inactive 00:15:40.990 suites 1 1 n/a 0 0 00:15:40.990 tests 8 8 8 0 0 00:15:40.990 asserts 119 119 119 0 n/a 00:15:40.990 00:15:40.990 Elapsed time = 0.001 seconds 00:15:40.990 12:18:04 unittest.unittest_blob_blobfs -- unit/unittest.sh@43 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/blobfs/tree.c/tree_ut 00:15:41.249 00:15:41.249 00:15:41.249 CUnit - A unit testing framework for C - Version 2.1-3 00:15:41.249 http://cunit.sourceforge.net/ 00:15:41.249 00:15:41.249 00:15:41.249 Suite: tree 00:15:41.249 Test: blobfs_tree_op_test ...passed 00:15:41.249 00:15:41.249 Run Summary: Type Total Ran Passed Failed Inactive 00:15:41.249 suites 1 1 n/a 0 0 00:15:41.249 tests 1 1 1 0 0 00:15:41.249 asserts 27 27 27 0 n/a 00:15:41.249 00:15:41.249 Elapsed time = 0.000 seconds 00:15:41.249 12:18:04 unittest.unittest_blob_blobfs -- unit/unittest.sh@44 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/blobfs/blobfs_async_ut/blobfs_async_ut 00:15:41.249 00:15:41.249 00:15:41.249 
CUnit - A unit testing framework for C - Version 2.1-3 00:15:41.249 http://cunit.sourceforge.net/ 00:15:41.249 00:15:41.249 00:15:41.249 Suite: blobfs_async_ut 00:15:41.249 Test: fs_init ...passed 00:15:41.249 Test: fs_open ...passed 00:15:41.249 Test: fs_create ...passed 00:15:41.249 Test: fs_truncate ...passed 00:15:41.249 Test: fs_rename ...[2024-06-07 12:18:04.826769] /home/vagrant/spdk_repo/spdk/lib/blobfs/blobfs.c:1478:spdk_fs_delete_file_async: *ERROR*: Cannot find the file=file1 to deleted 00:15:41.249 passed 00:15:41.249 Test: fs_rw_async ...passed 00:15:41.249 Test: fs_writev_readv_async ...passed 00:15:41.249 Test: tree_find_buffer_ut ...passed 00:15:41.249 Test: channel_ops ...passed 00:15:41.507 Test: channel_ops_sync ...passed 00:15:41.507 00:15:41.507 Run Summary: Type Total Ran Passed Failed Inactive 00:15:41.507 suites 1 1 n/a 0 0 00:15:41.507 tests 10 10 10 0 0 00:15:41.507 asserts 292 292 292 0 n/a 00:15:41.507 00:15:41.507 Elapsed time = 0.217 seconds 00:15:41.507 12:18:04 unittest.unittest_blob_blobfs -- unit/unittest.sh@46 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/blobfs/blobfs_sync_ut/blobfs_sync_ut 00:15:41.507 00:15:41.507 00:15:41.507 CUnit - A unit testing framework for C - Version 2.1-3 00:15:41.507 http://cunit.sourceforge.net/ 00:15:41.507 00:15:41.507 00:15:41.507 Suite: blobfs_sync_ut 00:15:41.507 Test: cache_read_after_write ...[2024-06-07 12:18:05.024298] /home/vagrant/spdk_repo/spdk/lib/blobfs/blobfs.c:1478:spdk_fs_delete_file_async: *ERROR*: Cannot find the file=testfile to deleted 00:15:41.507 passed 00:15:41.507 Test: file_length ...passed 00:15:41.507 Test: append_write_to_extend_blob ...passed 00:15:41.507 Test: partial_buffer ...passed 00:15:41.507 Test: cache_write_null_buffer ...passed 00:15:41.507 Test: fs_create_sync ...passed 00:15:41.767 Test: fs_rename_sync ...passed 00:15:41.767 Test: cache_append_no_cache ...passed 00:15:41.767 Test: fs_delete_file_without_close ...passed 00:15:41.767 00:15:41.767 Run Summary: Type Total Ran Passed Failed Inactive 00:15:41.767 suites 1 1 n/a 0 0 00:15:41.767 tests 9 9 9 0 0 00:15:41.767 asserts 345 345 345 0 n/a 00:15:41.767 00:15:41.767 Elapsed time = 0.437 seconds 00:15:41.767 12:18:05 unittest.unittest_blob_blobfs -- unit/unittest.sh@47 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/blobfs/blobfs_bdev.c/blobfs_bdev_ut 00:15:41.767 00:15:41.767 00:15:41.767 CUnit - A unit testing framework for C - Version 2.1-3 00:15:41.767 http://cunit.sourceforge.net/ 00:15:41.767 00:15:41.767 00:15:41.767 Suite: blobfs_bdev_ut 00:15:41.767 Test: spdk_blobfs_bdev_detect_test ...[2024-06-07 12:18:05.263895] /home/vagrant/spdk_repo/spdk/module/blobfs/bdev/blobfs_bdev.c: 59:_blobfs_bdev_unload_cb: *ERROR*: Failed to unload blobfs on bdev ut_bdev: errno -1 00:15:41.767 passed 00:15:41.767 Test: spdk_blobfs_bdev_create_test ...passed 00:15:41.767 Test: spdk_blobfs_bdev_mount_test ...passed 00:15:41.767 00:15:41.767 [2024-06-07 12:18:05.264337] /home/vagrant/spdk_repo/spdk/module/blobfs/bdev/blobfs_bdev.c: 59:_blobfs_bdev_unload_cb: *ERROR*: Failed to unload blobfs on bdev ut_bdev: errno -1 00:15:41.767 Run Summary: Type Total Ran Passed Failed Inactive 00:15:41.767 suites 1 1 n/a 0 0 00:15:41.767 tests 3 3 3 0 0 00:15:41.767 asserts 9 9 9 0 n/a 00:15:41.767 00:15:41.767 Elapsed time = 0.001 seconds 00:15:41.767 00:15:41.767 real 0m21.294s 00:15:41.767 user 0m20.587s 00:15:41.767 sys 0m0.855s 00:15:41.767 12:18:05 unittest.unittest_blob_blobfs -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:41.767 12:18:05 
unittest.unittest_blob_blobfs -- common/autotest_common.sh@10 -- # set +x 00:15:41.767 ************************************ 00:15:41.767 END TEST unittest_blob_blobfs 00:15:41.767 ************************************ 00:15:41.767 12:18:05 unittest -- unit/unittest.sh@234 -- # run_test unittest_event unittest_event 00:15:41.767 12:18:05 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:41.767 12:18:05 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:41.767 12:18:05 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:41.767 ************************************ 00:15:41.767 START TEST unittest_event 00:15:41.767 ************************************ 00:15:41.767 12:18:05 unittest.unittest_event -- common/autotest_common.sh@1124 -- # unittest_event 00:15:41.767 12:18:05 unittest.unittest_event -- unit/unittest.sh@51 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/event/app.c/app_ut 00:15:41.767 00:15:41.767 00:15:41.767 CUnit - A unit testing framework for C - Version 2.1-3 00:15:41.767 http://cunit.sourceforge.net/ 00:15:41.767 00:15:41.767 00:15:41.767 Suite: app_suite 00:15:41.767 Test: test_spdk_app_parse_args ...app_ut [options] 00:15:41.767 00:15:41.767 CPU options: 00:15:41.767 -m, --cpumask core mask (like 0xF) or core list of '[]' embraced for DPDK 00:15:41.767 (like [0,1,10]) 00:15:41.767 --lcores lcore to CPU mapping list. The list is in the format: 00:15:41.767 [<,lcores[@CPUs]>...] 00:15:41.767 lcores and cpus list are grouped by '(' and ')', e.g '--lcores "(5-7)@(10-12)"' 00:15:41.767 Within the group, '-' is used for range separator, 00:15:41.767 ',' is used for single number separator. 00:15:41.767 '( )' can be omitted for single element group, 00:15:41.767 '@' can be omitted if cpus and lcores have the same value 00:15:41.767 --disable-cpumask-locks Disable CPU core lock files. 00:15:41.767 --interrupt-mode set app to interrupt mode (Warning: CPU usage will be reduced only if all 00:15:41.767 pollers in the app support interrupt mode) 00:15:41.767 -p, --main-core main (primary) core for DPDK 00:15:41.767 00:15:41.767 Configuration options: 00:15:41.767 -c, --config, --json JSON config file 00:15:41.767 -r, --rpc-socket RPC listen address (default /var/tmp/spdk.sock) 00:15:41.767 --no-rpc-server skip RPC server initialization. This option ignores '--rpc-socket' value. 
00:15:41.767 --wait-for-rpc wait for RPCs to initialize subsystems 00:15:41.767 --rpcs-allowed comma-separated list of permitted RPCS 00:15:41.767 --json-ignore-init-errors don't exit on invalid config entry 00:15:41.767 00:15:41.767 Memory options: 00:15:41.767 --iova-mode set IOVA mode ('pa' for IOVA_PA and 'va' for IOVA_VA) 00:15:41.767 --base-virtaddr the base virtual address for DPDK (default: 0x200000000000) 00:15:41.767 --huge-dir use a specific hugetlbfs mount to reserve memory from 00:15:41.767 -R, --huge-unlink unlink huge files after initialization 00:15:41.767 -n, --mem-channels number of memory channels used for DPDK 00:15:41.767 -s, --mem-size memory size in MB for DPDK (default: 0MB) 00:15:41.767 --msg-mempool-size global message memory pool size in count (default: 262143) 00:15:41.767 --no-huge run without using hugepages 00:15:41.767 -i, --shm-id shared memory ID (optional) 00:15:41.767 -g, --single-file-segments force creating just one hugetlbfs file 00:15:41.767 00:15:41.767 PCI options: 00:15:41.767 -A, --pci-allowed pci addr to allow (-B and -A cannot be used at the same time) 00:15:41.767 -B, --pci-blocked pci addr to block (can be used more than once) 00:15:41.767 app_ut: invalid option -- 'z' 00:15:41.767 -u, --no-pci disable PCI access 00:15:41.767 --vfio-vf-token VF token (UUID) shared between SR-IOV PF and VFs for vfio_pci driver 00:15:41.767 00:15:41.767 Log options: 00:15:41.767 -L, --logflag enable log flag (all, app_rpc, json_util, rpc, thread, trace) 00:15:41.767 --silence-noticelog disable notice level logging to stderr 00:15:41.767 00:15:41.767 Trace options: 00:15:41.767 --num-trace-entries number of trace entries for each core, must be power of 2, 00:15:41.767 setting 0 to disable trace (default 32768) 00:15:41.767 Tracepoints vary in size and can use more than one trace entry. 00:15:41.767 -e, --tpoint-group [:] 00:15:41.767 group_name - tracepoint group name for spdk trace buffers (thread, all). 00:15:41.767 tpoint_mask - tracepoint mask for enabling individual tpoints inside 00:15:41.767 a tracepoint group. First tpoint inside a group can be enabled by 00:15:41.767 setting tpoint_mask to 1 (e.g. bdev:0x1). Groups and masks can be 00:15:41.767 combined (e.g. thread,bdev:0x1). All available tpoints can be found 00:15:41.767 in /include/spdk_internal/trace_defs.h 00:15:41.767 00:15:41.767 Other options: 00:15:41.767 -h, --help show this usage 00:15:41.767 -v, --version print SPDK version 00:15:41.767 -d, --limit-coredump do not set max coredump size to RLIM_INFINITY 00:15:41.767 --env-context Opaque context for use of the env implementation 00:15:41.767 app_ut [options] 00:15:41.767 00:15:41.767 CPU options: 00:15:41.767 -m, --cpumask core mask (like 0xF) or core list of '[]' embraced for DPDK 00:15:41.767 (like [0,1,10]) 00:15:41.767 --lcores lcore to CPU mapping list. The list is in the format: 00:15:41.767 [<,lcores[@CPUs]>...] 00:15:41.767 lcores and cpus list are grouped by '(' and ')', e.g '--lcores "(5-7)@(10-12)"' 00:15:41.767 Within the group, '-' is used for range separator, 00:15:41.767 ',' is used for single number separator. 00:15:41.767 '( )' can be omitted for single element group, 00:15:41.767 '@' can be omitted if cpus and lcores have the same value 00:15:41.767 --disable-cpumask-locks Disable CPU core lock files. 
00:15:41.767 --interrupt-mode set app to interrupt mode (Warning: CPU usage will be reduced only if all 00:15:41.767 pollers in the app support interrupt mode) 00:15:41.767 -p, --main-core main (primary) core for DPDK 00:15:41.767 00:15:41.767 Configuration options: 00:15:41.767 -c, --config, --json JSON config file 00:15:41.767 -r, --rpc-socket RPC listen address (default /var/tmp/spdk.sock) 00:15:41.767 --no-rpc-server skip RPC server initialization. This option ignores '--rpc-socket' value. 00:15:41.767 --wait-for-rpc wait for RPCs to initialize subsystems 00:15:41.767 --rpcs-allowed comma-separated list of permitted RPCS 00:15:41.767 --json-ignore-init-errors don't exit on invalid config entry 00:15:41.767 00:15:41.767 Memory options: 00:15:41.767 --iova-mode set IOVA mode ('pa' for IOVA_PA and 'va' for IOVA_VA) 00:15:41.767 --base-virtaddr the base virtual address for DPDK (default: 0x200000000000) 00:15:41.767 --huge-dir use a specific hugetlbfs mount to reserve memory from 00:15:41.767 -R, --huge-unlink unlink huge files after initialization 00:15:41.767 -n, --mem-channels number of memory channels used for DPDK 00:15:41.767 -s, --mem-size memory size in MB for DPDK (default: 0MB) 00:15:41.767 --msg-mempool-size global message memory pool size in count (default: 262143) 00:15:41.767 --no-huge run without using hugepages 00:15:41.767 -i, --shm-id shared memory ID (optional) 00:15:41.767 -g, --single-file-segments force creating just one hugetlbfs file 00:15:41.767 00:15:41.767 PCI options: 00:15:41.767 -A, --pci-allowed pci addr to allow (-B and -A cannot be used at the same time) 00:15:41.767 -B, --pci-blocked pci addr to block (can be used more than once) 00:15:41.767 -u, --no-pci disable PCI access 00:15:41.767 --vfio-vf-token VF token (UUID) shared between SR-IOV PF and VFs for vfio_pci driver 00:15:41.767 00:15:41.767 Log options: 00:15:41.767 -L, --logflag enable log flag (all, app_rpc, json_util, rpc, thread, trace) 00:15:41.767 --silence-noticelog disable notice level logging to stderr 00:15:41.767 00:15:41.767 Trace options:app_ut: unrecognized option '--test-long-opt' 00:15:41.767 00:15:41.767 --num-trace-entries number of trace entries for each core, must be power of 2, 00:15:41.767 setting 0 to disable trace (default 32768) 00:15:41.767 Tracepoints vary in size and can use more than one trace entry. 00:15:41.767 -e, --tpoint-group [:] 00:15:41.767 group_name - tracepoint group name for spdk trace buffers (thread, all). 00:15:41.767 tpoint_mask - tracepoint mask for enabling individual tpoints inside 00:15:41.767 a tracepoint group. First tpoint inside a group can be enabled by 00:15:41.767 setting tpoint_mask to 1 (e.g. bdev:0x1). Groups and masks can be 00:15:41.767 combined (e.g. thread,bdev:0x1). All available tpoints can be found 00:15:41.767 in /include/spdk_internal/trace_defs.h 00:15:41.767 00:15:41.767 Other options: 00:15:41.767 -h, --help show this usage 00:15:41.767 -v, --version print SPDK version 00:15:41.767 -d, --limit-coredump do not set max coredump size to RLIM_INFINITY 00:15:41.767 --env-context Opaque context for use of the env implementation 00:15:41.767 [2024-06-07 12:18:05.352671] /home/vagrant/spdk_repo/spdk/lib/event/app.c:1192:spdk_app_parse_args: *ERROR*: Duplicated option 'c' between app-specific command line parameter and generic spdk opts. 
00:15:41.767 app_ut [options] 00:15:41.767 00:15:41.767 CPU options: 00:15:41.767 -m, --cpumask core mask (like 0xF) or core list of '[]' embraced for DPDK 00:15:41.767 (like [0,1,10]) 00:15:41.767 --lcores lcore to CPU mapping list. The list is in the format: 00:15:41.767 [<,lcores[@CPUs]>...] 00:15:41.767 lcores and cpus list are grouped by '(' and ')', e.g '--lcores "(5-7)@(10-12)"' 00:15:41.767 Within the group, '-' is used for range separator, 00:15:41.767 ',' is used for single number separator. 00:15:41.767 '( )' can be omitted for single element group, 00:15:41.767 '@' can be omitted if cpus and lcores have the same value 00:15:41.767 --disable-cpumask-locks Disable CPU core lock files. 00:15:41.767 --interrupt-mode set app to interrupt mode (Warning: CPU usage will be reduced only if all 00:15:41.767 [2024-06-07 12:18:05.352943] /home/vagrant/spdk_repo/spdk/lib/event/app.c:1373:spdk_app_parse_args: *ERROR*: -B and -W cannot be used at the same time 00:15:41.767 pollers in the app support interrupt mode) 00:15:41.767 -p, --main-core main (primary) core for DPDK 00:15:41.767 00:15:41.767 Configuration options: 00:15:41.767 -c, --config, --json JSON config file 00:15:41.767 -r, --rpc-socket RPC listen address (default /var/tmp/spdk.sock) 00:15:41.768 --no-rpc-server skip RPC server initialization. This option ignores '--rpc-socket' value. 00:15:41.768 --wait-for-rpc wait for RPCs to initialize subsystems 00:15:41.768 --rpcs-allowed comma-separated list of permitted RPCS 00:15:41.768 --json-ignore-init-errors don't exit on invalid config entry 00:15:41.768 00:15:41.768 Memory options: 00:15:41.768 --iova-mode set IOVA mode ('pa' for IOVA_PA and 'va' for IOVA_VA) 00:15:41.768 --base-virtaddr the base virtual address for DPDK (default: 0x200000000000) 00:15:41.768 --huge-dir use a specific hugetlbfs mount to reserve memory from 00:15:41.768 -R, --huge-unlink unlink huge files after initialization 00:15:41.768 -n, --mem-channels number of memory channels used for DPDK 00:15:41.768 -s, --mem-size memory size in MB for DPDK (default: 0MB) 00:15:41.768 --msg-mempool-size global message memory pool size in count (default: 262143) 00:15:41.768 --no-huge run without using hugepages 00:15:41.768 -i, --shm-id shared memory ID (optional) 00:15:41.768 -g, --single-file-segments force creating just one hugetlbfs file 00:15:41.768 00:15:41.768 PCI options: 00:15:41.768 -A, --pci-allowed pci addr to allow (-B and -A cannot be used at the same time) 00:15:41.768 -B, --pci-blocked pci addr to block (can be used more than once) 00:15:41.768 -u, --no-pci disable PCI access 00:15:41.768 --vfio-vf-token VF token (UUID) shared between SR-IOV PF and VFs for vfio_pci driver 00:15:41.768 00:15:41.768 Log options: 00:15:41.768 -L, --logflag enable log flag (all, app_rpc, json_util, rpc, thread, trace) 00:15:41.768 --silence-noticelog disable notice level logging to stderr 00:15:41.768 00:15:41.768 Trace options: 00:15:41.768 --num-trace-entries number of trace entries for each core, must be power of 2, 00:15:41.768 setting 0 to disable trace (default 32768) 00:15:41.768 Tracepoints vary in size and can use more than one trace entry. 00:15:41.768 -e, --tpoint-group [:] 00:15:41.768 group_name - tracepoint group name for spdk trace buffers (thread, all). 00:15:41.768 tpoint_mask - tracepoint mask for enabling individual tpoints inside 00:15:41.768 a tracepoint group. First tpoint inside a group can be enabled by 00:15:41.768 setting tpoint_mask to 1 (e.g. bdev:0x1). 
Groups and masks can be 00:15:41.768 combined (e.g. thread,bdev:0x1). All available tpoints can be found 00:15:41.768 in /include/spdk_internal/trace_defs.h 00:15:41.768 00:15:41.768 Other options: 00:15:41.768 -h, --help show this usage 00:15:41.768 -v, --version print SPDK version 00:15:41.768 -d, --limit-coredump do not set max coredump size to RLIM_INFINITY 00:15:41.768 --env-context Opaque context for use of the env implementation 00:15:41.768 passed 00:15:41.768 00:15:41.768 Run Summary: Type Total Ran Passed Failed Inactive 00:15:41.768 suites 1 1 n/a 0 0 00:15:41.768 tests 1 1 1 0 0 00:15:41.768 asserts 8 8 8 0 n/a 00:15:41.768 00:15:41.768 Elapsed time = 0.001 seconds 00:15:41.768 [2024-06-07 12:18:05.353197] /home/vagrant/spdk_repo/spdk/lib/event/app.c:1278:spdk_app_parse_args: *ERROR*: Invalid main core --single-file-segments 00:15:41.768 12:18:05 unittest.unittest_event -- unit/unittest.sh@52 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/event/reactor.c/reactor_ut 00:15:41.768 00:15:41.768 00:15:41.768 CUnit - A unit testing framework for C - Version 2.1-3 00:15:41.768 http://cunit.sourceforge.net/ 00:15:41.768 00:15:41.768 00:15:41.768 Suite: app_suite 00:15:41.768 Test: test_create_reactor ...passed 00:15:41.768 Test: test_init_reactors ...passed 00:15:41.768 Test: test_event_call ...passed 00:15:41.768 Test: test_schedule_thread ...passed 00:15:41.768 Test: test_reschedule_thread ...passed 00:15:41.768 Test: test_bind_thread ...passed 00:15:41.768 Test: test_for_each_reactor ...passed 00:15:41.768 Test: test_reactor_stats ...passed 00:15:41.768 Test: test_scheduler ...passed 00:15:41.768 Test: test_governor ...passed 00:15:41.768 00:15:41.768 Run Summary: Type Total Ran Passed Failed Inactive 00:15:41.768 suites 1 1 n/a 0 0 00:15:41.768 tests 10 10 10 0 0 00:15:41.768 asserts 344 344 344 0 n/a 00:15:41.768 00:15:41.768 Elapsed time = 0.016 seconds 00:15:42.026 00:15:42.026 real 0m0.091s 00:15:42.026 user 0m0.055s 00:15:42.026 sys 0m0.036s 00:15:42.026 12:18:05 unittest.unittest_event -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:42.026 12:18:05 unittest.unittest_event -- common/autotest_common.sh@10 -- # set +x 00:15:42.026 ************************************ 00:15:42.026 END TEST unittest_event 00:15:42.026 ************************************ 00:15:42.026 12:18:05 unittest -- unit/unittest.sh@235 -- # uname -s 00:15:42.026 12:18:05 unittest -- unit/unittest.sh@235 -- # '[' Linux = Linux ']' 00:15:42.026 12:18:05 unittest -- unit/unittest.sh@236 -- # run_test unittest_ftl unittest_ftl 00:15:42.026 12:18:05 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:42.026 12:18:05 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:42.026 12:18:05 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:42.026 ************************************ 00:15:42.026 START TEST unittest_ftl 00:15:42.026 ************************************ 00:15:42.026 12:18:05 unittest.unittest_ftl -- common/autotest_common.sh@1124 -- # unittest_ftl 00:15:42.026 12:18:05 unittest.unittest_ftl -- unit/unittest.sh@56 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_band.c/ftl_band_ut 00:15:42.026 00:15:42.026 00:15:42.026 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.026 http://cunit.sourceforge.net/ 00:15:42.026 00:15:42.026 00:15:42.026 Suite: ftl_band_suite 00:15:42.026 Test: test_band_block_offset_from_addr_base ...passed 00:15:42.026 Test: test_band_block_offset_from_addr_offset ...passed 00:15:42.026 Test: 
test_band_addr_from_block_offset ...passed 00:15:42.284 Test: test_band_set_addr ...passed 00:15:42.284 Test: test_invalidate_addr ...passed 00:15:42.284 Test: test_next_xfer_addr ...passed 00:15:42.284 00:15:42.284 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.284 suites 1 1 n/a 0 0 00:15:42.284 tests 6 6 6 0 0 00:15:42.284 asserts 30356 30356 30356 0 n/a 00:15:42.284 00:15:42.284 Elapsed time = 0.243 seconds 00:15:42.284 12:18:05 unittest.unittest_ftl -- unit/unittest.sh@57 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_bitmap.c/ftl_bitmap_ut 00:15:42.284 00:15:42.284 00:15:42.284 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.284 http://cunit.sourceforge.net/ 00:15:42.284 00:15:42.284 00:15:42.284 Suite: ftl_bitmap 00:15:42.284 Test: test_ftl_bitmap_create ...[2024-06-07 12:18:05.864077] /home/vagrant/spdk_repo/spdk/lib/ftl/utils/ftl_bitmap.c: 52:ftl_bitmap_create: *ERROR*: Buffer for bitmap must be aligned to 8 bytes 00:15:42.284 passed 00:15:42.284 Test: test_ftl_bitmap_get ...passed 00:15:42.284 Test: test_ftl_bitmap_set ...passed 00:15:42.285 Test: test_ftl_bitmap_clear ...[2024-06-07 12:18:05.864419] /home/vagrant/spdk_repo/spdk/lib/ftl/utils/ftl_bitmap.c: 58:ftl_bitmap_create: *ERROR*: Size of buffer for bitmap must be divisible by 8 bytes 00:15:42.285 passed 00:15:42.285 Test: test_ftl_bitmap_find_first_set ...passed 00:15:42.285 Test: test_ftl_bitmap_find_first_clear ...passed 00:15:42.285 Test: test_ftl_bitmap_count_set ...passed 00:15:42.285 00:15:42.285 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.285 suites 1 1 n/a 0 0 00:15:42.285 tests 7 7 7 0 0 00:15:42.285 asserts 137 137 137 0 n/a 00:15:42.285 00:15:42.285 Elapsed time = 0.001 seconds 00:15:42.285 12:18:05 unittest.unittest_ftl -- unit/unittest.sh@58 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_io.c/ftl_io_ut 00:15:42.285 00:15:42.285 00:15:42.285 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.285 http://cunit.sourceforge.net/ 00:15:42.285 00:15:42.285 00:15:42.285 Suite: ftl_io_suite 00:15:42.285 Test: test_completion ...passed 00:15:42.285 Test: test_multiple_ios ...passed 00:15:42.285 00:15:42.285 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.285 suites 1 1 n/a 0 0 00:15:42.285 tests 2 2 2 0 0 00:15:42.285 asserts 47 47 47 0 n/a 00:15:42.285 00:15:42.285 Elapsed time = 0.003 seconds 00:15:42.285 12:18:05 unittest.unittest_ftl -- unit/unittest.sh@59 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_mngt/ftl_mngt_ut 00:15:42.566 00:15:42.566 00:15:42.566 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.566 http://cunit.sourceforge.net/ 00:15:42.566 00:15:42.566 00:15:42.566 Suite: ftl_mngt 00:15:42.566 Test: test_next_step ...passed 00:15:42.566 Test: test_continue_step ...passed 00:15:42.566 Test: test_get_func_and_step_cntx_alloc ...passed 00:15:42.566 Test: test_fail_step ...passed 00:15:42.566 Test: test_mngt_call_and_call_rollback ...passed 00:15:42.566 Test: test_nested_process_failure ...passed 00:15:42.566 Test: test_call_init_success ...passed 00:15:42.566 Test: test_call_init_failure ...passed 00:15:42.566 00:15:42.566 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.566 suites 1 1 n/a 0 0 00:15:42.566 tests 8 8 8 0 0 00:15:42.566 asserts 196 196 196 0 n/a 00:15:42.566 00:15:42.566 Elapsed time = 0.002 seconds 00:15:42.566 12:18:05 unittest.unittest_ftl -- unit/unittest.sh@60 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_mempool.c/ftl_mempool_ut 00:15:42.566 00:15:42.566 
00:15:42.567 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.567 http://cunit.sourceforge.net/ 00:15:42.567 00:15:42.567 00:15:42.567 Suite: ftl_mempool 00:15:42.567 Test: test_ftl_mempool_create ...passed 00:15:42.567 Test: test_ftl_mempool_get_put ...passed 00:15:42.567 00:15:42.567 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.567 suites 1 1 n/a 0 0 00:15:42.567 tests 2 2 2 0 0 00:15:42.567 asserts 36 36 36 0 n/a 00:15:42.567 00:15:42.567 Elapsed time = 0.000 seconds 00:15:42.567 12:18:05 unittest.unittest_ftl -- unit/unittest.sh@61 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_l2p/ftl_l2p_ut 00:15:42.567 00:15:42.567 00:15:42.567 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.567 http://cunit.sourceforge.net/ 00:15:42.567 00:15:42.567 00:15:42.567 Suite: ftl_addr64_suite 00:15:42.567 Test: test_addr_cached ...passed 00:15:42.567 00:15:42.567 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.567 suites 1 1 n/a 0 0 00:15:42.567 tests 1 1 1 0 0 00:15:42.567 asserts 1536 1536 1536 0 n/a 00:15:42.567 00:15:42.567 Elapsed time = 0.000 seconds 00:15:42.567 12:18:06 unittest.unittest_ftl -- unit/unittest.sh@62 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_sb/ftl_sb_ut 00:15:42.567 00:15:42.567 00:15:42.567 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.567 http://cunit.sourceforge.net/ 00:15:42.567 00:15:42.567 00:15:42.567 Suite: ftl_sb 00:15:42.567 Test: test_sb_crc_v2 ...passed 00:15:42.567 Test: test_sb_crc_v3 ...passed 00:15:42.567 Test: test_sb_v3_md_layout ...[2024-06-07 12:18:06.039187] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 143:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Missing regions 00:15:42.567 [2024-06-07 12:18:06.039526] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 131:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Buffer overflow 00:15:42.567 [2024-06-07 12:18:06.039601] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 115:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Buffer overflow 00:15:42.567 [2024-06-07 12:18:06.039662] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 115:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Buffer overflow 00:15:42.567 [2024-06-07 12:18:06.039727] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 125:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Looping regions found 00:15:42.567 [2024-06-07 12:18:06.039898] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 93:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Unsupported MD region type found 00:15:42.567 [2024-06-07 12:18:06.039984] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 88:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Invalid MD region type found 00:15:42.567 [2024-06-07 12:18:06.040090] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 88:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Invalid MD region type found 00:15:42.567 [2024-06-07 12:18:06.040172] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 125:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Looping regions found 00:15:42.567 [2024-06-07 12:18:06.040270] /home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 105:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Multiple/looping regions found 00:15:42.567 passed 00:15:42.567 Test: test_sb_v5_md_layout ...[2024-06-07 12:18:06.040343] 
/home/vagrant/spdk_repo/spdk/lib/ftl/upgrade/ftl_sb_v3.c: 105:ftl_superblock_v3_md_layout_load_all: *ERROR*: [FTL][(null)] Multiple/looping regions found 00:15:42.567 passed 00:15:42.567 00:15:42.567 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.567 suites 1 1 n/a 0 0 00:15:42.567 tests 4 4 4 0 0 00:15:42.567 asserts 160 160 160 0 n/a 00:15:42.567 00:15:42.567 Elapsed time = 0.002 seconds 00:15:42.567 12:18:06 unittest.unittest_ftl -- unit/unittest.sh@63 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_layout_upgrade/ftl_layout_upgrade_ut 00:15:42.567 00:15:42.567 00:15:42.567 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.567 http://cunit.sourceforge.net/ 00:15:42.567 00:15:42.567 00:15:42.567 Suite: ftl_layout_upgrade 00:15:42.567 Test: test_l2p_upgrade ...passed 00:15:42.567 00:15:42.567 Run Summary: Type Total Ran Passed Failed Inactive 00:15:42.567 suites 1 1 n/a 0 0 00:15:42.567 tests 1 1 1 0 0 00:15:42.567 asserts 152 152 152 0 n/a 00:15:42.567 00:15:42.567 Elapsed time = 0.001 seconds 00:15:42.567 12:18:06 unittest.unittest_ftl -- unit/unittest.sh@64 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ftl/ftl_p2l.c/ftl_p2l_ut 00:15:42.567 00:15:42.567 00:15:42.567 CUnit - A unit testing framework for C - Version 2.1-3 00:15:42.567 http://cunit.sourceforge.net/ 00:15:42.567 00:15:42.567 00:15:42.567 Suite: ftl_p2l_suite 00:15:42.567 Test: test_p2l_num_pages ...passed 00:15:43.151 Test: test_ckpt_issue ...passed 00:15:43.716 Test: test_persist_band_p2l ...passed 00:15:44.649 Test: test_clean_restore_p2l ...passed 00:15:46.026 Test: test_dirty_restore_p2l ...passed 00:15:46.026 00:15:46.026 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.026 suites 1 1 n/a 0 0 00:15:46.026 tests 5 5 5 0 0 00:15:46.026 asserts 10020 10020 10020 0 n/a 00:15:46.026 00:15:46.026 Elapsed time = 3.275 seconds 00:15:46.026 00:15:46.026 real 0m3.916s 00:15:46.026 user 0m1.284s 00:15:46.026 sys 0m2.616s 00:15:46.026 12:18:09 unittest.unittest_ftl -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.026 ************************************ 00:15:46.026 END TEST unittest_ftl 00:15:46.026 12:18:09 unittest.unittest_ftl -- common/autotest_common.sh@10 -- # set +x 00:15:46.026 ************************************ 00:15:46.026 12:18:09 unittest -- unit/unittest.sh@239 -- # run_test unittest_accel /home/vagrant/spdk_repo/spdk/test/unit/lib/accel/accel.c/accel_ut 00:15:46.026 12:18:09 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.026 12:18:09 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.026 12:18:09 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.026 ************************************ 00:15:46.026 START TEST unittest_accel 00:15:46.026 ************************************ 00:15:46.026 12:18:09 unittest.unittest_accel -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/accel/accel.c/accel_ut 00:15:46.026 00:15:46.026 00:15:46.026 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.026 http://cunit.sourceforge.net/ 00:15:46.026 00:15:46.026 00:15:46.026 Suite: accel_sequence 00:15:46.026 Test: test_sequence_fill_copy ...passed 00:15:46.026 Test: test_sequence_abort ...passed 00:15:46.026 Test: test_sequence_append_error ...passed 00:15:46.026 Test: test_sequence_completion_error ...[2024-06-07 12:18:09.506380] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1931:accel_sequence_task_cb: *ERROR*: Failed to execute fill operation, sequence: 0x7fc5842547c0 
00:15:46.026 [2024-06-07 12:18:09.506676] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1931:accel_sequence_task_cb: *ERROR*: Failed to execute decompress operation, sequence: 0x7fc5842547c0 00:15:46.026 [2024-06-07 12:18:09.506783] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1841:accel_process_sequence: *ERROR*: Failed to submit fill operation, sequence: 0x7fc5842547c0 00:15:46.026 [2024-06-07 12:18:09.506825] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1841:accel_process_sequence: *ERROR*: Failed to submit decompress operation, sequence: 0x7fc5842547c0 00:15:46.026 passed 00:15:46.026 Test: test_sequence_decompress ...passed 00:15:46.026 Test: test_sequence_reverse ...passed 00:15:46.026 Test: test_sequence_copy_elision ...passed 00:15:46.026 Test: test_sequence_accel_buffers ...passed 00:15:46.026 Test: test_sequence_memory_domain ...[2024-06-07 12:18:09.515688] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1733:accel_task_pull_data: *ERROR*: Failed to pull data from memory domain: UT_DMA, rc: -7 00:15:46.026 passed 00:15:46.027 Test: test_sequence_module_memory_domain ...[2024-06-07 12:18:09.515853] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1772:accel_task_push_data: *ERROR*: Failed to push data to memory domain: UT_DMA, rc: -98 00:15:46.027 passed 00:15:46.027 Test: test_sequence_crypto ...passed 00:15:46.027 Test: test_sequence_driver ...[2024-06-07 12:18:09.520969] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1880:accel_process_sequence: *ERROR*: Failed to execute sequence: 0x7fc5830ca7c0 using driver: ut 00:15:46.027 [2024-06-07 12:18:09.521082] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c:1944:accel_sequence_task_cb: *ERROR*: Failed to execute fill operation, sequence: 0x7fc5830ca7c0 through driver: ut 00:15:46.027 passed 00:15:46.027 Test: test_sequence_same_iovs ...passed 00:15:46.027 Test: test_sequence_crc32 ...passed 00:15:46.027 Suite: accel 00:15:46.027 Test: test_spdk_accel_task_complete ...passed 00:15:46.027 Test: test_get_task ...passed 00:15:46.027 Test: test_spdk_accel_submit_copy ...passed 00:15:46.027 Test: test_spdk_accel_submit_dualcast ...[2024-06-07 12:18:09.524947] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c: 416:spdk_accel_submit_dualcast: *ERROR*: Dualcast requires 4K alignment on dst addresses 00:15:46.027 passed 00:15:46.027 Test: test_spdk_accel_submit_compare ...passed 00:15:46.027 Test: test_spdk_accel_submit_fill ...passed 00:15:46.027 Test: test_spdk_accel_submit_crc32c ...passed 00:15:46.027 Test: test_spdk_accel_submit_crc32cv ...[2024-06-07 12:18:09.525008] /home/vagrant/spdk_repo/spdk/lib/accel/accel.c: 416:spdk_accel_submit_dualcast: *ERROR*: Dualcast requires 4K alignment on dst addresses 00:15:46.027 passed 00:15:46.027 Test: test_spdk_accel_submit_copy_crc32c ...passed 00:15:46.027 Test: test_spdk_accel_submit_xor ...passed 00:15:46.027 Test: test_spdk_accel_module_find_by_name ...passed 00:15:46.027 Test: test_spdk_accel_module_register ...passed 00:15:46.027 00:15:46.027 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.027 suites 2 2 n/a 0 0 00:15:46.027 tests 26 26 26 0 0 00:15:46.027 asserts 830 830 830 0 n/a 00:15:46.027 00:15:46.027 Elapsed time = 0.026 seconds 00:15:46.027 00:15:46.027 real 0m0.068s 00:15:46.027 user 0m0.027s 00:15:46.027 sys 0m0.041s 00:15:46.027 12:18:09 unittest.unittest_accel -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.027 12:18:09 unittest.unittest_accel -- common/autotest_common.sh@10 -- # set +x 00:15:46.027 ************************************ 00:15:46.027 END 
TEST unittest_accel 00:15:46.027 ************************************ 00:15:46.027 12:18:09 unittest -- unit/unittest.sh@240 -- # run_test unittest_ioat /home/vagrant/spdk_repo/spdk/test/unit/lib/ioat/ioat.c/ioat_ut 00:15:46.027 12:18:09 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.027 12:18:09 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.027 12:18:09 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.027 ************************************ 00:15:46.027 START TEST unittest_ioat 00:15:46.027 ************************************ 00:15:46.027 12:18:09 unittest.unittest_ioat -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/ioat/ioat.c/ioat_ut 00:15:46.027 00:15:46.027 00:15:46.027 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.027 http://cunit.sourceforge.net/ 00:15:46.027 00:15:46.027 00:15:46.027 Suite: ioat 00:15:46.027 Test: ioat_state_check ...passed 00:15:46.027 00:15:46.027 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.027 suites 1 1 n/a 0 0 00:15:46.027 tests 1 1 1 0 0 00:15:46.027 asserts 32 32 32 0 n/a 00:15:46.027 00:15:46.027 Elapsed time = 0.000 seconds 00:15:46.027 00:15:46.027 real 0m0.030s 00:15:46.027 user 0m0.022s 00:15:46.027 sys 0m0.008s 00:15:46.027 12:18:09 unittest.unittest_ioat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.027 12:18:09 unittest.unittest_ioat -- common/autotest_common.sh@10 -- # set +x 00:15:46.027 ************************************ 00:15:46.027 END TEST unittest_ioat 00:15:46.027 ************************************ 00:15:46.333 12:18:09 unittest -- unit/unittest.sh@241 -- # grep -q '#define SPDK_CONFIG_IDXD 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:46.333 12:18:09 unittest -- unit/unittest.sh@242 -- # run_test unittest_idxd_user /home/vagrant/spdk_repo/spdk/test/unit/lib/idxd/idxd_user.c/idxd_user_ut 00:15:46.333 12:18:09 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.333 12:18:09 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.333 12:18:09 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.333 ************************************ 00:15:46.333 START TEST unittest_idxd_user 00:15:46.333 ************************************ 00:15:46.333 12:18:09 unittest.unittest_idxd_user -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/idxd/idxd_user.c/idxd_user_ut 00:15:46.333 00:15:46.333 00:15:46.333 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.333 http://cunit.sourceforge.net/ 00:15:46.333 00:15:46.333 00:15:46.333 Suite: idxd_user 00:15:46.333 Test: test_idxd_wait_cmd ...[2024-06-07 12:18:09.718691] /home/vagrant/spdk_repo/spdk/lib/idxd/idxd_user.c: 52:idxd_wait_cmd: *ERROR*: Command status reg reports error 0x1 00:15:46.333 [2024-06-07 12:18:09.719463] /home/vagrant/spdk_repo/spdk/lib/idxd/idxd_user.c: 46:idxd_wait_cmd: *ERROR*: Command timeout, waited 1 00:15:46.333 passed 00:15:46.333 Test: test_idxd_reset_dev ...[2024-06-07 12:18:09.719868] /home/vagrant/spdk_repo/spdk/lib/idxd/idxd_user.c: 52:idxd_wait_cmd: *ERROR*: Command status reg reports error 0x1 00:15:46.333 [2024-06-07 12:18:09.719967] /home/vagrant/spdk_repo/spdk/lib/idxd/idxd_user.c: 132:idxd_reset_dev: *ERROR*: Error resetting device 4294967274 00:15:46.333 passed 00:15:46.333 Test: test_idxd_group_config ...passed 00:15:46.333 Test: test_idxd_wq_config ...passed 00:15:46.333 00:15:46.333 Run Summary: Type Total Ran Passed 
Failed Inactive 00:15:46.333 suites 1 1 n/a 0 0 00:15:46.333 tests 4 4 4 0 0 00:15:46.333 asserts 20 20 20 0 n/a 00:15:46.333 00:15:46.333 Elapsed time = 0.001 seconds 00:15:46.333 00:15:46.333 real 0m0.035s 00:15:46.333 user 0m0.016s 00:15:46.333 sys 0m0.019s 00:15:46.333 12:18:09 unittest.unittest_idxd_user -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.333 12:18:09 unittest.unittest_idxd_user -- common/autotest_common.sh@10 -- # set +x 00:15:46.333 ************************************ 00:15:46.333 END TEST unittest_idxd_user 00:15:46.333 ************************************ 00:15:46.333 12:18:09 unittest -- unit/unittest.sh@244 -- # run_test unittest_iscsi unittest_iscsi 00:15:46.333 12:18:09 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.333 12:18:09 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.333 12:18:09 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.333 ************************************ 00:15:46.333 START TEST unittest_iscsi 00:15:46.333 ************************************ 00:15:46.333 12:18:09 unittest.unittest_iscsi -- common/autotest_common.sh@1124 -- # unittest_iscsi 00:15:46.333 12:18:09 unittest.unittest_iscsi -- unit/unittest.sh@68 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/iscsi/conn.c/conn_ut 00:15:46.333 00:15:46.333 00:15:46.333 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.333 http://cunit.sourceforge.net/ 00:15:46.333 00:15:46.333 00:15:46.333 Suite: conn_suite 00:15:46.333 Test: read_task_split_in_order_case ...passed 00:15:46.333 Test: read_task_split_reverse_order_case ...passed 00:15:46.333 Test: propagate_scsi_error_status_for_split_read_tasks ...passed 00:15:46.333 Test: process_non_read_task_completion_test ...passed 00:15:46.333 Test: free_tasks_on_connection ...passed 00:15:46.333 Test: free_tasks_with_queued_datain ...passed 00:15:46.333 Test: abort_queued_datain_task_test ...passed 00:15:46.333 Test: abort_queued_datain_tasks_test ...passed 00:15:46.333 00:15:46.333 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.333 suites 1 1 n/a 0 0 00:15:46.333 tests 8 8 8 0 0 00:15:46.333 asserts 230 230 230 0 n/a 00:15:46.333 00:15:46.333 Elapsed time = 0.000 seconds 00:15:46.333 12:18:09 unittest.unittest_iscsi -- unit/unittest.sh@69 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/iscsi/param.c/param_ut 00:15:46.333 00:15:46.333 00:15:46.333 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.333 http://cunit.sourceforge.net/ 00:15:46.333 00:15:46.333 00:15:46.333 Suite: iscsi_suite 00:15:46.333 Test: param_negotiation_test ...passed 00:15:46.333 Test: list_negotiation_test ...passed 00:15:46.333 Test: parse_valid_test ...passed 00:15:46.333 Test: parse_invalid_test ...[2024-06-07 12:18:09.849943] /home/vagrant/spdk_repo/spdk/lib/iscsi/param.c: 201:iscsi_parse_param: *ERROR*: '=' not found 00:15:46.333 [2024-06-07 12:18:09.850541] /home/vagrant/spdk_repo/spdk/lib/iscsi/param.c: 201:iscsi_parse_param: *ERROR*: '=' not found 00:15:46.333 [2024-06-07 12:18:09.850603] /home/vagrant/spdk_repo/spdk/lib/iscsi/param.c: 207:iscsi_parse_param: *ERROR*: Empty key 00:15:46.333 [2024-06-07 12:18:09.850691] /home/vagrant/spdk_repo/spdk/lib/iscsi/param.c: 247:iscsi_parse_param: *ERROR*: Overflow Val 8193 00:15:46.333 [2024-06-07 12:18:09.851264] /home/vagrant/spdk_repo/spdk/lib/iscsi/param.c: 247:iscsi_parse_param: *ERROR*: Overflow Val 256 00:15:46.333 [2024-06-07 12:18:09.851526] /home/vagrant/spdk_repo/spdk/lib/iscsi/param.c: 
214:iscsi_parse_param: *ERROR*: Key name length is bigger than 63 00:15:46.333 [2024-06-07 12:18:09.851815] /home/vagrant/spdk_repo/spdk/lib/iscsi/param.c: 228:iscsi_parse_param: *ERROR*: Duplicated Key B 00:15:46.333 passed 00:15:46.333 00:15:46.333 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.333 suites 1 1 n/a 0 0 00:15:46.333 tests 4 4 4 0 0 00:15:46.333 asserts 161 161 161 0 n/a 00:15:46.333 00:15:46.333 Elapsed time = 0.005 seconds 00:15:46.333 12:18:09 unittest.unittest_iscsi -- unit/unittest.sh@70 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/iscsi/tgt_node.c/tgt_node_ut 00:15:46.333 00:15:46.333 00:15:46.333 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.333 http://cunit.sourceforge.net/ 00:15:46.333 00:15:46.333 00:15:46.333 Suite: iscsi_target_node_suite 00:15:46.333 Test: add_lun_test_cases ...[2024-06-07 12:18:09.884914] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1252:iscsi_tgt_node_add_lun: *ERROR*: Target has active connections (count=1) 00:15:46.333 [2024-06-07 12:18:09.885241] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1258:iscsi_tgt_node_add_lun: *ERROR*: Specified LUN ID (-2) is negative 00:15:46.333 passed 00:15:46.333 Test: allow_any_allowed ...passed 00:15:46.333 Test: allow_ipv6_allowed ...passed 00:15:46.333 Test: allow_ipv6_denied ...passed 00:15:46.333 Test: allow_ipv6_invalid ...passed 00:15:46.333 Test: allow_ipv4_allowed ...passed 00:15:46.333 Test: allow_ipv4_denied ...passed[2024-06-07 12:18:09.885350] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1264:iscsi_tgt_node_add_lun: *ERROR*: SCSI device is not found 00:15:46.333 [2024-06-07 12:18:09.885401] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1264:iscsi_tgt_node_add_lun: *ERROR*: SCSI device is not found 00:15:46.333 [2024-06-07 12:18:09.885435] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1270:iscsi_tgt_node_add_lun: *ERROR*: spdk_scsi_dev_add_lun failed 00:15:46.333 00:15:46.333 Test: allow_ipv4_invalid ...passed 00:15:46.333 Test: node_access_allowed ...passed 00:15:46.333 Test: node_access_denied_by_empty_netmask ...passed 00:15:46.333 Test: node_access_multi_initiator_groups_cases ...passed 00:15:46.333 Test: allow_iscsi_name_multi_maps_case ...passed 00:15:46.333 Test: chap_param_test_cases ...[2024-06-07 12:18:09.885854] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1039:iscsi_check_chap_params: *ERROR*: Invalid combination of CHAP params (d=1,r=1,m=0) 00:15:46.333 [2024-06-07 12:18:09.885909] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1039:iscsi_check_chap_params: *ERROR*: Invalid combination of CHAP params (d=0,r=0,m=1) 00:15:46.333 [2024-06-07 12:18:09.886011] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1039:iscsi_check_chap_params: *ERROR*: Invalid combination of CHAP params (d=1,r=0,m=1) 00:15:46.333 [2024-06-07 12:18:09.886067] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1039:iscsi_check_chap_params: *ERROR*: Invalid combination of CHAP params (d=1,r=1,m=1) 00:15:46.333 passed 00:15:46.333 00:15:46.333 [2024-06-07 12:18:09.886130] /home/vagrant/spdk_repo/spdk/lib/iscsi/tgt_node.c:1030:iscsi_check_chap_params: *ERROR*: Invalid auth group ID (-1) 00:15:46.333 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.333 suites 1 1 n/a 0 0 00:15:46.333 tests 13 13 13 0 0 00:15:46.333 asserts 50 50 50 0 n/a 00:15:46.333 00:15:46.333 Elapsed time = 0.001 seconds 00:15:46.334 12:18:09 unittest.unittest_iscsi -- unit/unittest.sh@71 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/iscsi/iscsi.c/iscsi_ut 00:15:46.334 
00:15:46.334 00:15:46.334 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.334 http://cunit.sourceforge.net/ 00:15:46.334 00:15:46.334 00:15:46.334 Suite: iscsi_suite 00:15:46.334 Test: op_login_check_target_test ...passed 00:15:46.334 Test: op_login_session_normal_test ...[2024-06-07 12:18:09.910951] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1434:iscsi_op_login_check_target: *ERROR*: access denied 00:15:46.334 [2024-06-07 12:18:09.911218] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1626:iscsi_op_login_session_normal: *ERROR*: TargetName is empty 00:15:46.334 [2024-06-07 12:18:09.911264] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1626:iscsi_op_login_session_normal: *ERROR*: TargetName is empty 00:15:46.334 [2024-06-07 12:18:09.911300] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1626:iscsi_op_login_session_normal: *ERROR*: TargetName is empty 00:15:46.334 [2024-06-07 12:18:09.911344] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c: 695:append_iscsi_sess: *ERROR*: spdk_get_iscsi_sess_by_tsih failed 00:15:46.334 [2024-06-07 12:18:09.911445] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1467:iscsi_op_login_check_session: *ERROR*: isid=0, tsih=256, cid=0:spdk_append_iscsi_sess() failed 00:15:46.334 passed 00:15:46.334 Test: maxburstlength_test ...[2024-06-07 12:18:09.911554] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c: 702:append_iscsi_sess: *ERROR*: no MCS session for init port name=iqn.2017-11.spdk.io:i0001, tsih=256, cid=0 00:15:46.334 [2024-06-07 12:18:09.911606] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1467:iscsi_op_login_check_session: *ERROR*: isid=0, tsih=256, cid=0:spdk_append_iscsi_sess() failed 00:15:46.334 [2024-06-07 12:18:09.911800] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4217:iscsi_pdu_hdr_op_data: *ERROR*: the dataout pdu data length is larger than the value sent by R2T PDU 00:15:46.334 [2024-06-07 12:18:09.911853] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4554:iscsi_pdu_hdr_handle: *ERROR*: processing PDU header (opcode=5) failed on NULL(NULL) 00:15:46.334 passed 00:15:46.334 Test: underflow_for_read_transfer_test ...passed 00:15:46.334 Test: underflow_for_zero_read_transfer_test ...passed 00:15:46.334 Test: underflow_for_request_sense_test ...passed 00:15:46.334 Test: underflow_for_check_condition_test ...passed 00:15:46.334 Test: add_transfer_task_test ...passed 00:15:46.334 Test: get_transfer_task_test ...passed 00:15:46.334 Test: del_transfer_task_test ...passed 00:15:46.334 Test: clear_all_transfer_tasks_test ...passed 00:15:46.334 Test: build_iovs_test ...passed 00:15:46.334 Test: build_iovs_with_md_test ...passed 00:15:46.334 Test: pdu_hdr_op_login_test ...[2024-06-07 12:18:09.912724] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1251:iscsi_op_login_rsp_init: *ERROR*: transit error 00:15:46.334 [2024-06-07 12:18:09.912822] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1258:iscsi_op_login_rsp_init: *ERROR*: unsupported version min 1/max 0, expecting 0 00:15:46.334 [2024-06-07 12:18:09.912896] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:1272:iscsi_op_login_rsp_init: *ERROR*: Received reserved NSG code: 2 00:15:46.334 passed 00:15:46.334 Test: pdu_hdr_op_text_test ...[2024-06-07 12:18:09.912970] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:2246:iscsi_pdu_hdr_op_text: *ERROR*: data segment len(=69) > immediate data len(=68) 00:15:46.334 [2024-06-07 12:18:09.913044] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:2278:iscsi_pdu_hdr_op_text: *ERROR*: final and continue 00:15:46.334 passed 00:15:46.334 Test: 
pdu_hdr_op_logout_test ...[2024-06-07 12:18:09.913084] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:2291:iscsi_pdu_hdr_op_text: *ERROR*: The correct itt is 5679, and the current itt is 5678... 00:15:46.334 [2024-06-07 12:18:09.913159] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:2521:iscsi_pdu_hdr_op_logout: *ERROR*: Target can accept logout only with reason "close the session" on discovery session. 1 is not acceptable reason. 00:15:46.334 passed 00:15:46.334 Test: pdu_hdr_op_scsi_test ...[2024-06-07 12:18:09.913304] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3342:iscsi_pdu_hdr_op_scsi: *ERROR*: ISCSI_OP_SCSI not allowed in discovery and invalid session 00:15:46.334 [2024-06-07 12:18:09.913336] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3342:iscsi_pdu_hdr_op_scsi: *ERROR*: ISCSI_OP_SCSI not allowed in discovery and invalid session 00:15:46.334 [2024-06-07 12:18:09.913380] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3370:iscsi_pdu_hdr_op_scsi: *ERROR*: Bidirectional CDB is not supported 00:15:46.334 [2024-06-07 12:18:09.913455] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3403:iscsi_pdu_hdr_op_scsi: *ERROR*: data segment len(=69) > immediate data len(=68) 00:15:46.334 [2024-06-07 12:18:09.913528] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3410:iscsi_pdu_hdr_op_scsi: *ERROR*: data segment len(=68) > task transfer len(=67) 00:15:46.334 [2024-06-07 12:18:09.913673] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3434:iscsi_pdu_hdr_op_scsi: *ERROR*: Reject scsi cmd with EDTL > 0 but (R | W) == 0 00:15:46.334 passed 00:15:46.334 Test: pdu_hdr_op_task_mgmt_test ...[2024-06-07 12:18:09.913765] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3611:iscsi_pdu_hdr_op_task: *ERROR*: ISCSI_OP_TASK not allowed in discovery and invalid session 00:15:46.334 passed 00:15:46.334 Test: pdu_hdr_op_nopout_test ...[2024-06-07 12:18:09.913817] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3700:iscsi_pdu_hdr_op_task: *ERROR*: unsupported function 0 00:15:46.334 [2024-06-07 12:18:09.913979] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3719:iscsi_pdu_hdr_op_nopout: *ERROR*: ISCSI_OP_NOPOUT not allowed in discovery session 00:15:46.334 [2024-06-07 12:18:09.914060] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3741:iscsi_pdu_hdr_op_nopout: *ERROR*: invalid transfer tag 0x4d3 00:15:46.334 passed 00:15:46.334 Test: pdu_hdr_op_data_test ...[2024-06-07 12:18:09.914090] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3741:iscsi_pdu_hdr_op_nopout: *ERROR*: invalid transfer tag 0x4d3 00:15:46.334 [2024-06-07 12:18:09.914128] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:3749:iscsi_pdu_hdr_op_nopout: *ERROR*: got NOPOUT ITT=0xffffffff, I=0 00:15:46.334 [2024-06-07 12:18:09.914162] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4192:iscsi_pdu_hdr_op_data: *ERROR*: ISCSI_OP_SCSI_DATAOUT not allowed in discovery session 00:15:46.334 [2024-06-07 12:18:09.914209] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4209:iscsi_pdu_hdr_op_data: *ERROR*: Not found task for transfer_tag=0 00:15:46.334 [2024-06-07 12:18:09.914282] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4217:iscsi_pdu_hdr_op_data: *ERROR*: the dataout pdu data length is larger than the value sent by R2T PDU 00:15:46.334 [2024-06-07 12:18:09.914335] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4222:iscsi_pdu_hdr_op_data: *ERROR*: The r2t task tag is 0, and the dataout task tag is 1 00:15:46.334 [2024-06-07 12:18:09.914388] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4228:iscsi_pdu_hdr_op_data: *ERROR*: DataSN(1) exp=0 error 00:15:46.334 
passed 00:15:46.334 Test: empty_text_with_cbit_test ...passed 00:15:46.334 Test: pdu_payload_read_test ...[2024-06-07 12:18:09.914476] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4239:iscsi_pdu_hdr_op_data: *ERROR*: offset(4096) error 00:15:46.334 [2024-06-07 12:18:09.914503] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4249:iscsi_pdu_hdr_op_data: *ERROR*: R2T burst(65536) > MaxBurstLength(65535) 00:15:46.334 [2024-06-07 12:18:09.915314] /home/vagrant/spdk_repo/spdk/lib/iscsi/iscsi.c:4637:iscsi_pdu_payload_read: *ERROR*: Data(65537) > MaxSegment(65536) 00:15:46.334 passed 00:15:46.334 Test: data_out_pdu_sequence_test ...passed 00:15:46.334 Test: immediate_data_and_data_out_pdu_sequence_test ...passed 00:15:46.334 00:15:46.334 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.334 suites 1 1 n/a 0 0 00:15:46.334 tests 24 24 24 0 0 00:15:46.334 asserts 150253 150253 150253 0 n/a 00:15:46.334 00:15:46.334 Elapsed time = 0.008 seconds 00:15:46.334 12:18:09 unittest.unittest_iscsi -- unit/unittest.sh@72 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/iscsi/init_grp.c/init_grp_ut 00:15:46.334 00:15:46.334 00:15:46.334 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.334 http://cunit.sourceforge.net/ 00:15:46.334 00:15:46.334 00:15:46.334 Suite: init_grp_suite 00:15:46.334 Test: create_initiator_group_success_case ...passed 00:15:46.334 Test: find_initiator_group_success_case ...passed 00:15:46.334 Test: register_initiator_group_twice_case ...passed 00:15:46.334 Test: add_initiator_name_success_case ...passed 00:15:46.334 Test: add_initiator_name_fail_case ...[2024-06-07 12:18:09.948013] /home/vagrant/spdk_repo/spdk/lib/iscsi/init_grp.c: 54:iscsi_init_grp_add_initiator: *ERROR*: > MAX_INITIATOR(=256) is not allowed 00:15:46.334 passed 00:15:46.334 Test: delete_all_initiator_names_success_case ...passed 00:15:46.334 Test: add_netmask_success_case ...passed 00:15:46.334 Test: add_netmask_fail_case ...passed 00:15:46.334 Test: delete_all_netmasks_success_case ...[2024-06-07 12:18:09.948452] /home/vagrant/spdk_repo/spdk/lib/iscsi/init_grp.c: 188:iscsi_init_grp_add_netmask: *ERROR*: > MAX_NETMASK(=256) is not allowed 00:15:46.334 passed 00:15:46.334 Test: initiator_name_overwrite_all_to_any_case ...passed 00:15:46.334 Test: netmask_overwrite_all_to_any_case ...passed 00:15:46.334 Test: add_delete_initiator_names_case ...passed 00:15:46.334 Test: add_duplicated_initiator_names_case ...passed 00:15:46.334 Test: delete_nonexisting_initiator_names_case ...passed 00:15:46.334 Test: add_delete_netmasks_case ...passed 00:15:46.334 Test: add_duplicated_netmasks_case ...passed 00:15:46.334 Test: delete_nonexisting_netmasks_case ...passed 00:15:46.334 00:15:46.334 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.334 suites 1 1 n/a 0 0 00:15:46.334 tests 17 17 17 0 0 00:15:46.334 asserts 108 108 108 0 n/a 00:15:46.334 00:15:46.334 Elapsed time = 0.001 seconds 00:15:46.334 12:18:09 unittest.unittest_iscsi -- unit/unittest.sh@73 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/iscsi/portal_grp.c/portal_grp_ut 00:15:46.593 00:15:46.593 00:15:46.593 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.593 http://cunit.sourceforge.net/ 00:15:46.593 00:15:46.593 00:15:46.593 Suite: portal_grp_suite 00:15:46.593 Test: portal_create_ipv4_normal_case ...passed 00:15:46.593 Test: portal_create_ipv6_normal_case ...passed 00:15:46.593 Test: portal_create_ipv4_wildcard_case ...passed 00:15:46.593 Test: portal_create_ipv6_wildcard_case ...passed 00:15:46.593 Test: 
portal_create_twice_case ...passed 00:15:46.593 Test: portal_grp_register_unregister_case ...passed 00:15:46.593 Test: portal_grp_register_twice_case ...[2024-06-07 12:18:09.979960] /home/vagrant/spdk_repo/spdk/lib/iscsi/portal_grp.c: 113:iscsi_portal_create: *ERROR*: portal (192.168.2.0, 3260) already exists 00:15:46.593 passed 00:15:46.593 Test: portal_grp_add_delete_case ...passed 00:15:46.593 Test: portal_grp_add_delete_twice_case ...passed 00:15:46.593 00:15:46.593 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.593 suites 1 1 n/a 0 0 00:15:46.593 tests 9 9 9 0 0 00:15:46.593 asserts 44 44 44 0 n/a 00:15:46.593 00:15:46.593 Elapsed time = 0.002 seconds 00:15:46.593 00:15:46.593 real 0m0.195s 00:15:46.593 user 0m0.108s 00:15:46.593 sys 0m0.086s 00:15:46.593 12:18:09 unittest.unittest_iscsi -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.593 12:18:09 unittest.unittest_iscsi -- common/autotest_common.sh@10 -- # set +x 00:15:46.593 ************************************ 00:15:46.593 END TEST unittest_iscsi 00:15:46.593 ************************************ 00:15:46.593 12:18:10 unittest -- unit/unittest.sh@245 -- # run_test unittest_json unittest_json 00:15:46.593 12:18:10 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.593 12:18:10 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.593 12:18:10 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.593 ************************************ 00:15:46.593 START TEST unittest_json 00:15:46.593 ************************************ 00:15:46.593 12:18:10 unittest.unittest_json -- common/autotest_common.sh@1124 -- # unittest_json 00:15:46.593 12:18:10 unittest.unittest_json -- unit/unittest.sh@77 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/json/json_parse.c/json_parse_ut 00:15:46.593 00:15:46.593 00:15:46.593 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.593 http://cunit.sourceforge.net/ 00:15:46.593 00:15:46.593 00:15:46.593 Suite: json 00:15:46.593 Test: test_parse_literal ...passed 00:15:46.593 Test: test_parse_string_simple ...passed 00:15:46.593 Test: test_parse_string_control_chars ...passed 00:15:46.593 Test: test_parse_string_utf8 ...passed 00:15:46.593 Test: test_parse_string_escapes_twochar ...passed 00:15:46.593 Test: test_parse_string_escapes_unicode ...passed 00:15:46.593 Test: test_parse_number ...passed 00:15:46.593 Test: test_parse_array ...passed 00:15:46.593 Test: test_parse_object ...passed 00:15:46.593 Test: test_parse_nesting ...passed 00:15:46.593 Test: test_parse_comment ...passed 00:15:46.593 00:15:46.593 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.593 suites 1 1 n/a 0 0 00:15:46.593 tests 11 11 11 0 0 00:15:46.593 asserts 1516 1516 1516 0 n/a 00:15:46.593 00:15:46.593 Elapsed time = 0.001 seconds 00:15:46.593 12:18:10 unittest.unittest_json -- unit/unittest.sh@78 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/json/json_util.c/json_util_ut 00:15:46.593 00:15:46.593 00:15:46.593 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.593 http://cunit.sourceforge.net/ 00:15:46.593 00:15:46.593 00:15:46.593 Suite: json 00:15:46.593 Test: test_strequal ...passed 00:15:46.593 Test: test_num_to_uint16 ...passed 00:15:46.593 Test: test_num_to_int32 ...passed 00:15:46.593 Test: test_num_to_uint64 ...passed 00:15:46.593 Test: test_decode_object ...passed 00:15:46.593 Test: test_decode_array ...passed 00:15:46.593 Test: test_decode_bool ...passed 00:15:46.593 Test: test_decode_uint16 ...passed 00:15:46.593 
Test: test_decode_int32 ...passed 00:15:46.593 Test: test_decode_uint32 ...passed 00:15:46.593 Test: test_decode_uint64 ...passed 00:15:46.593 Test: test_decode_string ...passed 00:15:46.593 Test: test_decode_uuid ...passed 00:15:46.593 Test: test_find ...passed 00:15:46.593 Test: test_find_array ...passed 00:15:46.593 Test: test_iterating ...passed 00:15:46.593 Test: test_free_object ...passed 00:15:46.593 00:15:46.593 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.593 suites 1 1 n/a 0 0 00:15:46.593 tests 17 17 17 0 0 00:15:46.593 asserts 236 236 236 0 n/a 00:15:46.593 00:15:46.593 Elapsed time = 0.001 seconds 00:15:46.593 12:18:10 unittest.unittest_json -- unit/unittest.sh@79 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/json/json_write.c/json_write_ut 00:15:46.593 00:15:46.593 00:15:46.593 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.593 http://cunit.sourceforge.net/ 00:15:46.593 00:15:46.593 00:15:46.593 Suite: json 00:15:46.593 Test: test_write_literal ...passed 00:15:46.593 Test: test_write_string_simple ...passed 00:15:46.593 Test: test_write_string_escapes ...passed 00:15:46.593 Test: test_write_string_utf16le ...passed 00:15:46.593 Test: test_write_number_int32 ...passed 00:15:46.593 Test: test_write_number_uint32 ...passed 00:15:46.593 Test: test_write_number_uint128 ...passed 00:15:46.593 Test: test_write_string_number_uint128 ...passed 00:15:46.593 Test: test_write_number_int64 ...passed 00:15:46.593 Test: test_write_number_uint64 ...passed 00:15:46.593 Test: test_write_number_double ...passed 00:15:46.593 Test: test_write_uuid ...passed 00:15:46.593 Test: test_write_array ...passed 00:15:46.593 Test: test_write_object ...passed 00:15:46.593 Test: test_write_nesting ...passed 00:15:46.593 Test: test_write_val ...passed 00:15:46.593 00:15:46.593 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.593 suites 1 1 n/a 0 0 00:15:46.593 tests 16 16 16 0 0 00:15:46.593 asserts 918 918 918 0 n/a 00:15:46.593 00:15:46.593 Elapsed time = 0.003 seconds 00:15:46.593 12:18:10 unittest.unittest_json -- unit/unittest.sh@80 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/jsonrpc/jsonrpc_server.c/jsonrpc_server_ut 00:15:46.593 00:15:46.593 00:15:46.593 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.593 http://cunit.sourceforge.net/ 00:15:46.593 00:15:46.593 00:15:46.593 Suite: jsonrpc 00:15:46.593 Test: test_parse_request ...passed 00:15:46.593 Test: test_parse_request_streaming ...passed 00:15:46.593 00:15:46.593 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.593 suites 1 1 n/a 0 0 00:15:46.593 tests 2 2 2 0 0 00:15:46.594 asserts 289 289 289 0 n/a 00:15:46.594 00:15:46.594 Elapsed time = 0.004 seconds 00:15:46.594 00:15:46.594 real 0m0.139s 00:15:46.594 user 0m0.072s 00:15:46.594 sys 0m0.064s 00:15:46.594 12:18:10 unittest.unittest_json -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.594 12:18:10 unittest.unittest_json -- common/autotest_common.sh@10 -- # set +x 00:15:46.594 ************************************ 00:15:46.594 END TEST unittest_json 00:15:46.594 ************************************ 00:15:46.853 12:18:10 unittest -- unit/unittest.sh@246 -- # run_test unittest_rpc unittest_rpc 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.853 ************************************ 00:15:46.853 START TEST 
unittest_rpc 00:15:46.853 ************************************ 00:15:46.853 12:18:10 unittest.unittest_rpc -- common/autotest_common.sh@1124 -- # unittest_rpc 00:15:46.853 12:18:10 unittest.unittest_rpc -- unit/unittest.sh@84 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/rpc/rpc.c/rpc_ut 00:15:46.853 00:15:46.853 00:15:46.853 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.853 http://cunit.sourceforge.net/ 00:15:46.853 00:15:46.853 00:15:46.853 Suite: rpc 00:15:46.853 Test: test_jsonrpc_handler ...passed 00:15:46.853 Test: test_spdk_rpc_is_method_allowed ...passed 00:15:46.853 Test: test_rpc_get_methods ...[2024-06-07 12:18:10.278966] /home/vagrant/spdk_repo/spdk/lib/rpc/rpc.c: 446:rpc_get_methods: *ERROR*: spdk_json_decode_object failed 00:15:46.853 passed 00:15:46.853 Test: test_rpc_spdk_get_version ...passed 00:15:46.853 Test: test_spdk_rpc_listen_close ...passed 00:15:46.853 Test: test_rpc_run_multiple_servers ...passed 00:15:46.853 00:15:46.853 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.853 suites 1 1 n/a 0 0 00:15:46.853 tests 6 6 6 0 0 00:15:46.853 asserts 23 23 23 0 n/a 00:15:46.853 00:15:46.853 Elapsed time = 0.001 seconds 00:15:46.853 00:15:46.853 real 0m0.032s 00:15:46.853 user 0m0.016s 00:15:46.853 sys 0m0.015s 00:15:46.853 12:18:10 unittest.unittest_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.853 12:18:10 unittest.unittest_rpc -- common/autotest_common.sh@10 -- # set +x 00:15:46.853 ************************************ 00:15:46.853 END TEST unittest_rpc 00:15:46.853 ************************************ 00:15:46.853 12:18:10 unittest -- unit/unittest.sh@247 -- # run_test unittest_notify /home/vagrant/spdk_repo/spdk/test/unit/lib/notify/notify.c/notify_ut 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.853 ************************************ 00:15:46.853 START TEST unittest_notify 00:15:46.853 ************************************ 00:15:46.853 12:18:10 unittest.unittest_notify -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/notify/notify.c/notify_ut 00:15:46.853 00:15:46.853 00:15:46.853 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.853 http://cunit.sourceforge.net/ 00:15:46.853 00:15:46.853 00:15:46.853 Suite: app_suite 00:15:46.853 Test: notify ...passed 00:15:46.853 00:15:46.853 Run Summary: Type Total Ran Passed Failed Inactive 00:15:46.853 suites 1 1 n/a 0 0 00:15:46.853 tests 1 1 1 0 0 00:15:46.853 asserts 13 13 13 0 n/a 00:15:46.853 00:15:46.853 Elapsed time = 0.000 seconds 00:15:46.853 00:15:46.853 real 0m0.031s 00:15:46.853 user 0m0.015s 00:15:46.853 sys 0m0.016s 00:15:46.853 12:18:10 unittest.unittest_notify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:46.853 12:18:10 unittest.unittest_notify -- common/autotest_common.sh@10 -- # set +x 00:15:46.853 ************************************ 00:15:46.853 END TEST unittest_notify 00:15:46.853 ************************************ 00:15:46.853 12:18:10 unittest -- unit/unittest.sh@248 -- # run_test unittest_nvme unittest_nvme 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:46.853 12:18:10 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:46.853 
************************************ 00:15:46.853 START TEST unittest_nvme 00:15:46.853 ************************************ 00:15:46.853 12:18:10 unittest.unittest_nvme -- common/autotest_common.sh@1124 -- # unittest_nvme 00:15:46.853 12:18:10 unittest.unittest_nvme -- unit/unittest.sh@88 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme.c/nvme_ut 00:15:46.853 00:15:46.853 00:15:46.853 CUnit - A unit testing framework for C - Version 2.1-3 00:15:46.853 http://cunit.sourceforge.net/ 00:15:46.853 00:15:46.853 00:15:46.853 Suite: nvme 00:15:46.853 Test: test_opc_data_transfer ...passed 00:15:46.853 Test: test_spdk_nvme_transport_id_parse_trtype ...passed 00:15:46.853 Test: test_spdk_nvme_transport_id_parse_adrfam ...passed 00:15:46.853 Test: test_trid_parse_and_compare ...[2024-06-07 12:18:10.470936] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1176:parse_next_key: *ERROR*: Key without ':' or '=' separator 00:15:46.853 [2024-06-07 12:18:10.471275] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1233:spdk_nvme_transport_id_parse: *ERROR*: Failed to parse transport ID 00:15:46.853 [2024-06-07 12:18:10.471407] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1188:parse_next_key: *ERROR*: Key length 32 greater than maximum allowed 31 00:15:46.853 [2024-06-07 12:18:10.471459] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1233:spdk_nvme_transport_id_parse: *ERROR*: Failed to parse transport ID 00:15:46.853 [2024-06-07 12:18:10.471501] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1199:parse_next_key: *ERROR*: Key without value 00:15:46.853 [2024-06-07 12:18:10.471627] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1233:spdk_nvme_transport_id_parse: *ERROR*: Failed to parse transport ID 00:15:46.853 passed 00:15:46.853 Test: test_trid_trtype_str ...passed 00:15:46.853 Test: test_trid_adrfam_str ...passed 00:15:46.853 Test: test_nvme_ctrlr_probe ...[2024-06-07 12:18:10.471907] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 683:nvme_ctrlr_probe: *ERROR*: Failed to construct NVMe controller for SSD: 00:15:46.853 passed 00:15:46.853 Test: test_spdk_nvme_probe ...[2024-06-07 12:18:10.472035] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 601:nvme_driver_init: *ERROR*: primary process is not started yet 00:15:46.853 [2024-06-07 12:18:10.472081] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 898:spdk_nvme_probe: *ERROR*: Create probe context failed 00:15:46.853 [2024-06-07 12:18:10.472201] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 812:nvme_probe_internal: *ERROR*: NVMe trtype 256 (PCIE) not available 00:15:46.853 passed 00:15:46.853 Test: test_spdk_nvme_connect ...[2024-06-07 12:18:10.472261] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 898:spdk_nvme_probe: *ERROR*: Create probe context failed 00:15:46.853 [2024-06-07 12:18:10.472383] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 994:spdk_nvme_connect: *ERROR*: No transport ID specified 00:15:46.853 [2024-06-07 12:18:10.472696] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 601:nvme_driver_init: *ERROR*: primary process is not started yet 00:15:46.853 passed 00:15:46.853 Test: test_nvme_ctrlr_probe_internal ...[2024-06-07 12:18:10.472782] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1005:spdk_nvme_connect: *ERROR*: Create probe context failed 00:15:46.853 [2024-06-07 12:18:10.472924] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 683:nvme_ctrlr_probe: *ERROR*: Failed to construct NVMe controller for SSD: 00:15:46.853 [2024-06-07 12:18:10.472981] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 821:nvme_probe_internal: *ERROR*: NVMe ctrlr scan failed 
00:15:46.853 passed 00:15:46.853 Test: test_nvme_init_controllers ...[2024-06-07 12:18:10.473081] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 708:nvme_ctrlr_poll_internal: *ERROR*: Failed to initialize SSD: 00:15:46.853 passed 00:15:46.853 Test: test_nvme_driver_init ...[2024-06-07 12:18:10.473204] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 578:nvme_driver_init: *ERROR*: primary process failed to reserve memory 00:15:46.853 [2024-06-07 12:18:10.473279] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 601:nvme_driver_init: *ERROR*: primary process is not started yet 00:15:47.113 [2024-06-07 12:18:10.582007] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 596:nvme_driver_init: *ERROR*: timeout waiting for primary process to init 00:15:47.113 passed 00:15:47.113 Test: test_spdk_nvme_detach ...passed 00:15:47.113 Test: test_nvme_completion_poll_cb ...[2024-06-07 12:18:10.582199] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c: 618:nvme_driver_init: *ERROR*: failed to initialize mutex 00:15:47.113 passed 00:15:47.113 Test: test_nvme_user_copy_cmd_complete ...passed 00:15:47.113 Test: test_nvme_allocate_request_null ...passed 00:15:47.113 Test: test_nvme_allocate_request ...passed 00:15:47.113 Test: test_nvme_free_request ...passed 00:15:47.113 Test: test_nvme_allocate_request_user_copy ...passed 00:15:47.113 Test: test_nvme_robust_mutex_init_shared ...passed 00:15:47.113 Test: test_nvme_request_check_timeout ...passed 00:15:47.113 Test: test_nvme_wait_for_completion ...passed 00:15:47.113 Test: test_spdk_nvme_parse_func ...passed 00:15:47.113 Test: test_spdk_nvme_detach_async ...passed 00:15:47.113 Test: test_nvme_parse_addr ...[2024-06-07 12:18:10.583106] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme.c:1586:nvme_parse_addr: *ERROR*: addr and service must both be non-NULL 00:15:47.113 passed 00:15:47.113 00:15:47.113 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.113 suites 1 1 n/a 0 0 00:15:47.113 tests 25 25 25 0 0 00:15:47.113 asserts 326 326 326 0 n/a 00:15:47.113 00:15:47.113 Elapsed time = 0.005 seconds 00:15:47.113 12:18:10 unittest.unittest_nvme -- unit/unittest.sh@89 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_ctrlr.c/nvme_ctrlr_ut 00:15:47.113 00:15:47.113 00:15:47.113 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.113 http://cunit.sourceforge.net/ 00:15:47.113 00:15:47.113 00:15:47.113 Suite: nvme_ctrlr 00:15:47.113 Test: test_nvme_ctrlr_init_en_1_rdy_0 ...[2024-06-07 12:18:10.618429] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 passed 00:15:47.113 Test: test_nvme_ctrlr_init_en_1_rdy_1 ...[2024-06-07 12:18:10.620102] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 passed 00:15:47.113 Test: test_nvme_ctrlr_init_en_0_rdy_0 ...[2024-06-07 12:18:10.621462] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 passed 00:15:47.113 Test: test_nvme_ctrlr_init_en_0_rdy_1 ...[2024-06-07 12:18:10.622679] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 passed 00:15:47.113 Test: test_nvme_ctrlr_init_en_0_rdy_0_ams_rr ...[2024-06-07 12:18:10.623906] 
/home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 [2024-06-07 12:18:10.625052] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:3947:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr enable failed with error: -22[2024-06-07 12:18:10.626265] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:3947:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr enable failed with error: -22[2024-06-07 12:18:10.627423] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:3947:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr enable failed with error: -22passed 00:15:47.113 Test: test_nvme_ctrlr_init_en_0_rdy_0_ams_wrr ...[2024-06-07 12:18:10.629741] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 [2024-06-07 12:18:10.631970] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:3947:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr enable failed with error: -22[2024-06-07 12:18:10.633149] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:3947:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr enable failed with error: -22passed 00:15:47.113 Test: test_nvme_ctrlr_init_en_0_rdy_0_ams_vs ...[2024-06-07 12:18:10.635541] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 [2024-06-07 12:18:10.636719] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:3947:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr enable failed with error: -22[2024-06-07 12:18:10.639029] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:3947:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr enable failed with error: -22passed 00:15:47.113 Test: test_nvme_ctrlr_init_delay ...[2024-06-07 12:18:10.641439] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 passed 00:15:47.113 Test: test_alloc_io_qpair_rr_1 ...[2024-06-07 12:18:10.642737] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 [2024-06-07 12:18:10.643002] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:5330:spdk_nvme_ctrlr_alloc_qid: *ERROR*: [] No free I/O queue IDs 00:15:47.113 [2024-06-07 12:18:10.643306] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c: 399:nvme_ctrlr_create_io_qpair: *ERROR*: [] invalid queue priority for default round robin arbitration method 00:15:47.113 [2024-06-07 12:18:10.643419] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c: 399:nvme_ctrlr_create_io_qpair: *ERROR*: [] invalid queue priority for default round robin arbitration method 00:15:47.113 [2024-06-07 12:18:10.643497] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c: 399:nvme_ctrlr_create_io_qpair: *ERROR*: [] invalid queue priority for default round robin arbitration method 00:15:47.113 passed 00:15:47.113 Test: test_ctrlr_get_default_ctrlr_opts ...passed 00:15:47.113 Test: test_ctrlr_get_default_io_qpair_opts ...passed 00:15:47.113 Test: test_alloc_io_qpair_wrr_1 ...[2024-06-07 12:18:10.643721] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 passed 00:15:47.113 Test: 
test_alloc_io_qpair_wrr_2 ...[2024-06-07 12:18:10.644012] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.113 [2024-06-07 12:18:10.644196] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:5330:spdk_nvme_ctrlr_alloc_qid: *ERROR*: [] No free I/O queue IDs 00:15:47.113 passed 00:15:47.113 Test: test_spdk_nvme_ctrlr_update_firmware ...[2024-06-07 12:18:10.644585] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4858:spdk_nvme_ctrlr_update_firmware: *ERROR*: [] spdk_nvme_ctrlr_update_firmware invalid size! 00:15:47.113 [2024-06-07 12:18:10.644827] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4895:spdk_nvme_ctrlr_update_firmware: *ERROR*: [] spdk_nvme_ctrlr_fw_image_download failed! 00:15:47.113 [2024-06-07 12:18:10.644993] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4935:spdk_nvme_ctrlr_update_firmware: *ERROR*: [] nvme_ctrlr_cmd_fw_commit failed! 00:15:47.113 [2024-06-07 12:18:10.645118] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4895:spdk_nvme_ctrlr_update_firmware: *ERROR*: [] spdk_nvme_ctrlr_fw_image_download failed! 00:15:47.113 passed 00:15:47.113 Test: test_nvme_ctrlr_fail ...[2024-06-07 12:18:10.645270] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:1042:nvme_ctrlr_fail: *ERROR*: [] in failed state. 00:15:47.113 passed 00:15:47.113 Test: test_nvme_ctrlr_construct_intel_support_log_page_list ...passed 00:15:47.113 Test: test_nvme_ctrlr_set_supported_features ...passed 00:15:47.113 Test: test_spdk_nvme_ctrlr_doorbell_buffer_config ...passed 00:15:47.113 Test: test_nvme_ctrlr_test_active_ns ...[2024-06-07 12:18:10.645651] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_test_active_ns_error_case ...passed 00:15:47.372 Test: test_spdk_nvme_ctrlr_reconnect_io_qpair ...passed 00:15:47.372 Test: test_spdk_nvme_ctrlr_set_trid ...passed 00:15:47.372 Test: test_nvme_ctrlr_init_set_nvmf_ioccsz ...[2024-06-07 12:18:10.828902] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_init_set_num_queues ...[2024-06-07 12:18:10.835884] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_init_set_keep_alive_timeout ...[2024-06-07 12:18:10.837087] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 [2024-06-07 12:18:10.837165] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:2883:nvme_ctrlr_set_keep_alive_timeout_done: *ERROR*: [] Keep alive timeout Get Feature failed: SC 6 SCT 0 00:15:47.372 passed 00:15:47.372 Test: test_alloc_io_qpair_fail ...[2024-06-07 12:18:10.838265] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 [2024-06-07 12:18:10.838380] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c: 511:spdk_nvme_ctrlr_alloc_io_qpair: *ERROR*: [] nvme_transport_ctrlr_connect_io_qpair() failed 
00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_add_remove_process ...passed 00:15:47.372 Test: test_nvme_ctrlr_set_arbitration_feature ...passed 00:15:47.372 Test: test_nvme_ctrlr_set_state ...passed 00:15:47.372 Test: test_nvme_ctrlr_active_ns_list_v0 ...[2024-06-07 12:18:10.838524] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:1478:_nvme_ctrlr_set_state: *ERROR*: [] Specified timeout would cause integer overflow. Defaulting to no timeout. 00:15:47.372 [2024-06-07 12:18:10.838583] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_active_ns_list_v2 ...[2024-06-07 12:18:10.860405] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_ns_mgmt ...[2024-06-07 12:18:10.905009] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_reset ...[2024-06-07 12:18:10.906549] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_aer_callback ...[2024-06-07 12:18:10.906899] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_ns_attr_changed ...[2024-06-07 12:18:10.908319] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_identify_namespaces_iocs_specific_next ...passed 00:15:47.372 Test: test_nvme_ctrlr_set_supported_log_pages ...passed 00:15:47.372 Test: test_nvme_ctrlr_set_intel_supported_log_pages ...[2024-06-07 12:18:10.910124] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_parse_ana_log_page ...passed 00:15:47.372 Test: test_nvme_ctrlr_ana_resize ...[2024-06-07 12:18:10.911527] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_get_memory_domains ...passed 00:15:47.372 Test: test_nvme_transport_ctrlr_ready ...[2024-06-07 12:18:10.913076] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4029:nvme_ctrlr_process_init: *ERROR*: [] Transport controller ready step failed: rc -1 00:15:47.372 [2024-06-07 12:18:10.913139] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4080:nvme_ctrlr_process_init: *ERROR*: [] Ctrlr operation failed with error: -1, ctrlr state: 51 (error) 00:15:47.372 passed 00:15:47.372 Test: test_nvme_ctrlr_disable ...[2024-06-07 12:18:10.913201] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr.c:4148:nvme_ctrlr_construct: *ERROR*: [] admin_queue_size 0 is less than minimum defined by NVMe spec, use min value 00:15:47.372 passed 00:15:47.372 
00:15:47.372 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.372 suites 1 1 n/a 0 0 00:15:47.372 tests 43 43 43 0 0 00:15:47.372 asserts 10418 10418 10418 0 n/a 00:15:47.372 00:15:47.372 Elapsed time = 0.254 seconds 00:15:47.372 12:18:10 unittest.unittest_nvme -- unit/unittest.sh@90 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_ctrlr_cmd.c/nvme_ctrlr_cmd_ut 00:15:47.372 00:15:47.372 00:15:47.372 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.372 http://cunit.sourceforge.net/ 00:15:47.372 00:15:47.372 00:15:47.372 Suite: nvme_ctrlr_cmd 00:15:47.372 Test: test_get_log_pages ...passed 00:15:47.372 Test: test_set_feature_cmd ...passed 00:15:47.372 Test: test_set_feature_ns_cmd ...passed 00:15:47.372 Test: test_get_feature_cmd ...passed 00:15:47.372 Test: test_get_feature_ns_cmd ...passed 00:15:47.373 Test: test_abort_cmd ...passed 00:15:47.373 Test: test_set_host_id_cmds ...passed 00:15:47.373 Test: test_io_cmd_raw_no_payload_build ...passed 00:15:47.373 Test: test_io_raw_cmd ...passed 00:15:47.373 Test: test_io_raw_cmd_with_md ...passed 00:15:47.373 Test: test_namespace_attach ...passed 00:15:47.373 Test: test_namespace_detach ...passed[2024-06-07 12:18:10.969151] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ctrlr_cmd.c: 508:nvme_ctrlr_cmd_set_host_id: *ERROR*: Invalid host ID size 1024 00:15:47.373 00:15:47.373 Test: test_namespace_create ...passed 00:15:47.373 Test: test_namespace_delete ...passed 00:15:47.373 Test: test_doorbell_buffer_config ...passed 00:15:47.373 Test: test_format_nvme ...passed 00:15:47.373 Test: test_fw_commit ...passed 00:15:47.373 Test: test_fw_image_download ...passed 00:15:47.373 Test: test_sanitize ...passed 00:15:47.373 Test: test_directive ...passed 00:15:47.373 Test: test_nvme_request_add_abort ...passed 00:15:47.373 Test: test_spdk_nvme_ctrlr_cmd_abort ...passed 00:15:47.373 Test: test_nvme_ctrlr_cmd_identify ...passed 00:15:47.373 Test: test_spdk_nvme_ctrlr_cmd_security_receive_send ...passed 00:15:47.373 00:15:47.373 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.373 suites 1 1 n/a 0 0 00:15:47.373 tests 24 24 24 0 0 00:15:47.373 asserts 198 198 198 0 n/a 00:15:47.373 00:15:47.373 Elapsed time = 0.001 seconds 00:15:47.373 12:18:10 unittest.unittest_nvme -- unit/unittest.sh@91 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_ctrlr_ocssd_cmd.c/nvme_ctrlr_ocssd_cmd_ut 00:15:47.373 00:15:47.373 00:15:47.373 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.373 http://cunit.sourceforge.net/ 00:15:47.373 00:15:47.373 00:15:47.373 Suite: nvme_ctrlr_cmd 00:15:47.373 Test: test_geometry_cmd ...passed 00:15:47.373 Test: test_spdk_nvme_ctrlr_is_ocssd_supported ...passed 00:15:47.373 00:15:47.373 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.373 suites 1 1 n/a 0 0 00:15:47.373 tests 2 2 2 0 0 00:15:47.373 asserts 7 7 7 0 n/a 00:15:47.373 00:15:47.373 Elapsed time = 0.000 seconds 00:15:47.373 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@92 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_ns.c/nvme_ns_ut 00:15:47.632 00:15:47.632 00:15:47.632 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.632 http://cunit.sourceforge.net/ 00:15:47.632 00:15:47.632 00:15:47.632 Suite: nvme 00:15:47.632 Test: test_nvme_ns_construct ...passed 00:15:47.632 Test: test_nvme_ns_uuid ...passed 00:15:47.632 Test: test_nvme_ns_csi ...passed 00:15:47.632 Test: test_nvme_ns_data ...passed 00:15:47.632 Test: test_nvme_ns_set_identify_data ...passed 00:15:47.632 Test: 
test_spdk_nvme_ns_get_values ...passed 00:15:47.632 Test: test_spdk_nvme_ns_is_active ...passed 00:15:47.632 Test: spdk_nvme_ns_supports ...passed 00:15:47.632 Test: test_nvme_ns_has_supported_iocs_specific_data ...passed 00:15:47.632 Test: test_nvme_ctrlr_identify_ns_iocs_specific ...passed 00:15:47.632 Test: test_nvme_ctrlr_identify_id_desc ...passed 00:15:47.632 Test: test_nvme_ns_find_id_desc ...passed 00:15:47.632 00:15:47.632 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.632 suites 1 1 n/a 0 0 00:15:47.632 tests 12 12 12 0 0 00:15:47.632 asserts 83 83 83 0 n/a 00:15:47.632 00:15:47.632 Elapsed time = 0.000 seconds 00:15:47.632 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@93 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_ns_cmd.c/nvme_ns_cmd_ut 00:15:47.632 00:15:47.632 00:15:47.632 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.632 http://cunit.sourceforge.net/ 00:15:47.632 00:15:47.632 00:15:47.632 Suite: nvme_ns_cmd 00:15:47.632 Test: split_test ...passed 00:15:47.632 Test: split_test2 ...passed 00:15:47.632 Test: split_test3 ...passed 00:15:47.632 Test: split_test4 ...passed 00:15:47.632 Test: test_nvme_ns_cmd_flush ...passed 00:15:47.632 Test: test_nvme_ns_cmd_dataset_management ...passed 00:15:47.632 Test: test_nvme_ns_cmd_copy ...passed 00:15:47.632 Test: test_io_flags ...[2024-06-07 12:18:11.059143] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ns_cmd.c: 144:_is_io_flags_valid: *ERROR*: Invalid io_flags 0xfffc 00:15:47.632 passed 00:15:47.632 Test: test_nvme_ns_cmd_write_zeroes ...passed 00:15:47.632 Test: test_nvme_ns_cmd_write_uncorrectable ...passed 00:15:47.632 Test: test_nvme_ns_cmd_reservation_register ...passed 00:15:47.632 Test: test_nvme_ns_cmd_reservation_release ...passed 00:15:47.632 Test: test_nvme_ns_cmd_reservation_acquire ...passed 00:15:47.632 Test: test_nvme_ns_cmd_reservation_report ...passed 00:15:47.632 Test: test_cmd_child_request ...passed 00:15:47.632 Test: test_nvme_ns_cmd_readv ...passed 00:15:47.632 Test: test_nvme_ns_cmd_read_with_md ...passed 00:15:47.632 Test: test_nvme_ns_cmd_writev ...[2024-06-07 12:18:11.061609] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ns_cmd.c: 291:_nvme_ns_cmd_split_request_prp: *ERROR*: child_length 200 not even multiple of lba_size 512 00:15:47.632 passed 00:15:47.632 Test: test_nvme_ns_cmd_write_with_md ...passed 00:15:47.632 Test: test_nvme_ns_cmd_zone_append_with_md ...passed 00:15:47.632 Test: test_nvme_ns_cmd_zone_appendv_with_md ...passed 00:15:47.632 Test: test_nvme_ns_cmd_comparev ...passed 00:15:47.632 Test: test_nvme_ns_cmd_compare_and_write ...passed 00:15:47.632 Test: test_nvme_ns_cmd_compare_with_md ...passed 00:15:47.632 Test: test_nvme_ns_cmd_comparev_with_md ...passed 00:15:47.632 Test: test_nvme_ns_cmd_setup_request ...passed 00:15:47.632 Test: test_spdk_nvme_ns_cmd_readv_with_md ...passed 00:15:47.632 Test: test_spdk_nvme_ns_cmd_writev_ext ...[2024-06-07 12:18:11.064755] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ns_cmd.c: 144:_is_io_flags_valid: *ERROR*: Invalid io_flags 0xffff000f 00:15:47.632 passed 00:15:47.632 Test: test_spdk_nvme_ns_cmd_readv_ext ...[2024-06-07 12:18:11.065044] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_ns_cmd.c: 144:_is_io_flags_valid: *ERROR*: Invalid io_flags 0xffff000f 00:15:47.632 passed 00:15:47.632 Test: test_nvme_ns_cmd_verify ...passed 00:15:47.632 Test: test_nvme_ns_cmd_io_mgmt_send ...passed 00:15:47.632 Test: test_nvme_ns_cmd_io_mgmt_recv ...passed 00:15:47.632 00:15:47.632 Run Summary: Type Total Ran Passed Failed Inactive 
00:15:47.632 suites 1 1 n/a 0 0 00:15:47.632 tests 32 32 32 0 0 00:15:47.632 asserts 550 550 550 0 n/a 00:15:47.632 00:15:47.632 Elapsed time = 0.005 seconds 00:15:47.632 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@94 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_ns_ocssd_cmd.c/nvme_ns_ocssd_cmd_ut 00:15:47.632 00:15:47.632 00:15:47.632 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.633 http://cunit.sourceforge.net/ 00:15:47.633 00:15:47.633 00:15:47.633 Suite: nvme_ns_cmd 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_reset ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_reset_single_entry ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_read_with_md ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_read_with_md_single_entry ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_read ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_read_single_entry ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_write_with_md ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_write_with_md_single_entry ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_write ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_write_single_entry ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_copy ...passed 00:15:47.633 Test: test_nvme_ocssd_ns_cmd_vector_copy_single_entry ...passed 00:15:47.633 00:15:47.633 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.633 suites 1 1 n/a 0 0 00:15:47.633 tests 12 12 12 0 0 00:15:47.633 asserts 123 123 123 0 n/a 00:15:47.633 00:15:47.633 Elapsed time = 0.001 seconds 00:15:47.633 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@95 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_qpair.c/nvme_qpair_ut 00:15:47.633 00:15:47.633 00:15:47.633 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.633 http://cunit.sourceforge.net/ 00:15:47.633 00:15:47.633 00:15:47.633 Suite: nvme_qpair 00:15:47.633 Test: test3 ...passed 00:15:47.633 Test: test_ctrlr_failed ...passed 00:15:47.633 Test: struct_packing ...passed 00:15:47.633 Test: test_nvme_qpair_process_completions ...[2024-06-07 12:18:11.131911] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_qpair.c: 579:nvme_qpair_abort_queued_reqs: *ERROR*: aborting queued i/o 00:15:47.633 [2024-06-07 12:18:11.132411] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_qpair.c: 579:nvme_qpair_abort_queued_reqs: *ERROR*: aborting queued i/o 00:15:47.633 [2024-06-07 12:18:11.132502] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:15:47.633 [2024-06-07 12:18:11.132828] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 1 00:15:47.633 passed 00:15:47.633 Test: test_nvme_completion_is_retry ...passed 00:15:47.633 Test: test_get_status_string ...passed 00:15:47.633 Test: test_nvme_qpair_add_cmd_error_injection ...passed 00:15:47.633 Test: test_nvme_qpair_submit_request ...passed 00:15:47.633 Test: test_nvme_qpair_resubmit_request_with_transport_failed ...passed 00:15:47.633 Test: test_nvme_qpair_manual_complete_request ...passed 00:15:47.633 Test: test_nvme_qpair_init_deinit ...[2024-06-07 12:18:11.133982] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_qpair.c: 579:nvme_qpair_abort_queued_reqs: *ERROR*: aborting queued i/o 00:15:47.633 passed 00:15:47.633 Test: test_nvme_get_sgl_print_info 
...passed 00:15:47.633 00:15:47.633 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.633 suites 1 1 n/a 0 0 00:15:47.633 tests 12 12 12 0 0 00:15:47.633 asserts 154 154 154 0 n/a 00:15:47.633 00:15:47.633 Elapsed time = 0.002 seconds 00:15:47.633 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@96 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_pcie.c/nvme_pcie_ut 00:15:47.633 00:15:47.633 00:15:47.633 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.633 http://cunit.sourceforge.net/ 00:15:47.633 00:15:47.633 00:15:47.633 Suite: nvme_pcie 00:15:47.633 Test: test_prp_list_append ...[2024-06-07 12:18:11.171244] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1205:nvme_pcie_prp_list_append: *ERROR*: virt_addr 0x100001 not dword aligned 00:15:47.633 [2024-06-07 12:18:11.171574] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1234:nvme_pcie_prp_list_append: *ERROR*: PRP 2 not page aligned (0x900800) 00:15:47.633 [2024-06-07 12:18:11.171650] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1224:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x100000) failed 00:15:47.633 [2024-06-07 12:18:11.171980] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1218:nvme_pcie_prp_list_append: *ERROR*: out of PRP entries 00:15:47.633 passed 00:15:47.633 Test: test_nvme_pcie_hotplug_monitor ...[2024-06-07 12:18:11.172109] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1218:nvme_pcie_prp_list_append: *ERROR*: out of PRP entries 00:15:47.633 passed 00:15:47.633 Test: test_shadow_doorbell_update ...passed 00:15:47.633 Test: test_build_contig_hw_sgl_request ...passed 00:15:47.633 Test: test_nvme_pcie_qpair_build_metadata ...passed 00:15:47.633 Test: test_nvme_pcie_qpair_build_prps_sgl_request ...passed 00:15:47.633 Test: test_nvme_pcie_qpair_build_hw_sgl_request ...passed 00:15:47.633 Test: test_nvme_pcie_qpair_build_contig_request ...passed 00:15:47.633 Test: test_nvme_pcie_ctrlr_regs_get_set ...passed 00:15:47.633 Test: test_nvme_pcie_ctrlr_map_unmap_cmb ...[2024-06-07 12:18:11.172392] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1205:nvme_pcie_prp_list_append: *ERROR*: virt_addr 0x100001 not dword aligned 00:15:47.633 passed 00:15:47.633 Test: test_nvme_pcie_ctrlr_map_io_cmb ...passed 00:15:47.633 Test: test_nvme_pcie_ctrlr_map_unmap_pmr ...passed[2024-06-07 12:18:11.172534] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie.c: 442:nvme_pcie_ctrlr_map_io_cmb: *ERROR*: CMB is already in use for submission queues. 
00:15:47.633 [2024-06-07 12:18:11.172655] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie.c: 521:nvme_pcie_ctrlr_map_pmr: *ERROR*: invalid base indicator register value 00:15:47.633 00:15:47.633 Test: test_nvme_pcie_ctrlr_config_pmr ...[2024-06-07 12:18:11.172748] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie.c: 647:nvme_pcie_ctrlr_config_pmr: *ERROR*: PMR is already disabled 00:15:47.633 passed 00:15:47.633 Test: test_nvme_pcie_ctrlr_map_io_pmr ...passed 00:15:47.633 00:15:47.633 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.633 suites 1 1 n/a 0 0 00:15:47.633 tests 14 14 14 0 0 00:15:47.633 asserts 235 235 235 0 n/a 00:15:47.633 00:15:47.633 Elapsed time = 0.002 seconds 00:15:47.633 [2024-06-07 12:18:11.172842] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie.c: 699:nvme_pcie_ctrlr_map_io_pmr: *ERROR*: PMR is not supported by the controller 00:15:47.633 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@97 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_poll_group.c/nvme_poll_group_ut 00:15:47.633 00:15:47.633 00:15:47.633 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.633 http://cunit.sourceforge.net/ 00:15:47.633 00:15:47.633 00:15:47.633 Suite: nvme_ns_cmd 00:15:47.633 Test: nvme_poll_group_create_test ...passed 00:15:47.633 Test: nvme_poll_group_add_remove_test ...passed 00:15:47.633 Test: nvme_poll_group_process_completions ...passed 00:15:47.633 Test: nvme_poll_group_destroy_test ...passed 00:15:47.633 Test: nvme_poll_group_get_free_stats ...passed 00:15:47.633 00:15:47.633 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.633 suites 1 1 n/a 0 0 00:15:47.633 tests 5 5 5 0 0 00:15:47.633 asserts 75 75 75 0 n/a 00:15:47.633 00:15:47.633 Elapsed time = 0.000 seconds 00:15:47.633 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@98 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_quirks.c/nvme_quirks_ut 00:15:47.633 00:15:47.633 00:15:47.633 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.633 http://cunit.sourceforge.net/ 00:15:47.633 00:15:47.633 00:15:47.633 Suite: nvme_quirks 00:15:47.633 Test: test_nvme_quirks_striping ...passed 00:15:47.633 00:15:47.633 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.633 suites 1 1 n/a 0 0 00:15:47.633 tests 1 1 1 0 0 00:15:47.633 asserts 5 5 5 0 n/a 00:15:47.633 00:15:47.633 Elapsed time = 0.000 seconds 00:15:47.633 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@99 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_tcp.c/nvme_tcp_ut 00:15:47.633 00:15:47.633 00:15:47.633 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.633 http://cunit.sourceforge.net/ 00:15:47.633 00:15:47.633 00:15:47.633 Suite: nvme_tcp 00:15:47.633 Test: test_nvme_tcp_pdu_set_data_buf ...passed 00:15:47.633 Test: test_nvme_tcp_build_iovs ...passed 00:15:47.633 Test: test_nvme_tcp_build_sgl_request ...[2024-06-07 12:18:11.261318] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 825:nvme_tcp_build_sgl_request: *ERROR*: Failed to construct tcp_req=0x7ffc98d963d0, and the iovcnt=16, remaining_size=28672 00:15:47.633 passed 00:15:47.633 Test: test_nvme_tcp_pdu_set_data_buf_with_md ...passed 00:15:47.633 Test: test_nvme_tcp_build_iovs_with_md ...passed 00:15:47.633 Test: test_nvme_tcp_req_complete_safe ...passed 00:15:47.633 Test: test_nvme_tcp_req_get ...passed 00:15:47.633 Test: test_nvme_tcp_req_init ...passed 00:15:47.633 Test: test_nvme_tcp_qpair_capsule_cmd_send ...passed 00:15:47.633 Test: test_nvme_tcp_qpair_write_pdu ...passed 00:15:47.633 Test: 
test_nvme_tcp_qpair_set_recv_state ...[2024-06-07 12:18:11.261997] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d980e0 is same with the state(6) to be set 00:15:47.633 passed 00:15:47.633 Test: test_nvme_tcp_alloc_reqs ...passed 00:15:47.633 Test: test_nvme_tcp_qpair_send_h2c_term_req ...[2024-06-07 12:18:11.262475] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d972a0 is same with the state(5) to be set 00:15:47.633 passed 00:15:47.633 Test: test_nvme_tcp_pdu_ch_handle ...[2024-06-07 12:18:11.262566] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1167:nvme_tcp_pdu_ch_handle: *ERROR*: Already received IC_RESP PDU, and we should reject this pdu=0x7ffc98d97e30 00:15:47.633 [2024-06-07 12:18:11.262636] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1226:nvme_tcp_pdu_ch_handle: *ERROR*: Expected PDU header length 128, got 0 00:15:47.633 [2024-06-07 12:18:11.262810] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.633 [2024-06-07 12:18:11.262903] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1177:nvme_tcp_pdu_ch_handle: *ERROR*: The TCP/IP tqpair connection is not negotiated 00:15:47.634 [2024-06-07 12:18:11.263038] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.263117] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1218:nvme_tcp_pdu_ch_handle: *ERROR*: Unexpected PDU type 0x00 00:15:47.634 [2024-06-07 12:18:11.263185] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.263282] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.263350] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.263450] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.263512] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.263586] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d97760 is same with the state(5) to be set 00:15:47.634 passed 00:15:47.634 Test: test_nvme_tcp_qpair_connect_sock ...[2024-06-07 12:18:11.263763] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2324:nvme_tcp_qpair_connect_sock: *ERROR*: Unhandled ADRFAM 3 00:15:47.634 [2024-06-07 12:18:11.263830] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2336:nvme_tcp_qpair_connect_sock: *ERROR*: dst_addr nvme_parse_addr() failed 00:15:47.634 [2024-06-07 12:18:11.264195] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2336:nvme_tcp_qpair_connect_sock: *ERROR*: dst_addr 
nvme_parse_addr() failed 00:15:47.634 passed 00:15:47.634 Test: test_nvme_tcp_qpair_icreq_send ...passed 00:15:47.634 Test: test_nvme_tcp_c2h_payload_handle ...[2024-06-07 12:18:11.264365] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1341:nvme_tcp_c2h_term_req_dump: *ERROR*: Error info of pdu(0x7ffc98d97970): PDU Sequence Error 00:15:47.634 passed 00:15:47.634 Test: test_nvme_tcp_icresp_handle ...[2024-06-07 12:18:11.264451] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1567:nvme_tcp_icresp_handle: *ERROR*: Expected ICResp PFV 0, got 1 00:15:47.634 [2024-06-07 12:18:11.264516] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1574:nvme_tcp_icresp_handle: *ERROR*: Expected ICResp maxh2cdata >=4096, got 2048 00:15:47.634 [2024-06-07 12:18:11.264581] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d972a0 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.264656] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1583:nvme_tcp_icresp_handle: *ERROR*: Expected ICResp cpda <=31, got 64 00:15:47.634 [2024-06-07 12:18:11.264727] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d972a0 is same with the state(5) to be set 00:15:47.634 passed 00:15:47.634 Test: test_nvme_tcp_pdu_payload_handle ...[2024-06-07 12:18:11.264849] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d972a0 is same with the state(0) to be set 00:15:47.634 [2024-06-07 12:18:11.264958] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1341:nvme_tcp_c2h_term_req_dump: *ERROR*: Error info of pdu(0x7ffc98d97e30): PDU Sequence Error 00:15:47.634 passed 00:15:47.634 Test: test_nvme_tcp_capsule_resp_hdr_handle ...[2024-06-07 12:18:11.265067] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1644:nvme_tcp_capsule_resp_hdr_handle: *ERROR*: no tcp_req is found with cid=1 for tqpair=0x7ffc98d96570 00:15:47.634 passed 00:15:47.634 Test: test_nvme_tcp_ctrlr_connect_qpair ...passed 00:15:47.634 Test: test_nvme_tcp_ctrlr_disconnect_qpair ...[2024-06-07 12:18:11.265280] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 354:nvme_tcp_ctrlr_disconnect_qpair: *ERROR*: tqpair=0x7ffc98d95bf0, errno=0, rc=0 00:15:47.634 [2024-06-07 12:18:11.265380] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d95bf0 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.265480] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 323:nvme_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc98d95bf0 is same with the state(5) to be set 00:15:47.634 [2024-06-07 12:18:11.265580] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2176:nvme_tcp_qpair_process_completions: *ERROR*: Failed to flush tqpair=0x7ffc98d95bf0 (0): Success 00:15:47.634 passed 00:15:47.634 Test: test_nvme_tcp_ctrlr_create_io_qpair ...[2024-06-07 12:18:11.265660] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2176:nvme_tcp_qpair_process_completions: *ERROR*: Failed to flush tqpair=0x7ffc98d95bf0 (0): Success 00:15:47.893 [2024-06-07 12:18:11.346402] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2507:nvme_tcp_ctrlr_create_qpair: *ERROR*: Failed to create qpair with size 0. Minimum queue size is 2. 
00:15:47.893 [2024-06-07 12:18:11.346566] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2507:nvme_tcp_ctrlr_create_qpair: *ERROR*: Failed to create qpair with size 1. Minimum queue size is 2. 00:15:47.893 passed 00:15:47.893 Test: test_nvme_tcp_ctrlr_delete_io_qpair ...passed 00:15:47.893 Test: test_nvme_tcp_poll_group_get_stats ...[2024-06-07 12:18:11.346821] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2955:nvme_tcp_poll_group_get_stats: *ERROR*: Invalid stats or group pointer 00:15:47.893 [2024-06-07 12:18:11.346867] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2955:nvme_tcp_poll_group_get_stats: *ERROR*: Invalid stats or group pointer 00:15:47.893 passed 00:15:47.893 Test: test_nvme_tcp_ctrlr_construct ...[2024-06-07 12:18:11.347194] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2507:nvme_tcp_ctrlr_create_qpair: *ERROR*: Failed to create qpair with size 1. Minimum queue size is 2. 00:15:47.893 [2024-06-07 12:18:11.347296] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2702:nvme_tcp_ctrlr_construct: *ERROR*: failed to create admin qpair 00:15:47.893 [2024-06-07 12:18:11.347491] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2324:nvme_tcp_qpair_connect_sock: *ERROR*: Unhandled ADRFAM 254 00:15:47.893 [2024-06-07 12:18:11.347586] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2702:nvme_tcp_ctrlr_construct: *ERROR*: failed to create admin qpair 00:15:47.893 [2024-06-07 12:18:11.347736] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2374:nvme_tcp_qpair_connect_sock: *ERROR*: sock connection error of tqpair=0x615000007d80 with addr=192.168.1.78, port=23 00:15:47.893 passed 00:15:47.893 Test: test_nvme_tcp_qpair_submit_request ...[2024-06-07 12:18:11.347850] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:2702:nvme_tcp_ctrlr_construct: *ERROR*: failed to create admin qpair 00:15:47.893 [2024-06-07 12:18:11.348054] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c: 825:nvme_tcp_build_sgl_request: *ERROR*: Failed to construct tcp_req=0x613000000c80, and the iovcnt=1, remaining_size=1024 00:15:47.893 [2024-06-07 12:18:11.348148] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_tcp.c:1018:nvme_tcp_qpair_submit_request: *ERROR*: nvme_tcp_req_init() failed 00:15:47.893 passed 00:15:47.893 00:15:47.893 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.893 suites 1 1 n/a 0 0 00:15:47.893 tests 27 27 27 0 0 00:15:47.893 asserts 624 624 624 0 n/a 00:15:47.893 00:15:47.893 Elapsed time = 0.087 seconds 00:15:47.893 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@100 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_transport.c/nvme_transport_ut 00:15:47.893 00:15:47.893 00:15:47.893 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.893 http://cunit.sourceforge.net/ 00:15:47.893 00:15:47.893 00:15:47.893 Suite: nvme_transport 00:15:47.893 Test: test_nvme_get_transport ...passed 00:15:47.893 Test: test_nvme_transport_poll_group_connect_qpair ...passed 00:15:47.893 Test: test_nvme_transport_poll_group_disconnect_qpair ...passed 00:15:47.893 Test: test_nvme_transport_poll_group_add_remove ...passed 00:15:47.893 Test: test_ctrlr_get_memory_domains ...passed 00:15:47.893 00:15:47.893 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.893 suites 1 1 n/a 0 0 00:15:47.894 tests 5 5 5 0 0 00:15:47.894 asserts 28 28 28 0 n/a 00:15:47.894 00:15:47.894 Elapsed time = 0.000 seconds 00:15:47.894 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@101 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_io_msg.c/nvme_io_msg_ut 00:15:47.894 00:15:47.894 
00:15:47.894 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.894 http://cunit.sourceforge.net/ 00:15:47.894 00:15:47.894 00:15:47.894 Suite: nvme_io_msg 00:15:47.894 Test: test_nvme_io_msg_send ...passed 00:15:47.894 Test: test_nvme_io_msg_process ...passed 00:15:47.894 Test: test_nvme_io_msg_ctrlr_register_unregister ...passed 00:15:47.894 00:15:47.894 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.894 suites 1 1 n/a 0 0 00:15:47.894 tests 3 3 3 0 0 00:15:47.894 asserts 56 56 56 0 n/a 00:15:47.894 00:15:47.894 Elapsed time = 0.000 seconds 00:15:47.894 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@102 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_pcie_common.c/nvme_pcie_common_ut 00:15:47.894 00:15:47.894 00:15:47.894 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.894 http://cunit.sourceforge.net/ 00:15:47.894 00:15:47.894 00:15:47.894 Suite: nvme_pcie_common 00:15:47.894 Test: test_nvme_pcie_ctrlr_alloc_cmb ...[2024-06-07 12:18:11.443296] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c: 87:nvme_pcie_ctrlr_alloc_cmb: *ERROR*: Tried to allocate past valid CMB range! 00:15:47.894 passed 00:15:47.894 Test: test_nvme_pcie_qpair_construct_destroy ...passed 00:15:47.894 Test: test_nvme_pcie_ctrlr_cmd_create_delete_io_queue ...passed 00:15:47.894 Test: test_nvme_pcie_ctrlr_connect_qpair ...[2024-06-07 12:18:11.449605] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c: 504:nvme_completion_create_cq_cb: *ERROR*: nvme_create_io_cq failed! 00:15:47.894 [2024-06-07 12:18:11.449784] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c: 457:nvme_completion_create_sq_cb: *ERROR*: nvme_create_io_sq failed, deleting cq! 00:15:47.894 passed 00:15:47.894 Test: test_nvme_pcie_ctrlr_construct_admin_qpair ...[2024-06-07 12:18:11.449842] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c: 551:_nvme_pcie_ctrlr_create_io_qpair: *ERROR*: Failed to send request to create_io_cq 00:15:47.894 passed 00:15:47.894 Test: test_nvme_pcie_poll_group_get_stats ...[2024-06-07 12:18:11.450484] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1797:nvme_pcie_poll_group_get_stats: *ERROR*: Invalid stats or group pointer 00:15:47.894 [2024-06-07 12:18:11.450555] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_pcie_common.c:1797:nvme_pcie_poll_group_get_stats: *ERROR*: Invalid stats or group pointer 00:15:47.894 passed 00:15:47.894 00:15:47.894 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.894 suites 1 1 n/a 0 0 00:15:47.894 tests 6 6 6 0 0 00:15:47.894 asserts 148 148 148 0 n/a 00:15:47.894 00:15:47.894 Elapsed time = 0.002 seconds 00:15:47.894 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@103 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_fabric.c/nvme_fabric_ut 00:15:47.894 00:15:47.894 00:15:47.894 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.894 http://cunit.sourceforge.net/ 00:15:47.894 00:15:47.894 00:15:47.894 Suite: nvme_fabric 00:15:47.894 Test: test_nvme_fabric_prop_set_cmd ...passed 00:15:47.894 Test: test_nvme_fabric_prop_get_cmd ...passed 00:15:47.894 Test: test_nvme_fabric_get_discovery_log_page ...passed 00:15:47.894 Test: test_nvme_fabric_discover_probe ...passed 00:15:47.894 Test: test_nvme_fabric_qpair_connect ...[2024-06-07 12:18:11.488543] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_fabric.c: 600:_nvme_fabric_qpair_connect_poll: *ERROR*: Connect command failed, rc -125, trtype:(null) adrfam:(null) traddr: trsvcid: subnqn:nqn.2016-06.io.spdk:subsystem1 00:15:47.894 passed 
00:15:47.894 00:15:47.894 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.894 suites 1 1 n/a 0 0 00:15:47.894 tests 5 5 5 0 0 00:15:47.894 asserts 60 60 60 0 n/a 00:15:47.894 00:15:47.894 Elapsed time = 0.001 seconds 00:15:47.894 12:18:11 unittest.unittest_nvme -- unit/unittest.sh@104 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_opal.c/nvme_opal_ut 00:15:47.894 00:15:47.894 00:15:47.894 CUnit - A unit testing framework for C - Version 2.1-3 00:15:47.894 http://cunit.sourceforge.net/ 00:15:47.894 00:15:47.894 00:15:47.894 Suite: nvme_opal 00:15:47.894 Test: test_opal_nvme_security_recv_send_done ...passed 00:15:47.894 Test: test_opal_add_short_atom_header ...[2024-06-07 12:18:11.520401] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_opal.c: 171:opal_add_token_bytestring: *ERROR*: Error adding bytestring: end of buffer. 00:15:47.894 passed 00:15:47.894 00:15:47.894 Run Summary: Type Total Ran Passed Failed Inactive 00:15:47.894 suites 1 1 n/a 0 0 00:15:47.894 tests 2 2 2 0 0 00:15:47.894 asserts 22 22 22 0 n/a 00:15:47.894 00:15:47.894 Elapsed time = 0.000 seconds 00:15:47.894 00:15:47.894 real 0m1.082s 00:15:47.894 user 0m0.464s 00:15:47.894 sys 0m0.453s 00:15:47.894 12:18:11 unittest.unittest_nvme -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:47.894 12:18:11 unittest.unittest_nvme -- common/autotest_common.sh@10 -- # set +x 00:15:47.894 ************************************ 00:15:47.894 END TEST unittest_nvme 00:15:47.894 ************************************ 00:15:48.153 12:18:11 unittest -- unit/unittest.sh@249 -- # run_test unittest_log /home/vagrant/spdk_repo/spdk/test/unit/lib/log/log.c/log_ut 00:15:48.153 12:18:11 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:48.153 12:18:11 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:48.153 12:18:11 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:48.153 ************************************ 00:15:48.153 START TEST unittest_log 00:15:48.153 ************************************ 00:15:48.153 12:18:11 unittest.unittest_log -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/log/log.c/log_ut 00:15:48.153 00:15:48.153 00:15:48.153 CUnit - A unit testing framework for C - Version 2.1-3 00:15:48.153 http://cunit.sourceforge.net/ 00:15:48.153 00:15:48.153 00:15:48.153 Suite: log 00:15:48.153 Test: log_test ...[2024-06-07 12:18:11.613488] log_ut.c: 56:log_test: *WARNING*: log warning unit test 00:15:48.153 [2024-06-07 12:18:11.613741] log_ut.c: 57:log_test: *DEBUG*: log test 00:15:48.153 log dump test: 00:15:48.153 00000000 6c 6f 67 20 64 75 6d 70 log dump 00:15:48.153 spdk dump test: 00:15:48.153 00000000 73 70 64 6b 20 64 75 6d 70 spdk dump 00:15:48.153 spdk dump test: 00:15:48.153 00000000 73 70 64 6b 20 64 75 6d 70 20 31 36 20 6d 6f 72 spdk dump 16 mor 00:15:48.153 00000010 65 20 63 68 61 72 73 e chars 00:15:48.153 passed 00:15:49.089 Test: deprecation ...passed 00:15:49.089 00:15:49.089 Run Summary: Type Total Ran Passed Failed Inactive 00:15:49.089 suites 1 1 n/a 0 0 00:15:49.089 tests 2 2 2 0 0 00:15:49.089 asserts 73 73 73 0 n/a 00:15:49.089 00:15:49.089 Elapsed time = 0.001 seconds 00:15:49.089 00:15:49.089 real 0m1.034s 00:15:49.089 user 0m0.017s 00:15:49.089 sys 0m0.017s 00:15:49.089 12:18:12 unittest.unittest_log -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:49.089 12:18:12 unittest.unittest_log -- common/autotest_common.sh@10 -- # set +x 00:15:49.089 ************************************ 00:15:49.089 END TEST 
unittest_log 00:15:49.089 ************************************ 00:15:49.089 12:18:12 unittest -- unit/unittest.sh@250 -- # run_test unittest_lvol /home/vagrant/spdk_repo/spdk/test/unit/lib/lvol/lvol.c/lvol_ut 00:15:49.089 12:18:12 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:49.089 12:18:12 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:49.089 12:18:12 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:49.089 ************************************ 00:15:49.089 START TEST unittest_lvol 00:15:49.089 ************************************ 00:15:49.089 12:18:12 unittest.unittest_lvol -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/lvol/lvol.c/lvol_ut 00:15:49.089 00:15:49.089 00:15:49.089 CUnit - A unit testing framework for C - Version 2.1-3 00:15:49.089 http://cunit.sourceforge.net/ 00:15:49.089 00:15:49.089 00:15:49.089 Suite: lvol 00:15:49.089 Test: lvs_init_unload_success ...[2024-06-07 12:18:12.716726] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 892:spdk_lvs_unload: *ERROR*: Lvols still open on lvol store 00:15:49.089 passed 00:15:49.089 Test: lvs_init_destroy_success ...[2024-06-07 12:18:12.717254] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 962:spdk_lvs_destroy: *ERROR*: Lvols still open on lvol store 00:15:49.089 passed 00:15:49.089 Test: lvs_init_opts_success ...passed 00:15:49.089 Test: lvs_unload_lvs_is_null_fail ...[2024-06-07 12:18:12.717509] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 882:spdk_lvs_unload: *ERROR*: Lvol store is NULL 00:15:49.089 passed 00:15:49.089 Test: lvs_names ...[2024-06-07 12:18:12.717579] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 726:spdk_lvs_init: *ERROR*: No name specified. 00:15:49.089 [2024-06-07 12:18:12.717653] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 720:spdk_lvs_init: *ERROR*: Name has no null terminator. 
00:15:49.089 [2024-06-07 12:18:12.717825] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 736:spdk_lvs_init: *ERROR*: lvolstore with name x already exists 00:15:49.089 passed 00:15:49.089 Test: lvol_create_destroy_success ...passed 00:15:49.089 Test: lvol_create_fail ...[2024-06-07 12:18:12.718417] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 689:spdk_lvs_init: *ERROR*: Blobstore device does not exist 00:15:49.089 [2024-06-07 12:18:12.718542] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1190:spdk_lvol_create: *ERROR*: lvol store does not exist 00:15:49.089 passed 00:15:49.089 Test: lvol_destroy_fail ...[2024-06-07 12:18:12.718874] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1026:lvol_delete_blob_cb: *ERROR*: Could not remove blob on lvol gracefully - forced removal 00:15:49.089 passed 00:15:49.089 Test: lvol_close ...[2024-06-07 12:18:12.719104] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1614:spdk_lvol_close: *ERROR*: lvol does not exist 00:15:49.089 [2024-06-07 12:18:12.719156] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 995:lvol_close_blob_cb: *ERROR*: Could not close blob on lvol 00:15:49.089 passed 00:15:49.089 Test: lvol_resize ...passed 00:15:49.089 Test: lvol_set_read_only ...passed 00:15:49.089 Test: test_lvs_load ...[2024-06-07 12:18:12.719970] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 631:lvs_opts_copy: *ERROR*: opts_size should not be zero value 00:15:49.089 [2024-06-07 12:18:12.720030] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 441:lvs_load: *ERROR*: Invalid options 00:15:49.089 passed 00:15:49.089 Test: lvols_load ...[2024-06-07 12:18:12.720292] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 227:load_next_lvol: *ERROR*: Failed to fetch blobs list 00:15:49.089 [2024-06-07 12:18:12.720425] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 227:load_next_lvol: *ERROR*: Failed to fetch blobs list 00:15:49.089 passed 00:15:49.089 Test: lvol_open ...passed 00:15:49.089 Test: lvol_snapshot ...passed 00:15:49.089 Test: lvol_snapshot_fail ...[2024-06-07 12:18:12.721174] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name snap already exists 00:15:49.089 passed 00:15:49.089 Test: lvol_clone ...passed 00:15:49.089 Test: lvol_clone_fail ...[2024-06-07 12:18:12.721762] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name clone already exists 00:15:49.089 passed 00:15:49.089 Test: lvol_iter_clones ...passed 00:15:49.089 Test: lvol_refcnt ...[2024-06-07 12:18:12.722300] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1572:spdk_lvol_destroy: *ERROR*: Cannot destroy lvol 570f1cc3-e61a-42d8-99ef-e0b185ab1d6f because it is still open 00:15:49.089 passed 00:15:49.089 Test: lvol_names ...[2024-06-07 12:18:12.722535] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1156:lvs_verify_lvol_name: *ERROR*: Name has no null terminator. 
00:15:49.089 [2024-06-07 12:18:12.722657] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name lvol already exists 00:15:49.089 [2024-06-07 12:18:12.722875] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1169:lvs_verify_lvol_name: *ERROR*: lvol with name tmp_name is being already created 00:15:49.089 passed 00:15:49.089 Test: lvol_create_thin_provisioned ...passed 00:15:49.089 Test: lvol_rename ...[2024-06-07 12:18:12.723342] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name lvol already exists 00:15:49.089 [2024-06-07 12:18:12.723454] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1524:spdk_lvol_rename: *ERROR*: Lvol lvol_new already exists in lvol store lvs 00:15:49.089 passed 00:15:49.089 Test: lvs_rename ...[2024-06-07 12:18:12.723742] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c: 769:lvs_rename_cb: *ERROR*: Lvol store rename operation failed 00:15:49.089 passed 00:15:49.089 Test: lvol_inflate ...[2024-06-07 12:18:12.723945] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1658:lvol_inflate_cb: *ERROR*: Could not inflate lvol 00:15:49.089 passed 00:15:49.089 Test: lvol_decouple_parent ...[2024-06-07 12:18:12.724205] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1658:lvol_inflate_cb: *ERROR*: Could not inflate lvol 00:15:49.089 passed 00:15:49.089 Test: lvol_get_xattr ...passed 00:15:49.089 Test: lvol_esnap_reload ...passed 00:15:49.089 Test: lvol_esnap_create_bad_args ...[2024-06-07 12:18:12.724660] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1245:spdk_lvol_create_esnap_clone: *ERROR*: lvol store does not exist 00:15:49.089 [2024-06-07 12:18:12.724720] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1156:lvs_verify_lvol_name: *ERROR*: Name has no null terminator. 00:15:49.089 [2024-06-07 12:18:12.724773] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1258:spdk_lvol_create_esnap_clone: *ERROR*: Cannot create 'lvs/clone1': size 4198400 is not an integer multiple of cluster size 1048576 00:15:49.089 [2024-06-07 12:18:12.724910] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name lvol already exists 00:15:49.089 [2024-06-07 12:18:12.725034] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name clone1 already exists 00:15:49.089 passed 00:15:49.089 Test: lvol_esnap_create_delete ...passed 00:15:49.089 Test: lvol_esnap_load_esnaps ...[2024-06-07 12:18:12.725396] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1832:lvs_esnap_bs_dev_create: *ERROR*: Blob 0x2a: no lvs context nor lvol context 00:15:49.089 passed 00:15:49.089 Test: lvol_esnap_missing ...[2024-06-07 12:18:12.725529] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name lvol1 already exists 00:15:49.089 [2024-06-07 12:18:12.725580] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:1162:lvs_verify_lvol_name: *ERROR*: lvol with name lvol1 already exists 00:15:49.089 passed 00:15:49.089 Test: lvol_esnap_hotplug ... 
00:15:49.089 lvol_esnap_hotplug scenario 0: PASS - one missing, happy path 00:15:49.089 lvol_esnap_hotplug scenario 1: PASS - one missing, cb registers degraded_set 00:15:49.089 [2024-06-07 12:18:12.726177] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2062:lvs_esnap_degraded_hotplug: *ERROR*: lvol 2d8623e0-5e63-48de-95a8-4225866230ae: failed to create esnap bs_dev: error -12 00:15:49.089 lvol_esnap_hotplug scenario 2: PASS - one missing, cb retuns -ENOMEM 00:15:49.089 lvol_esnap_hotplug scenario 3: PASS - two missing with same esnap, happy path 00:15:49.089 [2024-06-07 12:18:12.726356] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2062:lvs_esnap_degraded_hotplug: *ERROR*: lvol a3fdd5b9-df8a-4e16-a288-c654858cb8dc: failed to create esnap bs_dev: error -12 00:15:49.089 lvol_esnap_hotplug scenario 4: PASS - two missing with same esnap, first -ENOMEM 00:15:49.089 lvol_esnap_hotplug scenario 5: PASS - two missing with same esnap, second -ENOMEM 00:15:49.089 lvol_esnap_hotplug scenario 6: PASS - two missing with different esnaps, happy path 00:15:49.089 [2024-06-07 12:18:12.726461] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2062:lvs_esnap_degraded_hotplug: *ERROR*: lvol 50e613b3-d2a3-4fc6-92d2-13e9d42b1013: failed to create esnap bs_dev: error -12 00:15:49.089 lvol_esnap_hotplug scenario 7: PASS - two missing with different esnaps, first still missing 00:15:49.089 lvol_esnap_hotplug scenario 8: PASS - three missing with same esnap, happy path 00:15:49.089 lvol_esnap_hotplug scenario 9: PASS - three missing with same esnap, first still missing 00:15:49.089 lvol_esnap_hotplug scenario 10: PASS - three missing with same esnap, first two still missing 00:15:49.089 lvol_esnap_hotplug scenario 11: PASS - three missing with same esnap, middle still missing 00:15:49.089 lvol_esnap_hotplug scenario 12: PASS - three missing with same esnap, last still missing 00:15:49.089 passed 00:15:49.090 Test: lvol_get_by ...passed 00:15:49.090 Test: lvol_shallow_copy ...[2024-06-07 12:18:12.727400] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2274:spdk_lvol_shallow_copy: *ERROR*: lvol must not be NULL 00:15:49.090 [2024-06-07 12:18:12.727449] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2281:spdk_lvol_shallow_copy: *ERROR*: lvol 60d7af5b-848d-4cc9-b0c5-01cfb7f8b055 shallow copy, ext_dev must not be NULL 00:15:49.090 passed 00:15:49.090 Test: lvol_set_parent ...[2024-06-07 12:18:12.727690] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2338:spdk_lvol_set_parent: *ERROR*: lvol must not be NULL 00:15:49.090 [2024-06-07 12:18:12.727731] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2344:spdk_lvol_set_parent: *ERROR*: snapshot must not be NULL 00:15:49.090 passed 00:15:49.090 Test: lvol_set_external_parent ...[2024-06-07 12:18:12.728004] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2393:spdk_lvol_set_external_parent: *ERROR*: lvol must not be NULL 00:15:49.090 [2024-06-07 12:18:12.728065] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2399:spdk_lvol_set_external_parent: *ERROR*: snapshot must not be NULL 00:15:49.090 [2024-06-07 12:18:12.728137] /home/vagrant/spdk_repo/spdk/lib/lvol/lvol.c:2406:spdk_lvol_set_external_parent: *ERROR*: lvol lvol and esnap have the same UUID 00:15:49.090 passed 00:15:49.090 00:15:49.090 Run Summary: Type Total Ran Passed Failed Inactive 00:15:49.090 suites 1 1 n/a 0 0 00:15:49.090 tests 37 37 37 0 0 00:15:49.090 asserts 1505 1505 1505 0 n/a 00:15:49.090 00:15:49.090 Elapsed time = 0.012 seconds 00:15:49.349 00:15:49.349 real 0m0.046s 00:15:49.349 user 0m0.020s 00:15:49.349 sys 0m0.026s 
00:15:49.349 12:18:12 unittest.unittest_lvol -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:49.349 12:18:12 unittest.unittest_lvol -- common/autotest_common.sh@10 -- # set +x 00:15:49.349 ************************************ 00:15:49.349 END TEST unittest_lvol 00:15:49.349 ************************************ 00:15:49.349 12:18:12 unittest -- unit/unittest.sh@251 -- # grep -q '#define SPDK_CONFIG_RDMA 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:49.349 12:18:12 unittest -- unit/unittest.sh@252 -- # run_test unittest_nvme_rdma /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_rdma.c/nvme_rdma_ut 00:15:49.349 12:18:12 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:49.349 12:18:12 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:49.349 12:18:12 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:49.349 ************************************ 00:15:49.349 START TEST unittest_nvme_rdma 00:15:49.349 ************************************ 00:15:49.349 12:18:12 unittest.unittest_nvme_rdma -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_rdma.c/nvme_rdma_ut 00:15:49.349 00:15:49.349 00:15:49.349 CUnit - A unit testing framework for C - Version 2.1-3 00:15:49.349 http://cunit.sourceforge.net/ 00:15:49.349 00:15:49.349 00:15:49.349 Suite: nvme_rdma 00:15:49.349 Test: test_nvme_rdma_build_sgl_request ...[2024-06-07 12:18:12.828353] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1459:nvme_rdma_get_memory_translation: *ERROR*: RDMA memory translation failed, rc -34 00:15:49.349 [2024-06-07 12:18:12.828682] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1632:nvme_rdma_build_sgl_request: *ERROR*: SGL length 16777216 exceeds max keyed SGL block size 16777215 00:15:49.349 [2024-06-07 12:18:12.828807] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1688:nvme_rdma_build_sgl_request: *ERROR*: Size of SGL descriptors (64) exceeds ICD (60) 00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_build_sgl_inline_request ...passed 00:15:49.349 Test: test_nvme_rdma_build_contig_request ...passed 00:15:49.349 Test: test_nvme_rdma_build_contig_inline_request ...passed 00:15:49.349 Test: test_nvme_rdma_create_reqs ...[2024-06-07 12:18:12.828912] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1569:nvme_rdma_build_contig_request: *ERROR*: SGL length 16777216 exceeds max keyed SGL block size 16777215 00:15:49.349 [2024-06-07 12:18:12.829044] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1011:nvme_rdma_create_reqs: *ERROR*: Failed to allocate rdma_reqs 00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_create_rsps ...[2024-06-07 12:18:12.829437] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 929:nvme_rdma_create_rsps: *ERROR*: Failed to allocate rsp_sgls 00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_ctrlr_create_qpair ...[2024-06-07 12:18:12.829595] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1826:nvme_rdma_ctrlr_create_qpair: *ERROR*: Failed to create qpair with size 0. Minimum queue size is 2. 00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_poller_create ...[2024-06-07 12:18:12.829668] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1826:nvme_rdma_ctrlr_create_qpair: *ERROR*: Failed to create qpair with size 1. Minimum queue size is 2. 
00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_qpair_process_cm_event ...passed 00:15:49.349 Test: test_nvme_rdma_ctrlr_construct ...[2024-06-07 12:18:12.829842] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 530:nvme_rdma_qpair_process_cm_event: *ERROR*: Unexpected Acceptor Event [255] 00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_req_put_and_get ...passed 00:15:49.349 Test: test_nvme_rdma_req_init ...passed 00:15:49.349 Test: test_nvme_rdma_validate_cm_event ...[2024-06-07 12:18:12.830097] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 621:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ADDR_RESOLVED but received RDMA_CM_EVENT_CONNECT_RESPONSE (5) from CM event channel (status = 0) 00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_qpair_init ...passed 00:15:49.349 Test: test_nvme_rdma_qpair_submit_request ...passed 00:15:49.349 Test: test_nvme_rdma_memory_domain ...[2024-06-07 12:18:12.830156] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 621:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 10) 00:15:49.349 [2024-06-07 12:18:12.830291] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 353:nvme_rdma_get_memory_domain: *ERROR*: Failed to create memory domain 00:15:49.349 passed 00:15:49.349 Test: test_rdma_ctrlr_get_memory_domains ...passed 00:15:49.349 Test: test_rdma_get_memory_translation ...[2024-06-07 12:18:12.830387] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1448:nvme_rdma_get_memory_translation: *ERROR*: DMA memory translation failed, rc -1, iov count 0 00:15:49.349 [2024-06-07 12:18:12.830471] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:1459:nvme_rdma_get_memory_translation: *ERROR*: RDMA memory translation failed, rc -1 00:15:49.349 passed 00:15:49.349 Test: test_get_rdma_qpair_from_wc ...passed 00:15:49.349 Test: test_nvme_rdma_ctrlr_get_max_sges ...passed 00:15:49.349 Test: test_nvme_rdma_poll_group_get_stats ...[2024-06-07 12:18:12.830551] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:3273:nvme_rdma_poll_group_get_stats: *ERROR*: Invalid stats or group pointer 00:15:49.349 [2024-06-07 12:18:12.830606] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:3273:nvme_rdma_poll_group_get_stats: *ERROR*: Invalid stats or group pointer 00:15:49.349 passed 00:15:49.349 Test: test_nvme_rdma_qpair_set_poller ...[2024-06-07 12:18:12.830841] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:2985:nvme_rdma_poller_create: *ERROR*: Unable to create CQ, errno 0. 00:15:49.349 [2024-06-07 12:18:12.830911] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:3031:nvme_rdma_poll_group_get_poller: *ERROR*: Failed to create a poller for device 0xfeedbeef 00:15:49.349 [2024-06-07 12:18:12.830960] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 727:nvme_rdma_qpair_set_poller: *ERROR*: Unable to find a cq for qpair 0x7ffcdcd7ece0 on poll group 0x60c000000040 00:15:49.349 [2024-06-07 12:18:12.831041] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:2985:nvme_rdma_poller_create: *ERROR*: Unable to create CQ, errno 0. 
00:15:49.349 [2024-06-07 12:18:12.831100] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c:3031:nvme_rdma_poll_group_get_poller: *ERROR*: Failed to create a poller for device (nil) 00:15:49.349 [2024-06-07 12:18:12.831149] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 727:nvme_rdma_qpair_set_poller: *ERROR*: Unable to find a cq for qpair 0x7ffcdcd7ece0 on poll group 0x60c000000040 00:15:49.349 passed 00:15:49.349 00:15:49.349 Run Summary: Type Total Ran Passed Failed Inactive 00:15:49.349 suites 1 1 n/a 0 0 00:15:49.349 tests 22 22 22 0 0 00:15:49.349 asserts 412 412 412 0 n/a 00:15:49.349 00:15:49.349 Elapsed time = 0.003 seconds 00:15:49.349 [2024-06-07 12:18:12.831247] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_rdma.c: 705:nvme_rdma_resize_cq: *ERROR*: RDMA CQ resize failed: errno 0: Success 00:15:49.349 00:15:49.349 real 0m0.037s 00:15:49.349 user 0m0.024s 00:15:49.349 sys 0m0.013s 00:15:49.349 12:18:12 unittest.unittest_nvme_rdma -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:49.349 12:18:12 unittest.unittest_nvme_rdma -- common/autotest_common.sh@10 -- # set +x 00:15:49.349 ************************************ 00:15:49.349 END TEST unittest_nvme_rdma 00:15:49.349 ************************************ 00:15:49.349 12:18:12 unittest -- unit/unittest.sh@253 -- # run_test unittest_nvmf_transport /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/transport.c/transport_ut 00:15:49.349 12:18:12 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:49.349 12:18:12 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:49.349 12:18:12 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:49.349 ************************************ 00:15:49.349 START TEST unittest_nvmf_transport 00:15:49.349 ************************************ 00:15:49.349 12:18:12 unittest.unittest_nvmf_transport -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/transport.c/transport_ut 00:15:49.349 00:15:49.349 00:15:49.349 CUnit - A unit testing framework for C - Version 2.1-3 00:15:49.349 http://cunit.sourceforge.net/ 00:15:49.349 00:15:49.349 00:15:49.349 Suite: nvmf 00:15:49.349 Test: test_spdk_nvmf_transport_create ...[2024-06-07 12:18:12.929687] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 251:nvmf_transport_create: *ERROR*: Transport type 'new_ops' unavailable. 00:15:49.349 [2024-06-07 12:18:12.930049] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 271:nvmf_transport_create: *ERROR*: io_unit_size cannot be 0 00:15:49.349 [2024-06-07 12:18:12.930126] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 275:nvmf_transport_create: *ERROR*: io_unit_size 131072 is larger than iobuf pool large buffer size 65536 00:15:49.349 [2024-06-07 12:18:12.930295] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 258:nvmf_transport_create: *ERROR*: max_io_size 4096 must be a power of 2 and be greater than or equal 8KB 00:15:49.349 passed 00:15:49.349 Test: test_nvmf_transport_poll_group_create ...passed 00:15:49.349 Test: test_spdk_nvmf_transport_opts_init ...[2024-06-07 12:18:12.930545] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 792:spdk_nvmf_transport_opts_init: *ERROR*: Transport type invalid_ops unavailable. 
00:15:49.349 [2024-06-07 12:18:12.930671] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 797:spdk_nvmf_transport_opts_init: *ERROR*: opts should not be NULL 00:15:49.349 passed 00:15:49.350 Test: test_spdk_nvmf_transport_listen_ext ...passed 00:15:49.350 00:15:49.350 [2024-06-07 12:18:12.930721] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 802:spdk_nvmf_transport_opts_init: *ERROR*: opts_size inside opts should not be zero value 00:15:49.350 Run Summary: Type Total Ran Passed Failed Inactive 00:15:49.350 suites 1 1 n/a 0 0 00:15:49.350 tests 4 4 4 0 0 00:15:49.350 asserts 49 49 49 0 n/a 00:15:49.350 00:15:49.350 Elapsed time = 0.001 seconds 00:15:49.350 00:15:49.350 real 0m0.032s 00:15:49.350 user 0m0.016s 00:15:49.350 sys 0m0.016s 00:15:49.350 12:18:12 unittest.unittest_nvmf_transport -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:49.350 12:18:12 unittest.unittest_nvmf_transport -- common/autotest_common.sh@10 -- # set +x 00:15:49.350 ************************************ 00:15:49.350 END TEST unittest_nvmf_transport 00:15:49.350 ************************************ 00:15:49.350 12:18:12 unittest -- unit/unittest.sh@254 -- # run_test unittest_rdma /home/vagrant/spdk_repo/spdk/test/unit/lib/rdma/common.c/common_ut 00:15:49.350 12:18:12 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:49.350 12:18:12 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:49.350 12:18:12 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:49.608 ************************************ 00:15:49.608 START TEST unittest_rdma 00:15:49.608 ************************************ 00:15:49.608 12:18:13 unittest.unittest_rdma -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/rdma/common.c/common_ut 00:15:49.608 00:15:49.608 00:15:49.608 CUnit - A unit testing framework for C - Version 2.1-3 00:15:49.608 http://cunit.sourceforge.net/ 00:15:49.608 00:15:49.608 00:15:49.608 Suite: rdma_common 00:15:49.609 Test: test_spdk_rdma_pd ...[2024-06-07 12:18:13.026607] /home/vagrant/spdk_repo/spdk/lib/rdma/common.c: 533:spdk_rdma_get_pd: *ERROR*: Failed to get PD 00:15:49.609 [2024-06-07 12:18:13.026988] /home/vagrant/spdk_repo/spdk/lib/rdma/common.c: 533:spdk_rdma_get_pd: *ERROR*: Failed to get PD 00:15:49.609 passed 00:15:49.609 00:15:49.609 Run Summary: Type Total Ran Passed Failed Inactive 00:15:49.609 suites 1 1 n/a 0 0 00:15:49.609 tests 1 1 1 0 0 00:15:49.609 asserts 31 31 31 0 n/a 00:15:49.609 00:15:49.609 Elapsed time = 0.001 seconds 00:15:49.609 00:15:49.609 real 0m0.034s 00:15:49.609 user 0m0.021s 00:15:49.609 sys 0m0.013s 00:15:49.609 12:18:13 unittest.unittest_rdma -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:49.609 12:18:13 unittest.unittest_rdma -- common/autotest_common.sh@10 -- # set +x 00:15:49.609 ************************************ 00:15:49.609 END TEST unittest_rdma 00:15:49.609 ************************************ 00:15:49.609 12:18:13 unittest -- unit/unittest.sh@257 -- # grep -q '#define SPDK_CONFIG_NVME_CUSE 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:49.609 12:18:13 unittest -- unit/unittest.sh@258 -- # run_test unittest_nvme_cuse /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_cuse.c/nvme_cuse_ut 00:15:49.609 12:18:13 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:49.609 12:18:13 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:49.609 12:18:13 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:49.609 
************************************ 00:15:49.609 START TEST unittest_nvme_cuse 00:15:49.609 ************************************ 00:15:49.609 12:18:13 unittest.unittest_nvme_cuse -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvme/nvme_cuse.c/nvme_cuse_ut 00:15:49.609 00:15:49.609 00:15:49.609 CUnit - A unit testing framework for C - Version 2.1-3 00:15:49.609 http://cunit.sourceforge.net/ 00:15:49.609 00:15:49.609 00:15:49.609 Suite: nvme_cuse 00:15:49.609 Test: test_cuse_nvme_submit_io_read_write ...passed 00:15:49.609 Test: test_cuse_nvme_submit_io_read_write_with_md ...passed 00:15:49.609 Test: test_cuse_nvme_submit_passthru_cmd ...passed 00:15:49.609 Test: test_cuse_nvme_submit_passthru_cmd_with_md ...passed 00:15:49.609 Test: test_nvme_cuse_get_cuse_ns_device ...passed 00:15:49.609 Test: test_cuse_nvme_submit_io ...[2024-06-07 12:18:13.130438] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_cuse.c: 667:cuse_nvme_submit_io: *ERROR*: SUBMIT_IO: opc:0 not valid 00:15:49.609 passed 00:15:49.609 Test: test_cuse_nvme_reset ...[2024-06-07 12:18:13.130735] /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_cuse.c: 352:cuse_nvme_reset: *ERROR*: Namespace reset not supported 00:15:49.609 passed 00:15:50.238 Test: test_nvme_cuse_stop ...passed 00:15:50.238 Test: test_spdk_nvme_cuse_get_ctrlr_name ...passed 00:15:50.238 00:15:50.238 Run Summary: Type Total Ran Passed Failed Inactive 00:15:50.238 suites 1 1 n/a 0 0 00:15:50.238 tests 9 9 9 0 0 00:15:50.238 asserts 118 118 118 0 n/a 00:15:50.238 00:15:50.238 Elapsed time = 0.502 seconds 00:15:50.238 00:15:50.238 real 0m0.541s 00:15:50.238 user 0m0.243s 00:15:50.238 sys 0m0.297s 00:15:50.238 12:18:13 unittest.unittest_nvme_cuse -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:50.238 12:18:13 unittest.unittest_nvme_cuse -- common/autotest_common.sh@10 -- # set +x 00:15:50.238 ************************************ 00:15:50.238 END TEST unittest_nvme_cuse 00:15:50.238 ************************************ 00:15:50.238 12:18:13 unittest -- unit/unittest.sh@261 -- # run_test unittest_nvmf unittest_nvmf 00:15:50.238 12:18:13 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:50.238 12:18:13 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:50.238 12:18:13 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:50.238 ************************************ 00:15:50.238 START TEST unittest_nvmf 00:15:50.238 ************************************ 00:15:50.238 12:18:13 unittest.unittest_nvmf -- common/autotest_common.sh@1124 -- # unittest_nvmf 00:15:50.238 12:18:13 unittest.unittest_nvmf -- unit/unittest.sh@108 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/ctrlr.c/ctrlr_ut 00:15:50.238 00:15:50.238 00:15:50.238 CUnit - A unit testing framework for C - Version 2.1-3 00:15:50.238 http://cunit.sourceforge.net/ 00:15:50.239 00:15:50.239 00:15:50.239 Suite: nvmf 00:15:50.239 Test: test_get_log_page ...passed 00:15:50.239 Test: test_process_fabrics_cmd ...[2024-06-07 12:18:13.734212] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:2614:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2 00:15:50.239 passed 00:15:50.239 Test: test_connect ...[2024-06-07 12:18:13.734621] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4683:nvmf_check_qpair_active: *ERROR*: Received command 0x0 on qid 0 before CONNECT 00:15:50.239 [2024-06-07 12:18:13.735219] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1008:nvmf_ctrlr_cmd_connect: *ERROR*: Connect command data length 0x3ff too small 00:15:50.239 
[2024-06-07 12:18:13.735400] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 871:_nvmf_ctrlr_connect: *ERROR*: Connect command unsupported RECFMT 1234 00:15:50.239 [2024-06-07 12:18:13.735448] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1047:nvmf_ctrlr_cmd_connect: *ERROR*: Connect HOSTNQN is not null terminated 00:15:50.239 [2024-06-07 12:18:13.735518] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 818:nvmf_qpair_access_allowed: *ERROR*: Subsystem 'nqn.2016-06.io.spdk:subsystem1' does not allow host 'nqn.2016-06.io.spdk:host1' 00:15:50.239 [2024-06-07 12:18:13.735665] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 882:_nvmf_ctrlr_connect: *ERROR*: Invalid SQSIZE = 0 00:15:50.239 [2024-06-07 12:18:13.735752] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 889:_nvmf_ctrlr_connect: *ERROR*: Invalid SQSIZE for admin queue 32 (min 1, max 31) 00:15:50.239 [2024-06-07 12:18:13.735797] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 895:_nvmf_ctrlr_connect: *ERROR*: Invalid SQSIZE 64 (min 1, max 63) 00:15:50.239 [2024-06-07 12:18:13.735863] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 922:_nvmf_ctrlr_connect: *ERROR*: The NVMf target only supports dynamic mode (CNTLID = 0x1234). 00:15:50.239 [2024-06-07 12:18:13.735977] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 757:_nvmf_ctrlr_add_io_qpair: *ERROR*: Unknown controller ID 0xffff 00:15:50.239 [2024-06-07 12:18:13.736094] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 672:nvmf_ctrlr_add_io_qpair: *ERROR*: I/O connect not allowed on discovery controller 00:15:50.239 [2024-06-07 12:18:13.736473] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 678:nvmf_ctrlr_add_io_qpair: *ERROR*: Got I/O connect before ctrlr was enabled 00:15:50.239 [2024-06-07 12:18:13.736598] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 684:nvmf_ctrlr_add_io_qpair: *ERROR*: Got I/O connect with invalid IOSQES 3 00:15:50.239 [2024-06-07 12:18:13.736685] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 691:nvmf_ctrlr_add_io_qpair: *ERROR*: Got I/O connect with invalid IOCQES 3 00:15:50.239 [2024-06-07 12:18:13.736784] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 715:nvmf_ctrlr_add_io_qpair: *ERROR*: Requested QID 3 but Max QID is 2 00:15:50.239 [2024-06-07 12:18:13.736888] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 294:nvmf_ctrlr_add_qpair: *ERROR*: Got I/O connect with duplicate QID 1 00:15:50.239 [2024-06-07 12:18:13.737078] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 802:_nvmf_ctrlr_add_io_qpair: *ERROR*: Inactive admin qpair (state 4, group (nil)) 00:15:50.239 passed 00:15:50.239 Test: test_get_ns_id_desc_list ...[2024-06-07 12:18:13.737161] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c: 802:_nvmf_ctrlr_add_io_qpair: *ERROR*: Inactive admin qpair (state 0, group (nil)) 00:15:50.239 passed 00:15:50.239 Test: test_identify_ns ...[2024-06-07 12:18:13.737493] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:2708:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:15:50.239 [2024-06-07 12:18:13.737729] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:2708:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 4 00:15:50.239 [2024-06-07 12:18:13.737842] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:2708:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 4294967295 00:15:50.239 passed 00:15:50.239 Test: test_identify_ns_iocs_specific ...[2024-06-07 12:18:13.738016] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:2708:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:15:50.239 [2024-06-07 
12:18:13.738299] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:2708:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:15:50.239 passed 00:15:50.239 Test: test_reservation_write_exclusive ...passed 00:15:50.239 Test: test_reservation_exclusive_access ...passed 00:15:50.239 Test: test_reservation_write_exclusive_regs_only_and_all_regs ...passed 00:15:50.239 Test: test_reservation_exclusive_access_regs_only_and_all_regs ...passed 00:15:50.239 Test: test_reservation_notification_log_page ...passed 00:15:50.239 Test: test_get_dif_ctx ...passed 00:15:50.239 Test: test_set_get_features ...[2024-06-07 12:18:13.738946] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1644:temp_threshold_opts_valid: *ERROR*: Invalid TMPSEL 9 00:15:50.239 [2024-06-07 12:18:13.739052] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1644:temp_threshold_opts_valid: *ERROR*: Invalid TMPSEL 9 00:15:50.239 [2024-06-07 12:18:13.739110] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1655:temp_threshold_opts_valid: *ERROR*: Invalid THSEL 3 00:15:50.239 passed 00:15:50.239 Test: test_identify_ctrlr ...passed 00:15:50.239 Test: test_identify_ctrlr_iocs_specific ...[2024-06-07 12:18:13.739182] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1731:nvmf_ctrlr_set_features_error_recovery: *ERROR*: Host set unsupported DULBE bit 00:15:50.239 passed 00:15:50.239 Test: test_custom_admin_cmd ...passed 00:15:50.239 Test: test_fused_compare_and_write ...[2024-06-07 12:18:13.739582] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4216:nvmf_ctrlr_process_io_fused_cmd: *ERROR*: Wrong sequence of fused operations 00:15:50.239 [2024-06-07 12:18:13.739643] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4205:nvmf_ctrlr_process_io_fused_cmd: *ERROR*: Wrong op code of fused operations 00:15:50.239 passed 00:15:50.239 Test: test_multi_async_event_reqs ...passed 00:15:50.239 Test: test_get_ana_log_page_one_ns_per_anagrp ...passed 00:15:50.239 Test: test_get_ana_log_page_multi_ns_per_anagrp ...[2024-06-07 12:18:13.739706] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4223:nvmf_ctrlr_process_io_fused_cmd: *ERROR*: Wrong op code of fused operations 00:15:50.239 passed 00:15:50.239 Test: test_multi_async_events ...passed 00:15:50.239 Test: test_rae ...passed 00:15:50.239 Test: test_nvmf_ctrlr_create_destruct ...passed 00:15:50.239 Test: test_nvmf_ctrlr_use_zcopy ...passed 00:15:50.239 Test: test_spdk_nvmf_request_zcopy_start ...[2024-06-07 12:18:13.740286] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4683:nvmf_check_qpair_active: *ERROR*: Received command 0x2 on qid 1 before CONNECT 00:15:50.239 passed 00:15:50.239 Test: test_zcopy_read ...passed 00:15:50.239 Test: test_zcopy_write ...passed 00:15:50.239 Test: test_nvmf_property_set ...passed 00:15:50.239 Test: test_nvmf_ctrlr_get_features_host_behavior_support ...[2024-06-07 12:18:13.740371] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4709:nvmf_check_qpair_active: *ERROR*: Received command 0x2 on qid 1 in state 4 00:15:50.239 [2024-06-07 12:18:13.740560] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1942:nvmf_ctrlr_get_features_host_behavior_support: *ERROR*: invalid data buffer for Host Behavior Support 00:15:50.239 passed 00:15:50.239 Test: test_nvmf_ctrlr_set_features_host_behavior_support ...[2024-06-07 12:18:13.740623] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1942:nvmf_ctrlr_get_features_host_behavior_support: *ERROR*: invalid data buffer for Host Behavior Support 00:15:50.239 [2024-06-07 12:18:13.740702] 
/home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1965:nvmf_ctrlr_set_features_host_behavior_support: *ERROR*: Host Behavior Support invalid iovcnt: 0 00:15:50.239 [2024-06-07 12:18:13.740768] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1971:nvmf_ctrlr_set_features_host_behavior_support: *ERROR*: Host Behavior Support invalid iov_len: 0 00:15:50.239 passed 00:15:50.239 Test: test_nvmf_ctrlr_ns_attachment ...[2024-06-07 12:18:13.740868] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:1983:nvmf_ctrlr_set_features_host_behavior_support: *ERROR*: Host Behavior Support invalid acre: 0x02 00:15:50.239 passed 00:15:50.239 Test: test_nvmf_check_qpair_active ...[2024-06-07 12:18:13.741018] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4683:nvmf_check_qpair_active: *ERROR*: Received command 0x2 on qid 0 before CONNECT 00:15:50.239 [2024-06-07 12:18:13.741097] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4697:nvmf_check_qpair_active: *ERROR*: Received command 0x2 on qid 0 before authentication 00:15:50.239 [2024-06-07 12:18:13.741160] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4709:nvmf_check_qpair_active: *ERROR*: Received command 0x2 on qid 0 in state 0 00:15:50.239 [2024-06-07 12:18:13.741217] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4709:nvmf_check_qpair_active: *ERROR*: Received command 0x2 on qid 0 in state 4 00:15:50.239 passed 00:15:50.239 00:15:50.239 [2024-06-07 12:18:13.741276] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr.c:4709:nvmf_check_qpair_active: *ERROR*: Received command 0x2 on qid 0 in state 5 00:15:50.239 Run Summary: Type Total Ran Passed Failed Inactive 00:15:50.239 suites 1 1 n/a 0 0 00:15:50.239 tests 32 32 32 0 0 00:15:50.239 asserts 977 977 977 0 n/a 00:15:50.239 00:15:50.239 Elapsed time = 0.007 seconds 00:15:50.239 12:18:13 unittest.unittest_nvmf -- unit/unittest.sh@109 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/ctrlr_bdev.c/ctrlr_bdev_ut 00:15:50.239 00:15:50.239 00:15:50.239 CUnit - A unit testing framework for C - Version 2.1-3 00:15:50.239 http://cunit.sourceforge.net/ 00:15:50.239 00:15:50.239 00:15:50.239 Suite: nvmf 00:15:50.239 Test: test_get_rw_params ...passed 00:15:50.239 Test: test_get_rw_ext_params ...passed 00:15:50.239 Test: test_lba_in_range ...passed 00:15:50.239 Test: test_get_dif_ctx ...passed 00:15:50.239 Test: test_nvmf_bdev_ctrlr_identify_ns ...passed 00:15:50.239 Test: test_spdk_nvmf_bdev_ctrlr_compare_and_write_cmd ...[2024-06-07 12:18:13.769577] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 447:nvmf_bdev_ctrlr_compare_and_write_cmd: *ERROR*: Fused command start lba / num blocks mismatch 00:15:50.239 [2024-06-07 12:18:13.769925] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 455:nvmf_bdev_ctrlr_compare_and_write_cmd: *ERROR*: end of media 00:15:50.239 passed 00:15:50.239 Test: test_nvmf_bdev_ctrlr_zcopy_start ...[2024-06-07 12:18:13.770063] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 462:nvmf_bdev_ctrlr_compare_and_write_cmd: *ERROR*: Write NLB 2 * block size 512 > SGL length 1023 00:15:50.239 [2024-06-07 12:18:13.770138] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 965:nvmf_bdev_ctrlr_zcopy_start: *ERROR*: end of media 00:15:50.239 [2024-06-07 12:18:13.770274] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 972:nvmf_bdev_ctrlr_zcopy_start: *ERROR*: Read NLB 2 * block size 512 > SGL length 1023 00:15:50.239 passed 00:15:50.240 Test: test_nvmf_bdev_ctrlr_cmd ...[2024-06-07 12:18:13.770405] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 401:nvmf_bdev_ctrlr_compare_cmd: *ERROR*: end of media 
00:15:50.240 [2024-06-07 12:18:13.770456] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 408:nvmf_bdev_ctrlr_compare_cmd: *ERROR*: Compare NLB 3 * block size 512 > SGL length 512 00:15:50.240 [2024-06-07 12:18:13.770539] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 500:nvmf_bdev_ctrlr_write_zeroes_cmd: *ERROR*: invalid write zeroes size, should not exceed 1Kib 00:15:50.240 passed 00:15:50.240 Test: test_nvmf_bdev_ctrlr_read_write_cmd ...passed 00:15:50.240 Test: test_nvmf_bdev_ctrlr_nvme_passthru ...passed 00:15:50.240 00:15:50.240 [2024-06-07 12:18:13.770583] /home/vagrant/spdk_repo/spdk/lib/nvmf/ctrlr_bdev.c: 507:nvmf_bdev_ctrlr_write_zeroes_cmd: *ERROR*: end of media 00:15:50.240 Run Summary: Type Total Ran Passed Failed Inactive 00:15:50.240 suites 1 1 n/a 0 0 00:15:50.240 tests 10 10 10 0 0 00:15:50.240 asserts 159 159 159 0 n/a 00:15:50.240 00:15:50.240 Elapsed time = 0.001 seconds 00:15:50.240 12:18:13 unittest.unittest_nvmf -- unit/unittest.sh@110 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/ctrlr_discovery.c/ctrlr_discovery_ut 00:15:50.240 00:15:50.240 00:15:50.240 CUnit - A unit testing framework for C - Version 2.1-3 00:15:50.240 http://cunit.sourceforge.net/ 00:15:50.240 00:15:50.240 00:15:50.240 Suite: nvmf 00:15:50.240 Test: test_discovery_log ...passed 00:15:50.240 Test: test_discovery_log_with_filters ...passed 00:15:50.240 00:15:50.240 Run Summary: Type Total Ran Passed Failed Inactive 00:15:50.240 suites 1 1 n/a 0 0 00:15:50.240 tests 2 2 2 0 0 00:15:50.240 asserts 238 238 238 0 n/a 00:15:50.240 00:15:50.240 Elapsed time = 0.003 seconds 00:15:50.240 12:18:13 unittest.unittest_nvmf -- unit/unittest.sh@111 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/subsystem.c/subsystem_ut 00:15:50.240 00:15:50.240 00:15:50.240 CUnit - A unit testing framework for C - Version 2.1-3 00:15:50.240 http://cunit.sourceforge.net/ 00:15:50.240 00:15:50.240 00:15:50.240 Suite: nvmf 00:15:50.240 Test: nvmf_test_create_subsystem ...[2024-06-07 12:18:13.841448] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 125:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "nqn.2016-06.io.spdk:". NQN must contain user specified name with a ':' as a prefix. 00:15:50.240 [2024-06-07 12:18:13.841712] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io.spdk:' is invalid 00:15:50.240 [2024-06-07 12:18:13.841872] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 134:nvmf_nqn_is_valid: *ERROR*: Invalid domain name in NQN "nqn.2016-06.io.abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz:sub". At least one Label is too long. 00:15:50.240 [2024-06-07 12:18:13.841998] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io.abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz:sub' is invalid 00:15:50.240 [2024-06-07 12:18:13.842041] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 146:nvmf_nqn_is_valid: *ERROR*: Invalid domain name in NQN "nqn.2016-06.io.3spdk:sub". Label names must start with a letter. 00:15:50.240 [2024-06-07 12:18:13.842088] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io.3spdk:sub' is invalid 00:15:50.240 [2024-06-07 12:18:13.842193] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 146:nvmf_nqn_is_valid: *ERROR*: Invalid domain name in NQN "nqn.2016-06.io.-spdk:subsystem1". Label names must start with a letter. 
00:15:50.240 [2024-06-07 12:18:13.842286] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io.-spdk:subsystem1' is invalid 00:15:50.240 [2024-06-07 12:18:13.842332] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 183:nvmf_nqn_is_valid: *ERROR*: Invalid domain name in NQN "nqn.2016-06.io.spdk-:subsystem1". Label names must end with an alphanumeric symbol. 00:15:50.240 [2024-06-07 12:18:13.842397] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io.spdk-:subsystem1' is invalid 00:15:50.240 [2024-06-07 12:18:13.842436] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 146:nvmf_nqn_is_valid: *ERROR*: Invalid domain name in NQN "nqn.2016-06.io..spdk:subsystem1". Label names must start with a letter. 00:15:50.240 [2024-06-07 12:18:13.842490] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io..spdk:subsystem1' is invalid 00:15:50.240 [2024-06-07 12:18:13.842589] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 79:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "nqn.2016-06.io.spdk:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa": length 224 > max 223 00:15:50.240 [2024-06-07 12:18:13.842704] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io.spdk:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' is invalid 00:15:50.240 [2024-06-07 12:18:13.842805] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 207:nvmf_nqn_is_valid: *ERROR*: Invalid domain name in NQN "nqn.2016-06.io.spdk:�subsystem1". Label names must contain only valid utf-8. 
00:15:50.240 [2024-06-07 12:18:13.842862] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2016-06.io.spdk:�subsystem1' is invalid 00:15:50.240 [2024-06-07 12:18:13.842964] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 97:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "nqn.2014-08.org.nvmexpress:uuid:ff9b6406-0fc8-4779-80ca-4dca14bda0d2aaaa": uuid is not the correct length 00:15:50.240 [2024-06-07 12:18:13.843018] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2014-08.org.nvmexpress:uuid:ff9b6406-0fc8-4779-80ca-4dca14bda0d2aaaa' is invalid 00:15:50.240 [2024-06-07 12:18:13.843060] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 102:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "nqn.2014-08.org.nvmexpress:uuid:ff9b64-060fc8-4779-80ca-4dca14bda0d2": uuid is not formatted correctly 00:15:50.240 [2024-06-07 12:18:13.843120] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2014-08.org.nvmexpress:uuid:ff9b64-060fc8-4779-80ca-4dca14bda0d2' is invalid 00:15:50.240 passed 00:15:50.240 Test: test_spdk_nvmf_subsystem_add_ns ...[2024-06-07 12:18:13.843167] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 102:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "nqn.2014-08.org.nvmexpress:uuid:ff9hg406-0fc8-4779-80ca-4dca14bda0d2": uuid is not formatted correctly 00:15:50.240 [2024-06-07 12:18:13.843210] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 233:spdk_nvmf_subsystem_create: *ERROR*: Subsystem NQN 'nqn.2014-08.org.nvmexpress:uuid:ff9hg406-0fc8-4779-80ca-4dca14bda0d2' is invalid 00:15:50.240 [2024-06-07 12:18:13.843387] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:2037:spdk_nvmf_subsystem_add_ns_ext: *ERROR*: Requested NSID 5 already in use 00:15:50.240 passed 00:15:50.240 Test: test_spdk_nvmf_subsystem_add_fdp_ns ...[2024-06-07 12:18:13.843440] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:2010:spdk_nvmf_subsystem_add_ns_ext: *ERROR*: Invalid NSID 4294967295 00:15:50.240 passed 00:15:50.240 Test: test_spdk_nvmf_subsystem_set_sn ...passed 00:15:50.240 Test: test_spdk_nvmf_ns_visible ...[2024-06-07 12:18:13.843663] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:2141:spdk_nvmf_subsystem_add_ns_ext: *ERROR*: Subsystem with id: 0 can only add FDP namespace. 
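The uuid-form NQN failures above ("uuid is not the correct length", "uuid is not formatted correctly") reduce to the standard 8-4-4-4-12 layout check. The sketch below assumes exactly that layout and reuses the test's own inputs; it is not taken from SPDK's source.

```c
#include <ctype.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Assumed check for the "uuid:" suffix of an NQN: 36 chars, hyphens at
 * offsets 8/13/18/23, hex digits elsewhere. Illustration only. */
static bool uuid_str_ok(const char *u)
{
    if (strlen(u) != 36) {
        return false;                        /* "...bda0d2aaaa" is too long */
    }
    for (size_t i = 0; i < 36; i++) {
        if (i == 8 || i == 13 || i == 18 || i == 23) {
            if (u[i] != '-') {
                return false;                /* "ff9b64-060fc8-..." fails */
            }
        } else if (!isxdigit((unsigned char)u[i])) {
            return false;                    /* "ff9hg406-..." fails */
        }
    }
    return true;
}

int main(void)
{
    printf("%d\n", uuid_str_ok("ff9b6406-0fc8-4779-80ca-4dca14bda0d2")); /* 1 */
    printf("%d\n", uuid_str_ok("ff9hg406-0fc8-4779-80ca-4dca14bda0d2")); /* 0 */
    return 0;
}
```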
00:15:50.240 [2024-06-07 12:18:13.843834] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 85:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "": length 0 < min 11 00:15:50.240 passed 00:15:50.240 Test: test_reservation_register ...[2024-06-07 12:18:13.844168] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 [2024-06-07 12:18:13.844298] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3138:nvmf_ns_reservation_register: *ERROR*: No registrant 00:15:50.240 passed 00:15:50.240 Test: test_reservation_register_with_ptpl ...passed 00:15:50.240 Test: test_reservation_acquire_preempt_1 ...[2024-06-07 12:18:13.845661] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 passed 00:15:50.240 Test: test_reservation_acquire_release_with_ptpl ...passed 00:15:50.240 Test: test_reservation_release ...[2024-06-07 12:18:13.848508] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 passed 00:15:50.240 Test: test_reservation_unregister_notification ...[2024-06-07 12:18:13.848760] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 passed 00:15:50.240 Test: test_reservation_release_notification ...[2024-06-07 12:18:13.848993] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 passed 00:15:50.240 Test: test_reservation_release_notification_write_exclusive ...[2024-06-07 12:18:13.849206] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 passed 00:15:50.240 Test: test_reservation_clear_notification ...[2024-06-07 12:18:13.849425] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 passed 00:15:50.240 Test: test_reservation_preempt_notification ...[2024-06-07 12:18:13.849651] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3080:nvmf_ns_reservation_register: *ERROR*: The same host already register a key with 0xa1 00:15:50.240 passed 00:15:50.240 Test: test_spdk_nvmf_ns_event ...passed 00:15:50.240 Test: test_nvmf_ns_reservation_add_remove_registrant ...passed 00:15:50.240 Test: test_nvmf_subsystem_add_ctrlr ...passed 00:15:50.240 Test: test_spdk_nvmf_subsystem_add_host ...[2024-06-07 12:18:13.851147] /home/vagrant/spdk_repo/spdk/lib/nvmf/transport.c: 264:nvmf_transport_create: *ERROR*: max_aq_depth 0 is less than minimum defined by NVMf spec, use min value 00:15:50.240 [2024-06-07 12:18:13.851350] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:1051:spdk_nvmf_subsystem_add_host_ext: *ERROR*: Unable to add host to transport_ut transport 00:15:50.240 passed 00:15:50.240 Test: test_nvmf_ns_reservation_report ...[2024-06-07 12:18:13.851637] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:3443:nvmf_ns_reservation_report: *ERROR*: NVMeoF uses extended controller data structure, please set EDS bit in cdw11 and try again 00:15:50.240 passed 00:15:50.240 Test: test_nvmf_nqn_is_valid ...[2024-06-07 12:18:13.851801] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 85:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "nqn.": 
length 4 < min 11 00:15:50.240 [2024-06-07 12:18:13.851950] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 97:nvmf_nqn_is_valid: *ERROR*: Invalid NQN "nqn.2014-08.org.nvmexpress:uuid:100423ce-6dba-4bb4-ad53-30464c1410d": uuid is not the correct length 00:15:50.240 [2024-06-07 12:18:13.852048] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c: 146:nvmf_nqn_is_valid: *ERROR*: Invalid domain name in NQN "nqn.2016-06.io...spdk:cnode1". Label names must start with a letter. 00:15:50.240 passed 00:15:50.240 Test: test_nvmf_ns_reservation_restore ...[2024-06-07 12:18:13.852331] /home/vagrant/spdk_repo/spdk/lib/nvmf/subsystem.c:2637:nvmf_ns_reservation_restore: *ERROR*: Existing bdev UUID is not same with configuration file 00:15:50.240 passed 00:15:50.240 Test: test_nvmf_subsystem_state_change ...passed 00:15:50.240 Test: test_nvmf_reservation_custom_ops ...passed 00:15:50.240 00:15:50.240 Run Summary: Type Total Ran Passed Failed Inactive 00:15:50.240 suites 1 1 n/a 0 0 00:15:50.240 tests 24 24 24 0 0 00:15:50.240 asserts 499 499 499 0 n/a 00:15:50.241 00:15:50.241 Elapsed time = 0.011 seconds 00:15:50.241 12:18:13 unittest.unittest_nvmf -- unit/unittest.sh@112 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/tcp.c/tcp_ut 00:15:50.498 00:15:50.498 00:15:50.498 CUnit - A unit testing framework for C - Version 2.1-3 00:15:50.498 http://cunit.sourceforge.net/ 00:15:50.498 00:15:50.498 00:15:50.498 Suite: nvmf 00:15:50.498 Test: test_nvmf_tcp_create ...[2024-06-07 12:18:13.916352] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c: 745:nvmf_tcp_create: *ERROR*: Unsupported IO Unit size specified, 16 bytes 00:15:50.498 passed 00:15:50.498 Test: test_nvmf_tcp_destroy ...passed 00:15:50.498 Test: test_nvmf_tcp_poll_group_create ...passed 00:15:50.498 Test: test_nvmf_tcp_send_c2h_data ...passed 00:15:50.498 Test: test_nvmf_tcp_h2c_data_hdr_handle ...passed 00:15:50.498 Test: test_nvmf_tcp_in_capsule_data_handle ...passed 00:15:50.498 Test: test_nvmf_tcp_qpair_init_mem_resource ...passed 00:15:50.498 Test: test_nvmf_tcp_send_c2h_term_req ...[2024-06-07 12:18:14.071211] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.498 [2024-06-07 12:18:14.071346] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.498 [2024-06-07 12:18:14.071467] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.498 [2024-06-07 12:18:14.071530] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.498 passed 00:15:50.498 Test: test_nvmf_tcp_send_capsule_resp_pdu ...passed 00:15:50.498 Test: test_nvmf_tcp_icreq_handle ...[2024-06-07 12:18:14.072088] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.498 [2024-06-07 12:18:14.072216] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:2117:nvmf_tcp_icreq_handle: *ERROR*: Expected ICReq PFV 0, got 1 00:15:50.498 [2024-06-07 12:18:14.072379] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.498 [2024-06-07 12:18:14.072569] 
/home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.498 [2024-06-07 12:18:14.072672] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:2117:nvmf_tcp_icreq_handle: *ERROR*: Expected ICReq PFV 0, got 1 00:15:50.498 [2024-06-07 12:18:14.072724] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.498 [2024-06-07 12:18:14.072765] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.498 [2024-06-07 12:18:14.072929] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.498 [2024-06-07 12:18:14.072974] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write IC_RESP to socket: rc=0, errno=0 00:15:50.498 passed 00:15:50.499 Test: test_nvmf_tcp_check_xfer_type ...passed 00:15:50.499 Test: test_nvmf_tcp_invalid_sgl ...[2024-06-07 12:18:14.073479] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.073573] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:2512:nvmf_tcp_req_parse_sgl: *ERROR*: SGL length 0x1001 exceeds max io size 0x1000 00:15:50.499 [2024-06-07 12:18:14.073627] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.073671] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9a10 is same with the state(5) to be set 00:15:50.499 passed 00:15:50.499 Test: test_nvmf_tcp_pdu_ch_handle ...[2024-06-07 12:18:14.073729] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:2244:nvmf_tcp_pdu_ch_handle: *ERROR*: Already received ICreq PDU, and reject this pdu=0x7ffc170ba770 00:15:50.499 [2024-06-07 12:18:14.074014] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.074080] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.074328] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:2301:nvmf_tcp_pdu_ch_handle: *ERROR*: PDU type=0x00, Expected ICReq header length 128, got 0 on tqpair=0x7ffc170b9ed0 00:15:50.499 [2024-06-07 12:18:14.074383] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.074538] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.074690] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:2254:nvmf_tcp_pdu_ch_handle: *ERROR*: The TCP/IP connection is not negotiated 00:15:50.499 [2024-06-07 12:18:14.074841] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.074997] 
/home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.075122] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:2293:nvmf_tcp_pdu_ch_handle: *ERROR*: Unexpected PDU type 0x05 00:15:50.499 [2024-06-07 12:18:14.075163] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.075217] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.075287] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.075443] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.075571] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.075608] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.075762] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.076220] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.076389] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.076444] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.076509] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.076553] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 [2024-06-07 12:18:14.076767] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1086:_tcp_write_pdu: *ERROR*: Could not write TERM_REQ to socket: rc=0, errno=0 00:15:50.499 [2024-06-07 12:18:14.076817] /home/vagrant/spdk_repo/spdk/lib/nvmf/tcp.c:1602:nvmf_tcp_qpair_set_recv_state: *ERROR*: The recv state of tqpair=0x7ffc170b9ed0 is same with the state(5) to be set 00:15:50.499 passed 00:15:50.499 Test: test_nvmf_tcp_tls_add_remove_credentials ...passed 00:15:50.499 Test: test_nvmf_tcp_tls_generate_psk_id ...[2024-06-07 12:18:14.109820] /home/vagrant/spdk_repo/spdk/include/spdk_internal/nvme_tcp.h: 591:nvme_tcp_generate_psk_identity: *ERROR*: Out buffer too small! 00:15:50.499 passed 00:15:50.499 Test: test_nvmf_tcp_tls_generate_retained_psk ...[2024-06-07 12:18:14.109962] /home/vagrant/spdk_repo/spdk/include/spdk_internal/nvme_tcp.h: 602:nvme_tcp_generate_psk_identity: *ERROR*: Unknown cipher suite requested! 
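The nvme_tcp PSK tests above fail on purpose with "Out buffer too small!"; the essential guard is that an identity string is never silently truncated into the caller's buffer. A sketch of that guard follows; the "NVMe0R01 hostnqn subnqn" layout and the function name are assumptions for illustration, not SPDK's exact API.

```c
#include <stdio.h>

/* Assumed sketch of the buffer guard behind "Out buffer too small!" above. */
static int generate_psk_identity(char *out, size_t out_len,
                                 const char *hostnqn, const char *subnqn)
{
    int n = snprintf(out, out_len, "NVMe0R01 %s %s", hostnqn, subnqn);

    if (n < 0 || (size_t)n >= out_len) {
        fprintf(stderr, "Out buffer too small!\n");
        return -1;  /* refuse to hand back a truncated identity */
    }
    return 0;
}

int main(void)
{
    char small[8], big[256];

    printf("%d\n", generate_psk_identity(small, sizeof(small),
                                         "nqn.host", "nqn.subsys"));  /* -1 */
    printf("%d\n", generate_psk_identity(big, sizeof(big),
                                         "nqn.host", "nqn.subsys"));  /* 0 */
    return 0;
}
```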
00:15:50.499 [2024-06-07 12:18:14.110768] /home/vagrant/spdk_repo/spdk/include/spdk_internal/nvme_tcp.h: 658:nvme_tcp_derive_retained_psk: *ERROR*: Unknown PSK hash requested! 00:15:50.499 [2024-06-07 12:18:14.110866] /home/vagrant/spdk_repo/spdk/include/spdk_internal/nvme_tcp.h: 663:nvme_tcp_derive_retained_psk: *ERROR*: Insufficient buffer size for out key! 00:15:50.499 passed 00:15:50.499 Test: test_nvmf_tcp_tls_generate_tls_psk ...[2024-06-07 12:18:14.111525] /home/vagrant/spdk_repo/spdk/include/spdk_internal/nvme_tcp.h: 732:nvme_tcp_derive_tls_psk: *ERROR*: Unknown cipher suite requested! 00:15:50.499 passed[2024-06-07 12:18:14.111597] /home/vagrant/spdk_repo/spdk/include/spdk_internal/nvme_tcp.h: 756:nvme_tcp_derive_tls_psk: *ERROR*: Insufficient buffer size for out key! 00:15:50.499 00:15:50.499 00:15:50.499 Run Summary: Type Total Ran Passed Failed Inactive 00:15:50.499 suites 1 1 n/a 0 0 00:15:50.499 tests 17 17 17 0 0 00:15:50.499 asserts 222 222 222 0 n/a 00:15:50.499 00:15:50.499 Elapsed time = 0.215 seconds 00:15:50.757 12:18:14 unittest.unittest_nvmf -- unit/unittest.sh@113 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/nvmf.c/nvmf_ut 00:15:50.757 00:15:50.757 00:15:50.757 CUnit - A unit testing framework for C - Version 2.1-3 00:15:50.757 http://cunit.sourceforge.net/ 00:15:50.757 00:15:50.757 00:15:50.757 Suite: nvmf 00:15:50.757 Test: test_nvmf_tgt_create_poll_group ...passed 00:15:50.757 00:15:50.757 Run Summary: Type Total Ran Passed Failed Inactive 00:15:50.757 suites 1 1 n/a 0 0 00:15:50.757 tests 1 1 1 0 0 00:15:50.757 asserts 17 17 17 0 n/a 00:15:50.757 00:15:50.757 Elapsed time = 0.031 seconds 00:15:50.757 00:15:50.757 real 0m0.604s 00:15:50.757 user 0m0.255s 00:15:50.757 sys 0m0.334s 00:15:50.757 12:18:14 unittest.unittest_nvmf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:50.757 12:18:14 unittest.unittest_nvmf -- common/autotest_common.sh@10 -- # set +x 00:15:50.757 ************************************ 00:15:50.757 END TEST unittest_nvmf 00:15:50.757 ************************************ 00:15:50.757 12:18:14 unittest -- unit/unittest.sh@262 -- # grep -q '#define SPDK_CONFIG_FC 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:50.757 12:18:14 unittest -- unit/unittest.sh@267 -- # grep -q '#define SPDK_CONFIG_RDMA 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:50.757 12:18:14 unittest -- unit/unittest.sh@268 -- # run_test unittest_nvmf_rdma /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/rdma.c/rdma_ut 00:15:50.757 12:18:14 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:50.757 12:18:14 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:50.757 12:18:14 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:50.757 ************************************ 00:15:50.757 START TEST unittest_nvmf_rdma 00:15:50.757 ************************************ 00:15:50.757 12:18:14 unittest.unittest_nvmf_rdma -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/nvmf/rdma.c/rdma_ut 00:15:51.015 00:15:51.015 00:15:51.015 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.015 http://cunit.sourceforge.net/ 00:15:51.015 00:15:51.015 00:15:51.015 Suite: nvmf 00:15:51.015 Test: test_spdk_nvmf_rdma_request_parse_sgl ...[2024-06-07 12:18:14.410272] /home/vagrant/spdk_repo/spdk/lib/nvmf/rdma.c:1858:nvmf_rdma_request_parse_sgl: *ERROR*: SGL length 0x40000 exceeds max io size 0x20000 00:15:51.015 [2024-06-07 12:18:14.410604] 
/home/vagrant/spdk_repo/spdk/lib/nvmf/rdma.c:1908:nvmf_rdma_request_parse_sgl: *ERROR*: In-capsule data length 0x1000 exceeds capsule length 0x0 00:15:51.015 [2024-06-07 12:18:14.410665] /home/vagrant/spdk_repo/spdk/lib/nvmf/rdma.c:1908:nvmf_rdma_request_parse_sgl: *ERROR*: In-capsule data length 0x2000 exceeds capsule length 0x1000 00:15:51.015 passed 00:15:51.015 Test: test_spdk_nvmf_rdma_request_process ...passed 00:15:51.015 Test: test_nvmf_rdma_get_optimal_poll_group ...passed 00:15:51.015 Test: test_spdk_nvmf_rdma_request_parse_sgl_with_md ...passed 00:15:51.015 Test: test_nvmf_rdma_opts_init ...passed 00:15:51.015 Test: test_nvmf_rdma_request_free_data ...passed 00:15:51.015 Test: test_nvmf_rdma_resources_create ...passed 00:15:51.015 Test: test_nvmf_rdma_qpair_compare ...passed 00:15:51.015 Test: test_nvmf_rdma_resize_cq ...[2024-06-07 12:18:14.413308] /home/vagrant/spdk_repo/spdk/lib/nvmf/rdma.c: 949:nvmf_rdma_resize_cq: *ERROR*: iWARP doesn't support CQ resize. Current capacity 20, required 0 00:15:51.015 Using CQ of insufficient size may lead to CQ overrun 00:15:51.015 [2024-06-07 12:18:14.413445] /home/vagrant/spdk_repo/spdk/lib/nvmf/rdma.c: 954:nvmf_rdma_resize_cq: *ERROR*: RDMA CQE requirement (26) exceeds device max_cqe limitation (3) 00:15:51.016 [2024-06-07 12:18:14.413519] /home/vagrant/spdk_repo/spdk/lib/nvmf/rdma.c: 962:nvmf_rdma_resize_cq: *ERROR*: RDMA CQ resize failed: errno 0: Success 00:15:51.016 passed 00:15:51.016 00:15:51.016 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.016 suites 1 1 n/a 0 0 00:15:51.016 tests 9 9 9 0 0 00:15:51.016 asserts 579 579 579 0 n/a 00:15:51.016 00:15:51.016 Elapsed time = 0.003 seconds 00:15:51.016 00:15:51.016 real 0m0.035s 00:15:51.016 user 0m0.015s 00:15:51.016 sys 0m0.019s 00:15:51.016 12:18:14 unittest.unittest_nvmf_rdma -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:51.016 12:18:14 unittest.unittest_nvmf_rdma -- common/autotest_common.sh@10 -- # set +x 00:15:51.016 ************************************ 00:15:51.016 END TEST unittest_nvmf_rdma 00:15:51.016 ************************************ 00:15:51.016 12:18:14 unittest -- unit/unittest.sh@271 -- # grep -q '#define SPDK_CONFIG_VFIO_USER 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:51.016 12:18:14 unittest -- unit/unittest.sh@275 -- # run_test unittest_scsi unittest_scsi 00:15:51.016 12:18:14 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:51.016 12:18:14 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:51.016 12:18:14 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:51.016 ************************************ 00:15:51.016 START TEST unittest_scsi 00:15:51.016 ************************************ 00:15:51.016 12:18:14 unittest.unittest_scsi -- common/autotest_common.sh@1124 -- # unittest_scsi 00:15:51.016 12:18:14 unittest.unittest_scsi -- unit/unittest.sh@117 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/scsi/dev.c/dev_ut 00:15:51.016 00:15:51.016 00:15:51.016 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.016 http://cunit.sourceforge.net/ 00:15:51.016 00:15:51.016 00:15:51.016 Suite: dev_suite 00:15:51.016 Test: dev_destruct_null_dev ...passed 00:15:51.016 Test: dev_destruct_zero_luns ...passed 00:15:51.016 Test: dev_destruct_null_lun ...passed 00:15:51.016 Test: dev_destruct_success ...passed 00:15:51.016 Test: dev_construct_num_luns_zero ...[2024-06-07 12:18:14.521263] /home/vagrant/spdk_repo/spdk/lib/scsi/dev.c: 228:spdk_scsi_dev_construct_ext: *ERROR*: device 
Name: no LUNs specified 00:15:51.016 passed 00:15:51.016 Test: dev_construct_no_lun_zero ...[2024-06-07 12:18:14.522050] /home/vagrant/spdk_repo/spdk/lib/scsi/dev.c: 241:spdk_scsi_dev_construct_ext: *ERROR*: device Name: no LUN 0 specified 00:15:51.016 passed 00:15:51.016 Test: dev_construct_null_lun ...[2024-06-07 12:18:14.522204] /home/vagrant/spdk_repo/spdk/lib/scsi/dev.c: 247:spdk_scsi_dev_construct_ext: *ERROR*: NULL spdk_scsi_lun for LUN 0 00:15:51.016 passed 00:15:51.016 Test: dev_construct_name_too_long ...[2024-06-07 12:18:14.522349] /home/vagrant/spdk_repo/spdk/lib/scsi/dev.c: 222:spdk_scsi_dev_construct_ext: *ERROR*: device xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx: name longer than maximum allowed length 255 00:15:51.016 passed 00:15:51.016 Test: dev_construct_success ...passed 00:15:51.016 Test: dev_construct_success_lun_zero_not_first ...passed 00:15:51.016 Test: dev_queue_mgmt_task_success ...passed 00:15:51.016 Test: dev_queue_task_success ...passed 00:15:51.016 Test: dev_stop_success ...passed 00:15:51.016 Test: dev_add_port_max_ports ...[2024-06-07 12:18:14.523118] /home/vagrant/spdk_repo/spdk/lib/scsi/dev.c: 315:spdk_scsi_dev_add_port: *ERROR*: device already has 4 ports 00:15:51.016 passed 00:15:51.016 Test: dev_add_port_construct_failure1 ...[2024-06-07 12:18:14.523623] /home/vagrant/spdk_repo/spdk/lib/scsi/port.c: 49:scsi_port_construct: *ERROR*: port name too long 00:15:51.016 passed 00:15:51.016 Test: dev_add_port_construct_failure2 ...[2024-06-07 12:18:14.524123] /home/vagrant/spdk_repo/spdk/lib/scsi/dev.c: 321:spdk_scsi_dev_add_port: *ERROR*: device already has port(1) 00:15:51.016 passed 00:15:51.016 Test: dev_add_port_success1 ...passed 00:15:51.016 Test: dev_add_port_success2 ...passed 00:15:51.016 Test: dev_add_port_success3 ...passed 00:15:51.016 Test: dev_find_port_by_id_num_ports_zero ...passed 00:15:51.016 Test: dev_find_port_by_id_id_not_found_failure ...passed 00:15:51.016 Test: dev_find_port_by_id_success ...passed 00:15:51.016 Test: dev_add_lun_bdev_not_found ...passed 00:15:51.016 Test: dev_add_lun_no_free_lun_id ...[2024-06-07 12:18:14.525765] /home/vagrant/spdk_repo/spdk/lib/scsi/dev.c: 159:spdk_scsi_dev_add_lun_ext: *ERROR*: Free LUN ID is not found 00:15:51.016 passed 00:15:51.016 Test: dev_add_lun_success1 ...passed 00:15:51.016 Test: dev_add_lun_success2 ...passed 00:15:51.016 Test: dev_check_pending_tasks ...passed 00:15:51.016 Test: dev_iterate_luns ...passed 00:15:51.016 Test: dev_find_free_lun ...passed 00:15:51.016 00:15:51.016 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.016 suites 1 1 n/a 0 0 00:15:51.016 tests 29 29 29 0 0 00:15:51.016 asserts 97 97 97 0 n/a 00:15:51.016 00:15:51.016 Elapsed time = 0.003 seconds 00:15:51.016 12:18:14 unittest.unittest_scsi -- unit/unittest.sh@118 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/scsi/lun.c/lun_ut 00:15:51.016 00:15:51.016 00:15:51.016 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.016 http://cunit.sourceforge.net/ 00:15:51.016 00:15:51.016 00:15:51.016 Suite: lun_suite 00:15:51.016 Test: lun_task_mgmt_execute_abort_task_not_supported ...[2024-06-07 12:18:14.569901] /home/vagrant/spdk_repo/spdk/lib/scsi/lun.c: 169:_scsi_lun_execute_mgmt_task: *ERROR*: abort task not supported 00:15:51.016 passed 00:15:51.016 Test: 
lun_task_mgmt_execute_abort_task_all_not_supported ...[2024-06-07 12:18:14.570582] /home/vagrant/spdk_repo/spdk/lib/scsi/lun.c: 169:_scsi_lun_execute_mgmt_task: *ERROR*: abort task set not supported 00:15:51.016 passed 00:15:51.016 Test: lun_task_mgmt_execute_lun_reset ...passed 00:15:51.016 Test: lun_task_mgmt_execute_target_reset ...passed 00:15:51.016 Test: lun_task_mgmt_execute_invalid_case ...[2024-06-07 12:18:14.571151] /home/vagrant/spdk_repo/spdk/lib/scsi/lun.c: 169:_scsi_lun_execute_mgmt_task: *ERROR*: unknown task not supported 00:15:51.016 passed 00:15:51.016 Test: lun_append_task_null_lun_task_cdb_spc_inquiry ...passed 00:15:51.016 Test: lun_append_task_null_lun_alloc_len_lt_4096 ...passed 00:15:51.016 Test: lun_append_task_null_lun_not_supported ...passed 00:15:51.016 Test: lun_execute_scsi_task_pending ...passed 00:15:51.016 Test: lun_execute_scsi_task_complete ...passed 00:15:51.016 Test: lun_execute_scsi_task_resize ...passed 00:15:51.016 Test: lun_destruct_success ...passed 00:15:51.016 Test: lun_construct_null_ctx ...[2024-06-07 12:18:14.572532] /home/vagrant/spdk_repo/spdk/lib/scsi/lun.c: 432:scsi_lun_construct: *ERROR*: bdev_name must be non-NULL 00:15:51.016 passed 00:15:51.016 Test: lun_construct_success ...passed 00:15:51.016 Test: lun_reset_task_wait_scsi_task_complete ...passed 00:15:51.016 Test: lun_reset_task_suspend_scsi_task ...passed 00:15:51.016 Test: lun_check_pending_tasks_only_for_specific_initiator ...passed 00:15:51.016 Test: abort_pending_mgmt_tasks_when_lun_is_removed ...passed 00:15:51.016 00:15:51.016 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.016 suites 1 1 n/a 0 0 00:15:51.016 tests 18 18 18 0 0 00:15:51.016 asserts 153 153 153 0 n/a 00:15:51.016 00:15:51.016 Elapsed time = 0.002 seconds 00:15:51.016 12:18:14 unittest.unittest_scsi -- unit/unittest.sh@119 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/scsi/scsi.c/scsi_ut 00:15:51.016 00:15:51.016 00:15:51.016 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.016 http://cunit.sourceforge.net/ 00:15:51.016 00:15:51.016 00:15:51.016 Suite: scsi_suite 00:15:51.016 Test: scsi_init ...passed 00:15:51.016 00:15:51.016 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.016 suites 1 1 n/a 0 0 00:15:51.016 tests 1 1 1 0 0 00:15:51.016 asserts 1 1 1 0 n/a 00:15:51.016 00:15:51.016 Elapsed time = 0.000 seconds 00:15:51.016 12:18:14 unittest.unittest_scsi -- unit/unittest.sh@120 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/scsi/scsi_bdev.c/scsi_bdev_ut 00:15:51.016 00:15:51.016 00:15:51.016 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.016 http://cunit.sourceforge.net/ 00:15:51.016 00:15:51.016 00:15:51.016 Suite: translation_suite 00:15:51.016 Test: mode_select_6_test ...passed 00:15:51.016 Test: mode_select_6_test2 ...passed 00:15:51.016 Test: mode_sense_6_test ...passed 00:15:51.016 Test: mode_sense_10_test ...passed 00:15:51.016 Test: inquiry_evpd_test ...passed 00:15:51.016 Test: inquiry_standard_test ...passed 00:15:51.016 Test: inquiry_overflow_test ...passed 00:15:51.016 Test: task_complete_test ...passed 00:15:51.016 Test: lba_range_test ...passed 00:15:51.016 Test: xfer_len_test ...[2024-06-07 12:18:14.656569] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_bdev.c:1270:bdev_scsi_readwrite: *ERROR*: xfer_len 8193 > maximum transfer length 8192 00:15:51.016 passed 00:15:51.016 Test: xfer_test ...passed 00:15:51.017 Test: scsi_name_padding_test ...passed 00:15:51.017 Test: get_dif_ctx_test ...passed 00:15:51.017 Test: unmap_split_test ...passed 
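The scsi_bdev_ut xfer_len failure above ("xfer_len 8193 > maximum transfer length 8192") comes down to capping the transfer length decoded from the CDB against a device maximum. A hedged sketch, with the limit and names invented for illustration:

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define SCSI_MAX_XFER_BLOCKS 8192u  /* illustrative cap matching the log */

/* Assumed sketch of the read/write length check behind the error above. */
static bool xfer_len_ok(uint32_t xfer_blocks)
{
    if (xfer_blocks > SCSI_MAX_XFER_BLOCKS) {
        fprintf(stderr, "xfer_len %u > maximum transfer length %u\n",
                xfer_blocks, SCSI_MAX_XFER_BLOCKS);
        return false;  /* a real target would answer ILLEGAL REQUEST */
    }
    return true;
}

int main(void)
{
    xfer_len_ok(8193);  /* reproduces the message in the log */
    return 0;
}
```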
00:15:51.017 00:15:51.017 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.017 suites 1 1 n/a 0 0 00:15:51.017 tests 14 14 14 0 0 00:15:51.017 asserts 1205 1205 1205 0 n/a 00:15:51.017 00:15:51.017 Elapsed time = 0.004 seconds 00:15:51.276 12:18:14 unittest.unittest_scsi -- unit/unittest.sh@121 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/scsi/scsi_pr.c/scsi_pr_ut 00:15:51.276 00:15:51.276 00:15:51.276 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.276 http://cunit.sourceforge.net/ 00:15:51.276 00:15:51.276 00:15:51.276 Suite: reservation_suite 00:15:51.276 Test: test_reservation_register ...[2024-06-07 12:18:14.695380] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 272:scsi_pr_out_register: *ERROR*: Reservation key 0xa1 don't match registrant's key 0xa 00:15:51.276 passed 00:15:51.276 Test: test_reservation_reserve ...[2024-06-07 12:18:14.696037] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 272:scsi_pr_out_register: *ERROR*: Reservation key 0xa1 don't match registrant's key 0xa 00:15:51.276 [2024-06-07 12:18:14.696253] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 209:scsi_pr_out_reserve: *ERROR*: Only 1 holder is allowed for type 1 00:15:51.276 [2024-06-07 12:18:14.696485] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 204:scsi_pr_out_reserve: *ERROR*: Reservation type doesn't match 00:15:51.276 passed 00:15:51.276 Test: test_reservation_preempt_non_all_regs ...[2024-06-07 12:18:14.696827] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 272:scsi_pr_out_register: *ERROR*: Reservation key 0xa1 don't match registrant's key 0xa 00:15:51.276 [2024-06-07 12:18:14.697010] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 458:scsi_pr_out_preempt: *ERROR*: Zeroed sa_rkey 00:15:51.276 passed 00:15:51.276 Test: test_reservation_preempt_all_regs ...[2024-06-07 12:18:14.697459] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 272:scsi_pr_out_register: *ERROR*: Reservation key 0xa1 don't match registrant's key 0xa 00:15:51.276 passed 00:15:51.276 Test: test_reservation_cmds_conflict ...[2024-06-07 12:18:14.697865] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 272:scsi_pr_out_register: *ERROR*: Reservation key 0xa1 don't match registrant's key 0xa 00:15:51.276 [2024-06-07 12:18:14.698068] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 851:scsi_pr_check: *ERROR*: CHECK: Registrants only reservation type reject command 0x2a 00:15:51.276 [2024-06-07 12:18:14.698238] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 845:scsi_pr_check: *ERROR*: CHECK: Exclusive Access reservation type rejects command 0x28 00:15:51.276 [2024-06-07 12:18:14.698382] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 845:scsi_pr_check: *ERROR*: CHECK: Exclusive Access reservation type rejects command 0x2a 00:15:51.276 [2024-06-07 12:18:14.698529] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 845:scsi_pr_check: *ERROR*: CHECK: Exclusive Access reservation type rejects command 0x28 00:15:51.276 [2024-06-07 12:18:14.698692] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 845:scsi_pr_check: *ERROR*: CHECK: Exclusive Access reservation type rejects command 0x2a 00:15:51.276 passed 00:15:51.276 Test: test_scsi2_reserve_release ...passed 00:15:51.276 Test: test_pr_with_scsi2_reserve_release ...[2024-06-07 12:18:14.699081] /home/vagrant/spdk_repo/spdk/lib/scsi/scsi_pr.c: 272:scsi_pr_out_register: *ERROR*: Reservation key 0xa1 don't match registrant's key 0xa 00:15:51.276 passed 00:15:51.276 00:15:51.276 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.276 suites 1 1 n/a 0 0 
00:15:51.276 tests 7 7 7 0 0 00:15:51.276 asserts 257 257 257 0 n/a 00:15:51.276 00:15:51.276 Elapsed time = 0.002 seconds 00:15:51.276 00:15:51.276 real 0m0.210s 00:15:51.276 user 0m0.111s 00:15:51.276 sys 0m0.083s 00:15:51.276 12:18:14 unittest.unittest_scsi -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:51.276 12:18:14 unittest.unittest_scsi -- common/autotest_common.sh@10 -- # set +x 00:15:51.276 ************************************ 00:15:51.276 END TEST unittest_scsi 00:15:51.276 ************************************ 00:15:51.276 12:18:14 unittest -- unit/unittest.sh@278 -- # uname -s 00:15:51.276 12:18:14 unittest -- unit/unittest.sh@278 -- # '[' Linux = Linux ']' 00:15:51.276 12:18:14 unittest -- unit/unittest.sh@279 -- # run_test unittest_sock unittest_sock 00:15:51.276 12:18:14 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:51.276 12:18:14 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:51.276 12:18:14 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:51.276 ************************************ 00:15:51.276 START TEST unittest_sock 00:15:51.276 ************************************ 00:15:51.276 12:18:14 unittest.unittest_sock -- common/autotest_common.sh@1124 -- # unittest_sock 00:15:51.276 12:18:14 unittest.unittest_sock -- unit/unittest.sh@125 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/sock/sock.c/sock_ut 00:15:51.276 00:15:51.276 00:15:51.276 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.276 http://cunit.sourceforge.net/ 00:15:51.276 00:15:51.276 00:15:51.276 Suite: sock 00:15:51.276 Test: posix_sock ...passed 00:15:51.276 Test: ut_sock ...passed 00:15:51.276 Test: posix_sock_group ...passed 00:15:51.276 Test: ut_sock_group ...passed 00:15:51.276 Test: posix_sock_group_fairness ...passed 00:15:51.276 Test: _posix_sock_close ...passed 00:15:51.276 Test: sock_get_default_opts ...passed 00:15:51.276 Test: ut_sock_impl_get_set_opts ...passed 00:15:51.276 Test: posix_sock_impl_get_set_opts ...passed 00:15:51.276 Test: ut_sock_map ...passed 00:15:51.276 Test: override_impl_opts ...passed 00:15:51.276 Test: ut_sock_group_get_ctx ...passed 00:15:51.276 00:15:51.276 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.276 suites 1 1 n/a 0 0 00:15:51.276 tests 12 12 12 0 0 00:15:51.276 asserts 349 349 349 0 n/a 00:15:51.276 00:15:51.276 Elapsed time = 0.007 seconds 00:15:51.276 12:18:14 unittest.unittest_sock -- unit/unittest.sh@126 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/sock/posix.c/posix_ut 00:15:51.276 00:15:51.276 00:15:51.276 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.276 http://cunit.sourceforge.net/ 00:15:51.276 00:15:51.276 00:15:51.276 Suite: posix 00:15:51.276 Test: flush ...passed 00:15:51.276 00:15:51.276 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.276 suites 1 1 n/a 0 0 00:15:51.276 tests 1 1 1 0 0 00:15:51.276 asserts 28 28 28 0 n/a 00:15:51.276 00:15:51.276 Elapsed time = 0.000 seconds 00:15:51.276 12:18:14 unittest.unittest_sock -- unit/unittest.sh@128 -- # grep -q '#define SPDK_CONFIG_URING 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:51.276 00:15:51.276 real 0m0.106s 00:15:51.276 user 0m0.037s 00:15:51.276 sys 0m0.036s 00:15:51.276 12:18:14 unittest.unittest_sock -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:51.276 12:18:14 unittest.unittest_sock -- common/autotest_common.sh@10 -- # set +x 00:15:51.276 ************************************ 00:15:51.276 END TEST unittest_sock 00:15:51.276 
************************************ 00:15:51.535 12:18:14 unittest -- unit/unittest.sh@281 -- # run_test unittest_thread /home/vagrant/spdk_repo/spdk/test/unit/lib/thread/thread.c/thread_ut 00:15:51.535 12:18:14 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:51.535 12:18:14 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:51.535 12:18:14 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:51.535 ************************************ 00:15:51.535 START TEST unittest_thread 00:15:51.535 ************************************ 00:15:51.535 12:18:14 unittest.unittest_thread -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/thread/thread.c/thread_ut 00:15:51.535 00:15:51.535 00:15:51.535 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.535 http://cunit.sourceforge.net/ 00:15:51.535 00:15:51.535 00:15:51.535 Suite: io_channel 00:15:51.535 Test: thread_alloc ...passed 00:15:51.535 Test: thread_send_msg ...passed 00:15:51.535 Test: thread_poller ...passed 00:15:51.535 Test: poller_pause ...passed 00:15:51.535 Test: thread_for_each ...passed 00:15:51.535 Test: for_each_channel_remove ...passed 00:15:51.535 Test: for_each_channel_unreg ...[2024-06-07 12:18:14.981742] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:2173:spdk_io_device_register: *ERROR*: io_device 0x7ffd13b55630 already registered (old:0x613000000200 new:0x6130000003c0) 00:15:51.535 passed 00:15:51.535 Test: thread_name ...passed 00:15:51.535 Test: channel ...[2024-06-07 12:18:14.985395] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:2307:spdk_get_io_channel: *ERROR*: could not find io_device 0x49b000 00:15:51.535 passed 00:15:51.535 Test: channel_destroy_races ...passed 00:15:51.535 Test: thread_exit_test ...[2024-06-07 12:18:14.989870] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c: 635:thread_exit: *ERROR*: thread 0x618000005c80 got timeout, and move it to the exited state forcefully 00:15:51.535 passed 00:15:51.535 Test: thread_update_stats_test ...passed 00:15:51.535 Test: nested_channel ...passed 00:15:51.535 Test: device_unregister_and_thread_exit_race ...passed 00:15:51.535 Test: cache_closest_timed_poller ...passed 00:15:51.535 Test: multi_timed_pollers_have_same_expiration ...passed 00:15:51.535 Test: io_device_lookup ...passed 00:15:51.535 Test: spdk_spin ...[2024-06-07 12:18:14.999482] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3071:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 1: Not an SPDK thread (thread != ((void *)0)) 00:15:51.535 [2024-06-07 12:18:14.999587] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3027:sspin_stacks_print: *ERROR*: spinlock 0x7ffd13b55610 00:15:51.535 [2024-06-07 12:18:14.999750] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3109:spdk_spin_held: *ERROR*: unrecoverable spinlock error 1: Not an SPDK thread (thread != ((void *)0)) 00:15:51.535 [2024-06-07 12:18:15.001158] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3072:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:15:51.535 [2024-06-07 12:18:15.001395] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3027:sspin_stacks_print: *ERROR*: spinlock 0x7ffd13b55610 00:15:51.535 [2024-06-07 12:18:15.001563] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3092:spdk_spin_unlock: *ERROR*: unrecoverable spinlock error 3: Unlock on wrong SPDK thread (thread == sspin->thread) 00:15:51.535 [2024-06-07 12:18:15.001714] 
/home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3027:sspin_stacks_print: *ERROR*: spinlock 0x7ffd13b55610 00:15:51.535 [2024-06-07 12:18:15.001851] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3092:spdk_spin_unlock: *ERROR*: unrecoverable spinlock error 3: Unlock on wrong SPDK thread (thread == sspin->thread) 00:15:51.535 [2024-06-07 12:18:15.002058] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3027:sspin_stacks_print: *ERROR*: spinlock 0x7ffd13b55610 00:15:51.535 [2024-06-07 12:18:15.002240] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3053:spdk_spin_destroy: *ERROR*: unrecoverable spinlock error 5: Destroying a held spinlock (sspin->thread == ((void *)0)) 00:15:51.535 [2024-06-07 12:18:15.002411] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3027:sspin_stacks_print: *ERROR*: spinlock 0x7ffd13b55610 00:15:51.535 passed 00:15:51.535 Test: for_each_channel_and_thread_exit_race ...passed 00:15:51.535 Test: for_each_thread_and_thread_exit_race ...passed 00:15:51.535 00:15:51.535 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.535 suites 1 1 n/a 0 0 00:15:51.535 tests 20 20 20 0 0 00:15:51.535 asserts 409 409 409 0 n/a 00:15:51.535 00:15:51.535 Elapsed time = 0.038 seconds 00:15:51.535 00:15:51.535 real 0m0.087s 00:15:51.535 user 0m0.055s 00:15:51.535 sys 0m0.027s 00:15:51.535 12:18:15 unittest.unittest_thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:51.535 12:18:15 unittest.unittest_thread -- common/autotest_common.sh@10 -- # set +x 00:15:51.535 ************************************ 00:15:51.535 END TEST unittest_thread 00:15:51.535 ************************************ 00:15:51.535 12:18:15 unittest -- unit/unittest.sh@282 -- # run_test unittest_iobuf /home/vagrant/spdk_repo/spdk/test/unit/lib/thread/iobuf.c/iobuf_ut 00:15:51.535 12:18:15 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:51.536 12:18:15 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:51.536 12:18:15 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:51.536 ************************************ 00:15:51.536 START TEST unittest_iobuf 00:15:51.536 ************************************ 00:15:51.536 12:18:15 unittest.unittest_iobuf -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/thread/iobuf.c/iobuf_ut 00:15:51.536 00:15:51.536 00:15:51.536 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.536 http://cunit.sourceforge.net/ 00:15:51.536 00:15:51.536 00:15:51.536 Suite: io_channel 00:15:51.536 Test: iobuf ...passed 00:15:51.536 Test: iobuf_cache ...[2024-06-07 12:18:15.119124] /home/vagrant/spdk_repo/spdk/lib/thread/iobuf.c: 360:spdk_iobuf_channel_init: *ERROR*: Failed to populate 'ut_module0' iobuf small buffer cache at 4/5 entries. You may need to increase spdk_iobuf_opts.small_pool_count (4) 00:15:51.536 [2024-06-07 12:18:15.119955] /home/vagrant/spdk_repo/spdk/lib/thread/iobuf.c: 363:spdk_iobuf_channel_init: *ERROR*: See scripts/calc-iobuf.py for guidance on how to calculate this value. 00:15:51.536 [2024-06-07 12:18:15.120536] /home/vagrant/spdk_repo/spdk/lib/thread/iobuf.c: 372:spdk_iobuf_channel_init: *ERROR*: Failed to populate 'ut_module0' iobuf large buffer cache at 4/5 entries. You may need to increase spdk_iobuf_opts.large_pool_count (4) 00:15:51.536 [2024-06-07 12:18:15.120872] /home/vagrant/spdk_repo/spdk/lib/thread/iobuf.c: 375:spdk_iobuf_channel_init: *ERROR*: See scripts/calc-iobuf.py for guidance on how to calculate this value. 
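The thread_ut spin tests above enumerate the spdk_spin_* invariants (error 1: caller is not an SPDK thread, error 2: deadlock via relock, error 3: unlock from the wrong thread, error 5: destroying a held lock). The plain-pthreads sketch below illustrates the ownership bookkeeping those errors imply; it is an assumed model, not SPDK's thread.c.

```c
#include <assert.h>
#include <pthread.h>
#include <stdio.h>

/* Assumed ownership model behind the spdk_spin_* errors above. */
struct owned_spinlock {
    pthread_mutex_t lock;
    pthread_t       owner;
    int             held;
};

static void owned_lock(struct owned_spinlock *s)
{
    /* "Deadlock detected": relocking a lock this thread already holds */
    assert(!(s->held && pthread_equal(s->owner, pthread_self())));
    pthread_mutex_lock(&s->lock);
    s->owner = pthread_self();
    s->held = 1;
}

static void owned_unlock(struct owned_spinlock *s)
{
    /* "Unlock on wrong SPDK thread": only the owner may unlock */
    assert(s->held && pthread_equal(s->owner, pthread_self()));
    s->held = 0;
    pthread_mutex_unlock(&s->lock);
}

static void owned_destroy(struct owned_spinlock *s)
{
    assert(!s->held);  /* "Destroying a held spinlock" */
    pthread_mutex_destroy(&s->lock);
}

int main(void)
{
    struct owned_spinlock s = { .lock = PTHREAD_MUTEX_INITIALIZER };

    owned_lock(&s);
    owned_unlock(&s);
    owned_destroy(&s);
    printf("spinlock invariants held\n");
    return 0;
}
```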
00:15:51.536 [2024-06-07 12:18:15.121220] /home/vagrant/spdk_repo/spdk/lib/thread/iobuf.c: 360:spdk_iobuf_channel_init: *ERROR*: Failed to populate 'ut_module1' iobuf small buffer cache at 0/4 entries. You may need to increase spdk_iobuf_opts.small_pool_count (4) 00:15:51.536 [2024-06-07 12:18:15.121578] /home/vagrant/spdk_repo/spdk/lib/thread/iobuf.c: 363:spdk_iobuf_channel_init: *ERROR*: See scripts/calc-iobuf.py for guidance on how to calculate this value. 00:15:51.536 passed 00:15:51.536 00:15:51.536 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.536 suites 1 1 n/a 0 0 00:15:51.536 tests 2 2 2 0 0 00:15:51.536 asserts 107 107 107 0 n/a 00:15:51.536 00:15:51.536 Elapsed time = 0.009 seconds 00:15:51.536 00:15:51.536 real 0m0.041s 00:15:51.536 user 0m0.020s 00:15:51.536 sys 0m0.019s 00:15:51.536 12:18:15 unittest.unittest_iobuf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:51.536 12:18:15 unittest.unittest_iobuf -- common/autotest_common.sh@10 -- # set +x 00:15:51.536 ************************************ 00:15:51.536 END TEST unittest_iobuf 00:15:51.536 ************************************ 00:15:51.795 12:18:15 unittest -- unit/unittest.sh@283 -- # run_test unittest_util unittest_util 00:15:51.795 12:18:15 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:51.795 12:18:15 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:51.795 12:18:15 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:51.795 ************************************ 00:15:51.795 START TEST unittest_util 00:15:51.795 ************************************ 00:15:51.795 12:18:15 unittest.unittest_util -- common/autotest_common.sh@1124 -- # unittest_util 00:15:51.795 12:18:15 unittest.unittest_util -- unit/unittest.sh@134 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/base64.c/base64_ut 00:15:51.795 00:15:51.795 00:15:51.795 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.795 http://cunit.sourceforge.net/ 00:15:51.795 00:15:51.795 00:15:51.795 Suite: base64 00:15:51.795 Test: test_base64_get_encoded_strlen ...passed 00:15:51.795 Test: test_base64_get_decoded_len ...passed 00:15:51.795 Test: test_base64_encode ...passed 00:15:51.795 Test: test_base64_decode ...passed 00:15:51.795 Test: test_base64_urlsafe_encode ...passed 00:15:51.795 Test: test_base64_urlsafe_decode ...passed 00:15:51.795 00:15:51.795 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.795 suites 1 1 n/a 0 0 00:15:51.795 tests 6 6 6 0 0 00:15:51.795 asserts 112 112 112 0 n/a 00:15:51.795 00:15:51.795 Elapsed time = 0.000 seconds 00:15:51.795 12:18:15 unittest.unittest_util -- unit/unittest.sh@135 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/bit_array.c/bit_array_ut 00:15:51.795 00:15:51.795 00:15:51.795 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.795 http://cunit.sourceforge.net/ 00:15:51.795 00:15:51.795 00:15:51.795 Suite: bit_array 00:15:51.795 Test: test_1bit ...passed 00:15:51.795 Test: test_64bit ...passed 00:15:51.795 Test: test_find ...passed 00:15:51.795 Test: test_resize ...passed 00:15:51.795 Test: test_errors ...passed 00:15:51.795 Test: test_count ...passed 00:15:51.795 Test: test_mask_store_load ...passed 00:15:51.795 Test: test_mask_clear ...passed 00:15:51.795 00:15:51.795 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.795 suites 1 1 n/a 0 0 00:15:51.795 tests 8 8 8 0 0 00:15:51.795 asserts 5075 5075 5075 0 n/a 00:15:51.795 00:15:51.795 Elapsed time = 0.001 seconds 00:15:51.795 12:18:15 
unittest.unittest_util -- unit/unittest.sh@136 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/cpuset.c/cpuset_ut 00:15:51.795 00:15:51.795 00:15:51.795 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.795 http://cunit.sourceforge.net/ 00:15:51.795 00:15:51.795 00:15:51.795 Suite: cpuset 00:15:51.795 Test: test_cpuset ...passed 00:15:51.795 Test: test_cpuset_parse ...[2024-06-07 12:18:15.275757] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 239:parse_list: *ERROR*: Unexpected end of core list '[' 00:15:51.795 [2024-06-07 12:18:15.276429] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 241:parse_list: *ERROR*: Parsing of core list '[]' failed on character ']' 00:15:51.795 [2024-06-07 12:18:15.276884] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 241:parse_list: *ERROR*: Parsing of core list '[10--11]' failed on character '-' 00:15:51.795 [2024-06-07 12:18:15.277347] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 219:parse_list: *ERROR*: Invalid range of CPUs (11 > 10) 00:15:51.795 [2024-06-07 12:18:15.277727] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 241:parse_list: *ERROR*: Parsing of core list '[10-11,]' failed on character ',' 00:15:51.795 [2024-06-07 12:18:15.278110] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 241:parse_list: *ERROR*: Parsing of core list '[,10-11]' failed on character ',' 00:15:51.795 [2024-06-07 12:18:15.278417] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 203:parse_list: *ERROR*: Core number 1025 is out of range in '[1025]' 00:15:51.795 [2024-06-07 12:18:15.278807] /home/vagrant/spdk_repo/spdk/lib/util/cpuset.c: 198:parse_list: *ERROR*: Conversion of core mask in '[184467440737095516150]' failed 00:15:51.795 passed 00:15:51.795 Test: test_cpuset_fmt ...passed 00:15:51.795 00:15:51.795 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.795 suites 1 1 n/a 0 0 00:15:51.795 tests 3 3 3 0 0 00:15:51.795 asserts 65 65 65 0 n/a 00:15:51.795 00:15:51.795 Elapsed time = 0.002 seconds 00:15:51.795 12:18:15 unittest.unittest_util -- unit/unittest.sh@137 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/crc16.c/crc16_ut 00:15:51.795 00:15:51.795 00:15:51.795 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.795 http://cunit.sourceforge.net/ 00:15:51.795 00:15:51.795 00:15:51.795 Suite: crc16 00:15:51.795 Test: test_crc16_t10dif ...passed 00:15:51.795 Test: test_crc16_t10dif_seed ...passed 00:15:51.795 Test: test_crc16_t10dif_copy ...passed 00:15:51.795 00:15:51.795 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.795 suites 1 1 n/a 0 0 00:15:51.795 tests 3 3 3 0 0 00:15:51.795 asserts 5 5 5 0 n/a 00:15:51.795 00:15:51.795 Elapsed time = 0.000 seconds 00:15:51.795 12:18:15 unittest.unittest_util -- unit/unittest.sh@138 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/crc32_ieee.c/crc32_ieee_ut 00:15:51.795 00:15:51.795 00:15:51.795 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.795 http://cunit.sourceforge.net/ 00:15:51.795 00:15:51.795 00:15:51.795 Suite: crc32_ieee 00:15:51.795 Test: test_crc32_ieee ...passed 00:15:51.795 00:15:51.795 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.795 suites 1 1 n/a 0 0 00:15:51.795 tests 1 1 1 0 0 00:15:51.795 asserts 1 1 1 0 n/a 00:15:51.795 00:15:51.795 Elapsed time = 0.000 seconds 00:15:51.795 12:18:15 unittest.unittest_util -- unit/unittest.sh@139 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/crc32c.c/crc32c_ut 00:15:51.795 00:15:51.795 00:15:51.795 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.795 
http://cunit.sourceforge.net/ 00:15:51.795 00:15:51.795 00:15:51.795 Suite: crc32c 00:15:51.795 Test: test_crc32c ...passed 00:15:51.795 Test: test_crc32c_nvme ...passed 00:15:51.795 00:15:51.795 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.795 suites 1 1 n/a 0 0 00:15:51.795 tests 2 2 2 0 0 00:15:51.795 asserts 16 16 16 0 n/a 00:15:51.795 00:15:51.795 Elapsed time = 0.000 seconds 00:15:51.795 12:18:15 unittest.unittest_util -- unit/unittest.sh@140 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/crc64.c/crc64_ut 00:15:51.795 00:15:51.795 00:15:51.795 CUnit - A unit testing framework for C - Version 2.1-3 00:15:51.795 http://cunit.sourceforge.net/ 00:15:51.795 00:15:51.795 00:15:51.795 Suite: crc64 00:15:51.795 Test: test_crc64_nvme ...passed 00:15:51.795 00:15:51.795 Run Summary: Type Total Ran Passed Failed Inactive 00:15:51.795 suites 1 1 n/a 0 0 00:15:51.795 tests 1 1 1 0 0 00:15:51.795 asserts 4 4 4 0 n/a 00:15:51.795 00:15:51.795 Elapsed time = 0.000 seconds 00:15:51.795 12:18:15 unittest.unittest_util -- unit/unittest.sh@141 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/string.c/string_ut 00:15:52.056 00:15:52.056 00:15:52.056 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.056 http://cunit.sourceforge.net/ 00:15:52.056 00:15:52.056 00:15:52.056 Suite: string 00:15:52.056 Test: test_parse_ip_addr ...passed 00:15:52.056 Test: test_str_chomp ...passed 00:15:52.056 Test: test_parse_capacity ...passed 00:15:52.056 Test: test_sprintf_append_realloc ...passed 00:15:52.056 Test: test_strtol ...passed 00:15:52.056 Test: test_strtoll ...passed 00:15:52.056 Test: test_strarray ...passed 00:15:52.056 Test: test_strcpy_replace ...passed 00:15:52.056 00:15:52.056 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.056 suites 1 1 n/a 0 0 00:15:52.056 tests 8 8 8 0 0 00:15:52.056 asserts 161 161 161 0 n/a 00:15:52.056 00:15:52.056 Elapsed time = 0.001 seconds 00:15:52.056 12:18:15 unittest.unittest_util -- unit/unittest.sh@142 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/dif.c/dif_ut 00:15:52.056 00:15:52.056 00:15:52.056 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.056 http://cunit.sourceforge.net/ 00:15:52.056 00:15:52.056 00:15:52.056 Suite: dif 00:15:52.056 Test: dif_generate_and_verify_test ...[2024-06-07 12:18:15.476039] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=23, Expected=17, Actual=16 00:15:52.056 [2024-06-07 12:18:15.476629] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=23, Expected=17, Actual=16 00:15:52.056 [2024-06-07 12:18:15.476918] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=23, Expected=17, Actual=16 00:15:52.056 [2024-06-07 12:18:15.477187] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=22, Expected=23, Actual=22 00:15:52.056 [2024-06-07 12:18:15.477519] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=22, Expected=23, Actual=22 00:15:52.056 [2024-06-07 12:18:15.477809] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=22, Expected=23, Actual=22 00:15:52.056 passed 00:15:52.056 Test: dif_disable_check_test ...[2024-06-07 12:18:15.478656] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=22, Expected=22, 
Actual=ffff 00:15:52.056 [2024-06-07 12:18:15.478965] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=22, Expected=22, Actual=ffff 00:15:52.056 [2024-06-07 12:18:15.479250] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=22, Expected=22, Actual=ffff 00:15:52.056 passed 00:15:52.056 Test: dif_generate_and_verify_different_pi_formats_test ...[2024-06-07 12:18:15.480058] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=12, Expected=b0a80000, Actual=b9848de 00:15:52.056 [2024-06-07 12:18:15.480351] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=12, Expected=b98, Actual=b0a8 00:15:52.056 [2024-06-07 12:18:15.480665] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=12, Expected=b0a8000000000000, Actual=81039fcf5685d8d4 00:15:52.056 [2024-06-07 12:18:15.481018] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=12, Expected=b9848de00000000, Actual=81039fcf5685d8d4 00:15:52.056 [2024-06-07 12:18:15.481363] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=12, Expected=17, Actual=0 00:15:52.056 [2024-06-07 12:18:15.481683] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=12, Expected=17, Actual=0 00:15:52.056 [2024-06-07 12:18:15.481989] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=12, Expected=17, Actual=0 00:15:52.056 [2024-06-07 12:18:15.482292] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=12, Expected=17, Actual=0 00:15:52.056 [2024-06-07 12:18:15.482597] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=12, Expected=c, Actual=0 00:15:52.056 [2024-06-07 12:18:15.482882] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=12, Expected=c, Actual=0 00:15:52.056 [2024-06-07 12:18:15.483181] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=12, Expected=c, Actual=0 00:15:52.056 passed 00:15:52.056 Test: dif_apptag_mask_test ...[2024-06-07 12:18:15.483666] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=12, Expected=1256, Actual=1234 00:15:52.056 [2024-06-07 12:18:15.483964] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=12, Expected=1256, Actual=1234 00:15:52.056 passed 00:15:52.056 Test: dif_sec_512_md_0_error_test ...[2024-06-07 12:18:15.484357] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 510:spdk_dif_ctx_init: *ERROR*: Metadata size is smaller than DIF size. 00:15:52.056 passed 00:15:52.056 Test: dif_sec_4096_md_0_error_test ...[2024-06-07 12:18:15.484628] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 510:spdk_dif_ctx_init: *ERROR*: Metadata size is smaller than DIF size. 00:15:52.056 [2024-06-07 12:18:15.484772] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 510:spdk_dif_ctx_init: *ERROR*: Metadata size is smaller than DIF size. 
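The dif_ut output above verifies the per-block protection tuple; for example "Failed to compare Ref Tag: LBA=23, Expected=17, Actual=16" is a type-1 reference-tag mismatch (the expected value is the LBA, printed in hex, so 23 decimal appears as 17). A simplified sketch of that comparison, with the tuple layout reduced for illustration:

```c
#include <stdint.h>
#include <stdio.h>

/* Simplified T10 DIF tuple; real PI has a stricter layout and byte order. */
struct dif_tuple {
    uint16_t guard;    /* CRC over the data block */
    uint16_t app_tag;
    uint32_t ref_tag;  /* for type 1: low 32 bits of the LBA */
};

/* Assumed sketch of the ref-tag comparison behind the errors above. */
static int dif_reftag_check(const struct dif_tuple *dif, uint32_t expected_lba)
{
    if (dif->ref_tag != expected_lba) {
        fprintf(stderr,
                "Failed to compare Ref Tag: LBA=%u, Expected=%x, Actual=%x\n",
                expected_lba, expected_lba, dif->ref_tag);
        return -1;
    }
    return 0;
}

int main(void)
{
    struct dif_tuple t = { .guard = 0, .app_tag = 0, .ref_tag = 0x16 };

    dif_reftag_check(&t, 0x17);  /* prints the LBA=23, Expected=17 case */
    return 0;
}
```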
00:15:52.056 passed 00:15:52.056 Test: dif_sec_4100_md_128_error_test ...[2024-06-07 12:18:15.485107] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 528:spdk_dif_ctx_init: *ERROR*: Zero block size is not allowed and should be a multiple of 4kB 00:15:52.056 [2024-06-07 12:18:15.485265] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 528:spdk_dif_ctx_init: *ERROR*: Zero block size is not allowed and should be a multiple of 4kB 00:15:52.056 passed 00:15:52.056 Test: dif_guard_seed_test ...passed 00:15:52.056 Test: dif_guard_value_test ...passed 00:15:52.056 Test: dif_disable_sec_512_md_8_single_iov_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_0_single_iov_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_0_single_iov_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_0_1_2_4_multi_iovs_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_0_1_2_4_multi_iovs_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_7_multi_iovs_split_data_and_md_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_split_data_and_md_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_7_multi_iovs_split_data_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_split_data_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_7_multi_iovs_split_guard_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_split_guard_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_7_multi_iovs_split_apptag_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_split_apptag_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_7_multi_iovs_split_reftag_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_split_reftag_test ...passed 00:15:52.056 Test: dif_sec_512_md_8_prchk_7_multi_iovs_complex_splits_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_complex_splits_test ...passed 00:15:52.056 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_test ...[2024-06-07 12:18:15.511606] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=7d4c, Actual=fd4c 00:15:52.056 [2024-06-07 12:18:15.512968] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=7e21, Actual=fe21 00:15:52.056 [2024-06-07 12:18:15.514339] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.056 [2024-06-07 12:18:15.515719] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.517098] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=8061 00:15:52.057 [2024-06-07 12:18:15.518464] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=8061 00:15:52.057 [2024-06-07 12:18:15.519823] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=fd4c, Actual=5638 00:15:52.057 [2024-06-07 12:18:15.521159] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=fe21, Actual=f3c5 00:15:52.057 [2024-06-07 12:18:15.522521] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 
826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.057 [2024-06-07 12:18:15.523876] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=3857c660, Actual=38574660 00:15:52.057 [2024-06-07 12:18:15.525259] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.526607] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.527956] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.057 [2024-06-07 12:18:15.529317] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.057 [2024-06-07 12:18:15.530666] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=1ab753ed, Actual=81b34797 00:15:52.057 [2024-06-07 12:18:15.532003] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=38574660, Actual=73323e54 00:15:52.057 [2024-06-07 12:18:15.533381] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.057 [2024-06-07 12:18:15.534737] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=88018a2d4837a266, Actual=88010a2d4837a266 00:15:52.057 [2024-06-07 12:18:15.536078] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.537447] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.538793] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.057 [2024-06-07 12:18:15.540133] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.057 [2024-06-07 12:18:15.541532] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.057 [2024-06-07 12:18:15.542884] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=88010a2d4837a266, Actual=c93f3d06d0e59fa 00:15:52.057 passed 00:15:52.057 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_data_and_md_test ...[2024-06-07 12:18:15.543896] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7d4c, Actual=fd4c 00:15:52.057 [2024-06-07 12:18:15.544187] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7e21, Actual=fe21 00:15:52.057 [2024-06-07 12:18:15.544487] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.544767] 
/home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.545063] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.057 [2024-06-07 12:18:15.545359] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.057 [2024-06-07 12:18:15.545661] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fd4c, Actual=5638 00:15:52.057 [2024-06-07 12:18:15.545937] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fe21, Actual=f3c5 00:15:52.057 [2024-06-07 12:18:15.546220] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.057 [2024-06-07 12:18:15.546511] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=3857c660, Actual=38574660 00:15:52.057 [2024-06-07 12:18:15.546808] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.547085] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.547363] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.547620] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.547906] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab753ed, Actual=81b34797 00:15:52.057 [2024-06-07 12:18:15.548153] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=38574660, Actual=73323e54 00:15:52.057 [2024-06-07 12:18:15.548452] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.057 [2024-06-07 12:18:15.548733] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88018a2d4837a266, Actual=88010a2d4837a266 00:15:52.057 [2024-06-07 12:18:15.549017] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.549298] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.549588] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.549852] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.550156] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, 
Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.057 [2024-06-07 12:18:15.550451] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88010a2d4837a266, Actual=c93f3d06d0e59fa 00:15:52.057 passed 00:15:52.057 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_data_test ...[2024-06-07 12:18:15.550932] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7d4c, Actual=fd4c 00:15:52.057 [2024-06-07 12:18:15.551198] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7e21, Actual=fe21 00:15:52.057 [2024-06-07 12:18:15.551484] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.551761] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.552041] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.057 [2024-06-07 12:18:15.552341] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.057 [2024-06-07 12:18:15.552615] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fd4c, Actual=5638 00:15:52.057 [2024-06-07 12:18:15.552886] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fe21, Actual=f3c5 00:15:52.057 [2024-06-07 12:18:15.553139] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.057 [2024-06-07 12:18:15.553422] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=3857c660, Actual=38574660 00:15:52.057 [2024-06-07 12:18:15.553704] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.554000] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.554292] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.554591] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.554873] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab753ed, Actual=81b34797 00:15:52.057 [2024-06-07 12:18:15.555140] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=38574660, Actual=73323e54 00:15:52.057 [2024-06-07 12:18:15.555449] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.057 [2024-06-07 12:18:15.555721] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88018a2d4837a266, 
Actual=88010a2d4837a266 00:15:52.057 [2024-06-07 12:18:15.555987] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.556280] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.556550] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.556816] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.057 [2024-06-07 12:18:15.557111] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.057 [2024-06-07 12:18:15.557384] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88010a2d4837a266, Actual=c93f3d06d0e59fa 00:15:52.057 passed 00:15:52.057 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_guard_test ...[2024-06-07 12:18:15.557838] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7d4c, Actual=fd4c 00:15:52.057 [2024-06-07 12:18:15.558142] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7e21, Actual=fe21 00:15:52.057 [2024-06-07 12:18:15.558423] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.558682] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.057 [2024-06-07 12:18:15.558979] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.057 [2024-06-07 12:18:15.559255] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.058 [2024-06-07 12:18:15.559522] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fd4c, Actual=5638 00:15:52.058 [2024-06-07 12:18:15.559795] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fe21, Actual=f3c5 00:15:52.058 [2024-06-07 12:18:15.560067] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.058 [2024-06-07 12:18:15.560336] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=3857c660, Actual=38574660 00:15:52.058 [2024-06-07 12:18:15.560629] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.560901] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.561176] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 
12:18:15.561466] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.561714] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab753ed, Actual=81b34797 00:15:52.058 [2024-06-07 12:18:15.561973] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=38574660, Actual=73323e54 00:15:52.058 [2024-06-07 12:18:15.562266] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.058 [2024-06-07 12:18:15.562533] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88018a2d4837a266, Actual=88010a2d4837a266 00:15:52.058 [2024-06-07 12:18:15.562787] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.563065] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.563345] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.563611] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.563895] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.058 [2024-06-07 12:18:15.564176] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88010a2d4837a266, Actual=c93f3d06d0e59fa 00:15:52.058 passed 00:15:52.058 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_apptag_pi_16_test ...[2024-06-07 12:18:15.564654] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7d4c, Actual=fd4c 00:15:52.058 [2024-06-07 12:18:15.564923] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7e21, Actual=fe21 00:15:52.058 [2024-06-07 12:18:15.565179] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.565470] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.565770] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.058 [2024-06-07 12:18:15.566041] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.058 [2024-06-07 12:18:15.566324] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fd4c, Actual=5638 00:15:52.058 [2024-06-07 12:18:15.566588] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fe21, Actual=f3c5 00:15:52.058 passed 
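The *ERROR* lines in the dif suite above are expected output: each case deliberately corrupts the Guard, App Tag, or Ref Tag of a DIF-protected block and asserts that verification reports the mismatch, so a failing compare is the passing condition. A minimal sketch of the CUnit harness shape that produces these Suite/Test/Run Summary blocks follows; the suite, test, and helper names here are illustrative assumptions, not the actual dif_ut source (that lives at test/unit/lib/util/dif.c/dif_ut.c in the tree checked out above).

    /*
     * Hedged sketch, not the real dif_ut source: a CUnit suite of the
     * shape that prints the "Suite: ... Run Summary" blocks in this log.
     */
    #include <stdlib.h>
    #include <CUnit/Basic.h>

    static void
    dif_negative_case(void)
    {
        /*
         * The real cases build a DIF-protected buffer, corrupt one field
         * (Guard, App Tag, or Ref Tag), run the verify helper, and expect
         * a nonzero return; that is what emits the *ERROR* lines above.
         * rc stands in for such a verify result here.
         */
        int rc = -1;

        CU_ASSERT(rc != 0);
    }

    int
    main(void)
    {
        CU_pSuite suite;
        unsigned int num_failures;

        if (CU_initialize_registry() != CUE_SUCCESS) {
            return CU_get_error();
        }

        suite = CU_add_suite("dif", NULL, NULL);
        if (suite == NULL) {
            CU_cleanup_registry();
            return CU_get_error();
        }
        CU_add_test(suite, "dif_negative_case", dif_negative_case);

        CU_basic_set_mode(CU_BRM_VERBOSE);
        CU_basic_run_tests();
        num_failures = CU_get_number_of_failures();
        CU_cleanup_registry();

        return num_failures == 0 ? EXIT_SUCCESS : EXIT_FAILURE;
    }

The same shape applies to the other *_ut binaries in this log (crc32c_ut, crc64_ut, string_ut, iov_ut, and so on); only the registered suites differ.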
00:15:52.058 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_apptag_test ...[2024-06-07 12:18:15.567013] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.058 [2024-06-07 12:18:15.567293] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=3857c660, Actual=38574660 00:15:52.058 [2024-06-07 12:18:15.567587] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.567850] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.568123] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.568406] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.568676] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab753ed, Actual=81b34797 00:15:52.058 [2024-06-07 12:18:15.568941] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=38574660, Actual=73323e54 00:15:52.058 [2024-06-07 12:18:15.569288] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.058 [2024-06-07 12:18:15.569571] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88018a2d4837a266, Actual=88010a2d4837a266 00:15:52.058 [2024-06-07 12:18:15.569827] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.570104] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.570385] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.570653] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.570925] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.058 [2024-06-07 12:18:15.571214] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88010a2d4837a266, Actual=c93f3d06d0e59fa 00:15:52.058 passed 00:15:52.058 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_reftag_pi_16_test ...[2024-06-07 12:18:15.571669] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7d4c, Actual=fd4c 00:15:52.058 [2024-06-07 12:18:15.571931] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=7e21, Actual=fe21 00:15:52.058 [2024-06-07 12:18:15.572185] 
/home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.572469] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.572751] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.058 [2024-06-07 12:18:15.573010] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=8058 00:15:52.058 [2024-06-07 12:18:15.573307] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fd4c, Actual=5638 00:15:52.058 [2024-06-07 12:18:15.573562] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=fe21, Actual=f3c5 00:15:52.058 passed 00:15:52.058 Test: dif_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_reftag_test ...[2024-06-07 12:18:15.574006] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.058 [2024-06-07 12:18:15.574277] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=3857c660, Actual=38574660 00:15:52.058 [2024-06-07 12:18:15.574595] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.574874] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.575139] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.575399] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.575682] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=1ab753ed, Actual=81b34797 00:15:52.058 [2024-06-07 12:18:15.575933] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=38574660, Actual=73323e54 00:15:52.058 [2024-06-07 12:18:15.576286] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.058 [2024-06-07 12:18:15.576572] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88018a2d4837a266, Actual=88010a2d4837a266 00:15:52.058 [2024-06-07 12:18:15.576837] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.577102] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=88, Expected=88, Actual=8088 00:15:52.058 [2024-06-07 12:18:15.577373] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.577646] 
/home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=88, Expected=58, Actual=800000000058 00:15:52.058 [2024-06-07 12:18:15.577951] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.058 [2024-06-07 12:18:15.578236] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=88, Expected=88010a2d4837a266, Actual=c93f3d06d0e59fa 00:15:52.058 passed 00:15:52.058 Test: dif_copy_sec_512_md_8_prchk_0_single_iov ...passed 00:15:52.058 Test: dif_copy_sec_4096_md_128_prchk_0_single_iov_test ...passed 00:15:52.058 Test: dif_copy_sec_512_md_8_prchk_0_1_2_4_multi_iovs ...passed 00:15:52.058 Test: dif_copy_sec_4096_md_128_prchk_0_1_2_4_multi_iovs_test ...passed 00:15:52.058 Test: dif_copy_sec_4096_md_128_prchk_7_multi_iovs ...passed 00:15:52.058 Test: dif_copy_sec_512_md_8_prchk_7_multi_iovs_split_data ...passed 00:15:52.058 Test: dif_copy_sec_4096_md_128_prchk_7_multi_iovs_split_data_test ...passed 00:15:52.058 Test: dif_copy_sec_512_md_8_prchk_7_multi_iovs_complex_splits ...passed 00:15:52.058 Test: dif_copy_sec_4096_md_128_prchk_7_multi_iovs_complex_splits_test ...passed 00:15:52.058 Test: dif_copy_sec_4096_md_128_inject_1_2_4_8_multi_iovs_test ...[2024-06-07 12:18:15.606186] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=7d4c, Actual=fd4c 00:15:52.058 [2024-06-07 12:18:15.607151] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=5002, Actual=d002 00:15:52.058 [2024-06-07 12:18:15.608109] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.609062] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.610043] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=8061 00:15:52.059 [2024-06-07 12:18:15.610994] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=8061 00:15:52.059 [2024-06-07 12:18:15.611938] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=fd4c, Actual=5638 00:15:52.059 [2024-06-07 12:18:15.612883] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=5b17, Actual=56f3 00:15:52.059 [2024-06-07 12:18:15.613841] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.059 [2024-06-07 12:18:15.614823] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=3b14b204, Actual=3b143204 00:15:52.059 [2024-06-07 12:18:15.615777] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.616758] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.617725] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 
776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.618707] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.619690] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=1ab753ed, Actual=81b34797 00:15:52.059 [2024-06-07 12:18:15.620644] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=50d983f, Actual=4e68e00b 00:15:52.059 [2024-06-07 12:18:15.621591] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.059 [2024-06-07 12:18:15.622595] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=27dbff21e882e059, Actual=27db7f21e882e059 00:15:52.059 [2024-06-07 12:18:15.623552] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.624504] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.625457] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.626431] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.627408] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.059 [2024-06-07 12:18:15.628394] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=3cc902869290d092, Actual=b85bfb7bb7a92b0e 00:15:52.059 passed 00:15:52.059 Test: dif_copy_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_test ...[2024-06-07 12:18:15.629122] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=7d4c, Actual=fd4c 00:15:52.059 [2024-06-07 12:18:15.629516] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=99d5, Actual=19d5 00:15:52.059 [2024-06-07 12:18:15.629922] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.630322] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.630723] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=8059 00:15:52.059 [2024-06-07 12:18:15.631141] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=8059 00:15:52.059 [2024-06-07 12:18:15.631523] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=fd4c, Actual=5638 00:15:52.059 [2024-06-07 12:18:15.631901] 
/home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=92c0, Actual=9f24 00:15:52.059 [2024-06-07 12:18:15.632301] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.059 [2024-06-07 12:18:15.632696] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=a5a47a56, Actual=a5a4fa56 00:15:52.059 [2024-06-07 12:18:15.633120] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.633516] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.633901] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.059 [2024-06-07 12:18:15.634340] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.059 [2024-06-07 12:18:15.634709] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=1ab753ed, Actual=81b34797 00:15:52.059 [2024-06-07 12:18:15.635086] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=9bbd506d, Actual=d0d82859 00:15:52.059 [2024-06-07 12:18:15.635516] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.059 [2024-06-07 12:18:15.635894] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=533a1dcc335605a4, Actual=533a9dcc335605a4 00:15:52.059 [2024-06-07 12:18:15.636308] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.636686] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.637079] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.059 [2024-06-07 12:18:15.637473] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.059 [2024-06-07 12:18:15.637872] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.059 [2024-06-07 12:18:15.638306] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=4828e06b4944356f, Actual=ccba19966c7dcef3 00:15:52.059 passed 00:15:52.059 Test: dix_sec_512_md_0_error ...[2024-06-07 12:18:15.638772] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 510:spdk_dif_ctx_init: *ERROR*: Metadata size is smaller than DIF size. 
00:15:52.059 passed 00:15:52.059 Test: dix_sec_512_md_8_prchk_0_single_iov ...passed 00:15:52.059 Test: dix_sec_4096_md_128_prchk_0_single_iov_test ...passed 00:15:52.059 Test: dix_sec_512_md_8_prchk_0_1_2_4_multi_iovs ...passed 00:15:52.059 Test: dix_sec_4096_md_128_prchk_0_1_2_4_multi_iovs_test ...passed 00:15:52.059 Test: dix_sec_4096_md_128_prchk_7_multi_iovs ...passed 00:15:52.059 Test: dix_sec_512_md_8_prchk_7_multi_iovs_split_data ...passed 00:15:52.059 Test: dix_sec_4096_md_128_prchk_7_multi_iovs_split_data_test ...passed 00:15:52.059 Test: dix_sec_512_md_8_prchk_7_multi_iovs_complex_splits ...passed 00:15:52.059 Test: dix_sec_4096_md_128_prchk_7_multi_iovs_complex_splits_test ...passed 00:15:52.059 Test: dix_sec_4096_md_128_inject_1_2_4_8_multi_iovs_test ...[2024-06-07 12:18:15.664180] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=7d4c, Actual=fd4c 00:15:52.059 [2024-06-07 12:18:15.664977] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=5002, Actual=d002 00:15:52.059 [2024-06-07 12:18:15.665792] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.666579] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.667385] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=8061 00:15:52.059 [2024-06-07 12:18:15.668158] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=8061 00:15:52.059 [2024-06-07 12:18:15.668920] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=fd4c, Actual=5638 00:15:52.059 [2024-06-07 12:18:15.669710] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=5b17, Actual=56f3 00:15:52.059 [2024-06-07 12:18:15.670492] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.059 [2024-06-07 12:18:15.671282] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=3b14b204, Actual=3b143204 00:15:52.059 [2024-06-07 12:18:15.672064] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.672858] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.673658] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.674462] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.675248] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=1ab753ed, Actual=81b34797 00:15:52.059 [2024-06-07 12:18:15.676002] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare 
Guard: LBA=97, Expected=50d983f, Actual=4e68e00b 00:15:52.059 [2024-06-07 12:18:15.676802] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.059 [2024-06-07 12:18:15.677580] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=27dbff21e882e059, Actual=27db7f21e882e059 00:15:52.059 [2024-06-07 12:18:15.678369] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.679116] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=97, Expected=88, Actual=8088 00:15:52.059 [2024-06-07 12:18:15.679895] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.680656] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=97, Expected=61, Actual=800000000061 00:15:52.059 [2024-06-07 12:18:15.681458] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.059 [2024-06-07 12:18:15.682245] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=97, Expected=3cc902869290d092, Actual=b85bfb7bb7a92b0e 00:15:52.059 passed 00:15:52.059 Test: dix_sec_4096_md_128_inject_1_2_4_8_multi_iovs_split_test ...[2024-06-07 12:18:15.682791] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=7d4c, Actual=fd4c 00:15:52.060 [2024-06-07 12:18:15.683065] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=99d5, Actual=19d5 00:15:52.060 [2024-06-07 12:18:15.683354] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.060 [2024-06-07 12:18:15.683635] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.060 [2024-06-07 12:18:15.683919] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=8059 00:15:52.060 [2024-06-07 12:18:15.684185] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=8059 00:15:52.060 [2024-06-07 12:18:15.684480] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=fd4c, Actual=5638 00:15:52.060 [2024-06-07 12:18:15.684743] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=92c0, Actual=9f24 00:15:52.060 [2024-06-07 12:18:15.685024] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=1ab7d3ed, Actual=1ab753ed 00:15:52.060 [2024-06-07 12:18:15.685294] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=a5a47a56, Actual=a5a4fa56 00:15:52.060 [2024-06-07 12:18:15.685581] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, 
Expected=88, Actual=8088 00:15:52.060 [2024-06-07 12:18:15.685860] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.060 [2024-06-07 12:18:15.686138] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.060 [2024-06-07 12:18:15.686429] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.060 [2024-06-07 12:18:15.686705] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=1ab753ed, Actual=81b34797 00:15:52.060 [2024-06-07 12:18:15.686987] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=9bbd506d, Actual=d0d82859 00:15:52.060 [2024-06-07 12:18:15.687290] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=a57627728ecc20d3, Actual=a576a7728ecc20d3 00:15:52.060 [2024-06-07 12:18:15.687566] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=533a1dcc335605a4, Actual=533a9dcc335605a4 00:15:52.060 [2024-06-07 12:18:15.687827] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.060 [2024-06-07 12:18:15.688108] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=89, Expected=88, Actual=8088 00:15:52.060 [2024-06-07 12:18:15.688383] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.060 [2024-06-07 12:18:15.688649] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=89, Expected=59, Actual=800000000059 00:15:52.060 [2024-06-07 12:18:15.688906] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=a576a7728ecc20d3, Actual=5d7db9aba42d9da4 00:15:52.060 [2024-06-07 12:18:15.689180] /home/vagrant/spdk_repo/spdk/lib/util/dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=89, Expected=4828e06b4944356f, Actual=ccba19966c7dcef3 00:15:52.060 passed 00:15:52.060 Test: set_md_interleave_iovs_test ...passed 00:15:52.060 Test: set_md_interleave_iovs_split_test ...passed 00:15:52.060 Test: dif_generate_stream_pi_16_test ...passed 00:15:52.060 Test: dif_generate_stream_test ...passed 00:15:52.060 Test: set_md_interleave_iovs_alignment_test ...[2024-06-07 12:18:15.693866] /home/vagrant/spdk_repo/spdk/lib/util/dif.c:1822:spdk_dif_set_md_interleave_iovs: *ERROR*: Buffer overflow will occur. 
00:15:52.060 passed 00:15:52.060 Test: dif_generate_split_test ...passed 00:15:52.060 Test: set_md_interleave_iovs_multi_segments_test ...passed 00:15:52.060 Test: dif_verify_split_test ...passed 00:15:52.318 Test: dif_verify_stream_multi_segments_test ...passed 00:15:52.318 Test: update_crc32c_pi_16_test ...passed 00:15:52.318 Test: update_crc32c_test ...passed 00:15:52.318 Test: dif_update_crc32c_split_test ...passed 00:15:52.318 Test: dif_update_crc32c_stream_multi_segments_test ...passed 00:15:52.318 Test: get_range_with_md_test ...passed 00:15:52.318 Test: dif_sec_512_md_8_prchk_7_multi_iovs_remap_pi_16_test ...passed 00:15:52.318 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_remap_test ...passed 00:15:52.318 Test: dif_sec_4096_md_128_prchk_7_multi_iovs_complex_splits_remap_test ...passed 00:15:52.318 Test: dix_sec_4096_md_128_prchk_7_multi_iovs_remap ...passed 00:15:52.318 Test: dix_sec_512_md_8_prchk_7_multi_iovs_complex_splits_remap_pi_16_test ...passed 00:15:52.318 Test: dix_sec_4096_md_128_prchk_7_multi_iovs_complex_splits_remap_test ...passed 00:15:52.318 Test: dif_generate_and_verify_unmap_test ...passed 00:15:52.318 00:15:52.318 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.318 suites 1 1 n/a 0 0 00:15:52.318 tests 79 79 79 0 0 00:15:52.318 asserts 3584 3584 3584 0 n/a 00:15:52.318 00:15:52.318 Elapsed time = 0.202 seconds 00:15:52.318 12:18:15 unittest.unittest_util -- unit/unittest.sh@143 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/iov.c/iov_ut 00:15:52.318 00:15:52.318 00:15:52.318 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.318 http://cunit.sourceforge.net/ 00:15:52.318 00:15:52.318 00:15:52.318 Suite: iov 00:15:52.318 Test: test_single_iov ...passed 00:15:52.318 Test: test_simple_iov ...passed 00:15:52.318 Test: test_complex_iov ...passed 00:15:52.318 Test: test_iovs_to_buf ...passed 00:15:52.318 Test: test_buf_to_iovs ...passed 00:15:52.318 Test: test_memset ...passed 00:15:52.318 Test: test_iov_one ...passed 00:15:52.318 Test: test_iov_xfer ...passed 00:15:52.318 00:15:52.318 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.318 suites 1 1 n/a 0 0 00:15:52.318 tests 8 8 8 0 0 00:15:52.318 asserts 156 156 156 0 n/a 00:15:52.318 00:15:52.318 Elapsed time = 0.000 seconds 00:15:52.318 12:18:15 unittest.unittest_util -- unit/unittest.sh@144 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/math.c/math_ut 00:15:52.318 00:15:52.318 00:15:52.318 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.318 http://cunit.sourceforge.net/ 00:15:52.318 00:15:52.318 00:15:52.318 Suite: math 00:15:52.318 Test: test_serial_number_arithmetic ...passed 00:15:52.318 Suite: erase 00:15:52.318 Test: test_memset_s ...passed 00:15:52.318 00:15:52.318 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.318 suites 2 2 n/a 0 0 00:15:52.318 tests 2 2 2 0 0 00:15:52.318 asserts 18 18 18 0 n/a 00:15:52.318 00:15:52.318 Elapsed time = 0.000 seconds 00:15:52.318 12:18:15 unittest.unittest_util -- unit/unittest.sh@145 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/pipe.c/pipe_ut 00:15:52.318 00:15:52.318 00:15:52.318 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.318 http://cunit.sourceforge.net/ 00:15:52.318 00:15:52.318 00:15:52.319 Suite: pipe 00:15:52.319 Test: test_create_destroy ...passed 00:15:52.319 Test: test_write_get_buffer ...passed 00:15:52.319 Test: test_write_advance ...passed 00:15:52.319 Test: test_read_get_buffer ...passed 00:15:52.319 Test: test_read_advance ...passed 00:15:52.319 Test: 
test_data ...passed 00:15:52.319 00:15:52.319 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.319 suites 1 1 n/a 0 0 00:15:52.319 tests 6 6 6 0 0 00:15:52.319 asserts 251 251 251 0 n/a 00:15:52.319 00:15:52.319 Elapsed time = 0.000 seconds 00:15:52.319 12:18:15 unittest.unittest_util -- unit/unittest.sh@146 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/util/xor.c/xor_ut 00:15:52.319 00:15:52.319 00:15:52.319 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.319 http://cunit.sourceforge.net/ 00:15:52.319 00:15:52.319 00:15:52.319 Suite: xor 00:15:52.319 Test: test_xor_gen ...passed 00:15:52.319 00:15:52.319 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.319 suites 1 1 n/a 0 0 00:15:52.319 tests 1 1 1 0 0 00:15:52.319 asserts 17 17 17 0 n/a 00:15:52.319 00:15:52.319 Elapsed time = 0.004 seconds 00:15:52.319 00:15:52.319 real 0m0.680s 00:15:52.319 user 0m0.407s 00:15:52.319 sys 0m0.212s 00:15:52.319 12:18:15 unittest.unittest_util -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:52.319 12:18:15 unittest.unittest_util -- common/autotest_common.sh@10 -- # set +x 00:15:52.319 ************************************ 00:15:52.319 END TEST unittest_util 00:15:52.319 ************************************ 00:15:52.319 12:18:15 unittest -- unit/unittest.sh@284 -- # grep -q '#define SPDK_CONFIG_VHOST 1' /home/vagrant/spdk_repo/spdk/include/spdk/config.h 00:15:52.319 12:18:15 unittest -- unit/unittest.sh@285 -- # run_test unittest_vhost /home/vagrant/spdk_repo/spdk/test/unit/lib/vhost/vhost.c/vhost_ut 00:15:52.319 12:18:15 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:52.319 12:18:15 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:52.319 12:18:15 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:52.319 ************************************ 00:15:52.319 START TEST unittest_vhost 00:15:52.319 ************************************ 00:15:52.319 12:18:15 unittest.unittest_vhost -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/vhost/vhost.c/vhost_ut 00:15:52.577 00:15:52.577 00:15:52.577 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.577 http://cunit.sourceforge.net/ 00:15:52.577 00:15:52.577 00:15:52.577 Suite: vhost_suite 00:15:52.577 Test: desc_to_iov_test ...[2024-06-07 12:18:15.964980] /home/vagrant/spdk_repo/spdk/lib/vhost/rte_vhost_user.c: 620:vhost_vring_desc_payload_to_iov: *ERROR*: SPDK_VHOST_IOVS_MAX(129) reached 00:15:52.577 passed 00:15:52.577 Test: create_controller_test ...[2024-06-07 12:18:15.967900] /home/vagrant/spdk_repo/spdk/lib/vhost/vhost.c: 80:vhost_parse_core_mask: *ERROR*: one of selected cpu is outside of core mask(=f) 00:15:52.577 [2024-06-07 12:18:15.968089] /home/vagrant/spdk_repo/spdk/lib/vhost/vhost.c: 126:vhost_dev_register: *ERROR*: cpumask 0xf0 is invalid (core mask is 0xf) 00:15:52.577 [2024-06-07 12:18:15.968269] /home/vagrant/spdk_repo/spdk/lib/vhost/vhost.c: 80:vhost_parse_core_mask: *ERROR*: one of selected cpu is outside of core mask(=f) 00:15:52.577 [2024-06-07 12:18:15.968434] /home/vagrant/spdk_repo/spdk/lib/vhost/vhost.c: 126:vhost_dev_register: *ERROR*: cpumask 0xff is invalid (core mask is 0xf) 00:15:52.577 [2024-06-07 12:18:15.968574] /home/vagrant/spdk_repo/spdk/lib/vhost/vhost.c: 121:vhost_dev_register: *ERROR*: Can't register controller with no name 00:15:52.578 [2024-06-07 12:18:15.968997] /home/vagrant/spdk_repo/spdk/lib/vhost/rte_vhost_user.c:1781:vhost_user_dev_init: *ERROR*: Resulting socket path for 
controller is too long: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx[... long run of 'x' characters from the deliberately overlong controller name elided for readability ...] 00:15:52.578 [2024-06-07 12:18:15.969698] /home/vagrant/spdk_repo/spdk/lib/vhost/vhost.c: 137:vhost_dev_register: *ERROR*: vhost controller vdev_name_0 already exists. 00:15:52.578 passed 00:15:52.578 Test: session_find_by_vid_test ...passed 00:15:52.578 Test: remove_controller_test ...[2024-06-07 12:18:15.971409] /home/vagrant/spdk_repo/spdk/lib/vhost/rte_vhost_user.c:1866:vhost_user_dev_unregister: *ERROR*: Controller vdev_name_0 has still valid connection. 00:15:52.578 passed 00:15:52.578 Test: vq_avail_ring_get_test ...passed 00:15:52.578 Test: vq_packed_ring_test ...passed 00:15:52.578 Test: vhost_blk_construct_test ...passed 00:15:52.578 00:15:52.578 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.578 suites 1 1 n/a 0 0 00:15:52.578 tests 7 7 7 0 0 00:15:52.578 asserts 147 147 147 0 n/a 00:15:52.578 00:15:52.578 Elapsed time = 0.008 seconds 00:15:52.578 00:15:52.578 real 0m0.045s 00:15:52.578 user 0m0.020s 00:15:52.578 sys 0m0.022s 00:15:52.578 12:18:15 unittest.unittest_vhost -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:52.578 12:18:15 unittest.unittest_vhost -- common/autotest_common.sh@10 -- # set +x 00:15:52.578 ************************************ 00:15:52.578 END TEST unittest_vhost 00:15:52.578 ************************************ 00:15:52.578 12:18:16 unittest -- unit/unittest.sh@287 -- # run_test unittest_dma /home/vagrant/spdk_repo/spdk/test/unit/lib/dma/dma.c/dma_ut 00:15:52.578 12:18:16 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:52.578 12:18:16 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:52.578 12:18:16 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:52.578 ************************************ 00:15:52.578 START TEST unittest_dma 00:15:52.578 ************************************ 00:15:52.578 12:18:16 unittest.unittest_dma -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/dma/dma.c/dma_ut 00:15:52.578 00:15:52.578 00:15:52.578 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.578 http://cunit.sourceforge.net/ 00:15:52.578 00:15:52.578 00:15:52.578 Suite: dma_suite 00:15:52.578 Test: test_dma ...[2024-06-07 12:18:16.075221] /home/vagrant/spdk_repo/spdk/lib/dma/dma.c: 56:spdk_memory_domain_create: *ERROR*: Context size can't be 0 00:15:52.578 passed 00:15:52.578 00:15:52.578 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.578 suites 1 1 n/a 0 0 00:15:52.578 tests 1 1 1 0 0 00:15:52.578 asserts 54 54 54 0 n/a 00:15:52.578 00:15:52.578 Elapsed time = 0.001 seconds 00:15:52.578 00:15:52.578 real 0m0.029s 00:15:52.578 user 0m0.011s 00:15:52.578 sys 0m0.018s 00:15:52.578 12:18:16 unittest.unittest_dma -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:52.578 12:18:16 unittest.unittest_dma -- common/autotest_common.sh@10 -- # set +x 00:15:52.578 ************************************ 00:15:52.578 END TEST unittest_dma 00:15:52.578 ************************************
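
The test_dma case above exercises the zero-size guard in lib/dma/dma.c (the "Context size can't be 0" error). A minimal sketch of the call it drives, against the public spdk/dma.h API; the -EINVAL return value and the field layout of struct spdk_memory_domain_ctx are assumptions based on this revision, not quoted from the log:

#include "spdk/dma.h"
#include <assert.h>
#include <errno.h>

static void
test_zero_ctx_size(void)
{
	struct spdk_memory_domain *domain = NULL;
	struct spdk_memory_domain_ctx ctx = { .size = 0 };	/* deliberately invalid */

	/* The library must refuse a context that advertises size 0. */
	int rc = spdk_memory_domain_create(&domain, SPDK_DMA_DEVICE_TYPE_DMA,
					   &ctx, "test_dma");
	assert(rc == -EINVAL);
	assert(domain == NULL);	/* nothing was created on the failure path */
}
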
00:15:52.578 12:18:16 unittest -- unit/unittest.sh@289 -- # run_test unittest_init unittest_init 00:15:52.578 12:18:16 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:52.578 12:18:16 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:52.578 12:18:16 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:52.578 ************************************ 00:15:52.578 START TEST unittest_init 00:15:52.578 ************************************ 00:15:52.578 12:18:16 unittest.unittest_init -- common/autotest_common.sh@1124 -- # unittest_init 00:15:52.578 12:18:16 unittest.unittest_init -- unit/unittest.sh@150 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/init/subsystem.c/subsystem_ut 00:15:52.578 00:15:52.578 00:15:52.578 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.578 http://cunit.sourceforge.net/ 00:15:52.578 00:15:52.578 00:15:52.578 Suite: subsystem_suite 00:15:52.578 Test: subsystem_sort_test_depends_on_single ...passed 00:15:52.578 Test: subsystem_sort_test_depends_on_multiple ...passed 00:15:52.578 Test: subsystem_sort_test_missing_dependency ...[2024-06-07 12:18:16.170588] /home/vagrant/spdk_repo/spdk/lib/init/subsystem.c: 196:spdk_subsystem_init: *ERROR*: subsystem A dependency B is missing 00:15:52.578 [2024-06-07 12:18:16.171507] /home/vagrant/spdk_repo/spdk/lib/init/subsystem.c: 191:spdk_subsystem_init: *ERROR*: subsystem C is missing 00:15:52.578 passed 00:15:52.578 00:15:52.578 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.578 suites 1 1 n/a 0 0 00:15:52.578 tests 3 3 3 0 0 00:15:52.578 asserts 20 20 20 0 n/a 00:15:52.578 00:15:52.578 Elapsed time = 0.001 seconds 00:15:52.578 00:15:52.578 real 0m0.041s 00:15:52.578 user 0m0.024s 00:15:52.578 sys 0m0.016s 00:15:52.578 12:18:16 unittest.unittest_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:52.578 12:18:16 unittest.unittest_init -- common/autotest_common.sh@10 -- # set +x 00:15:52.578 ************************************ 00:15:52.578 END TEST unittest_init 00:15:52.578 ************************************ 00:15:52.836 12:18:16 unittest -- unit/unittest.sh@290 -- # run_test unittest_keyring /home/vagrant/spdk_repo/spdk/test/unit/lib/keyring/keyring.c/keyring_ut 00:15:52.836 12:18:16 unittest -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:15:52.836 12:18:16 unittest -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:52.836 12:18:16 unittest -- common/autotest_common.sh@10 -- # set +x 00:15:52.836 ************************************ 00:15:52.836 START TEST unittest_keyring 00:15:52.836 ************************************ 00:15:52.836 12:18:16 unittest.unittest_keyring -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/unit/lib/keyring/keyring.c/keyring_ut 00:15:52.836 00:15:52.836 00:15:52.836 CUnit - A unit testing framework for C - Version 2.1-3 00:15:52.836 http://cunit.sourceforge.net/ 00:15:52.836 00:15:52.836 00:15:52.836 Suite: keyring 00:15:52.836 Test: test_keyring_add_remove ...[2024-06-07 12:18:16.278674] /home/vagrant/spdk_repo/spdk/lib/keyring/keyring.c: 107:spdk_keyring_add_key: *ERROR*: Key 'key0' already exists 00:15:52.836 [2024-06-07 12:18:16.279172] /home/vagrant/spdk_repo/spdk/lib/keyring/keyring.c: 107:spdk_keyring_add_key: *ERROR*: Key ':key0' already exists 00:15:52.836 [2024-06-07 12:18:16.279400] /home/vagrant/spdk_repo/spdk/lib/keyring/keyring.c: 126:spdk_keyring_add_key: *ERROR*: Failed to add key 'key0' to the keyring 00:15:52.836 passed 00:15:52.836 Test: 
test_keyring_get_put ...passed 00:15:52.836 00:15:52.836 Run Summary: Type Total Ran Passed Failed Inactive 00:15:52.836 suites 1 1 n/a 0 0 00:15:52.836 tests 2 2 2 0 0 00:15:52.836 asserts 44 44 44 0 n/a 00:15:52.836 00:15:52.836 Elapsed time = 0.001 seconds 00:15:52.836 00:15:52.836 real 0m0.034s 00:15:52.836 user 0m0.015s 00:15:52.836 sys 0m0.018s 00:15:52.836 12:18:16 unittest.unittest_keyring -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:52.836 12:18:16 unittest.unittest_keyring -- common/autotest_common.sh@10 -- # set +x 00:15:52.836 ************************************ 00:15:52.836 END TEST unittest_keyring 00:15:52.836 ************************************ 00:15:52.836 12:18:16 unittest -- unit/unittest.sh@292 -- # '[' yes = yes ']' 00:15:52.836 12:18:16 unittest -- unit/unittest.sh@292 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:15:52.836 12:18:16 unittest -- unit/unittest.sh@293 -- # hostname 00:15:52.836 12:18:16 unittest -- unit/unittest.sh@293 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -d . -c -t rocky9-cloud-1711172311-2200 -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_test.info 00:15:53.131 geninfo: WARNING: invalid characters removed from testname! 00:16:25.201 12:18:47 unittest -- unit/unittest.sh@294 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_base.info -a /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_test.info -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_total.info 00:16:31.826 12:18:54 unittest -- unit/unittest.sh@295 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_total.info -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info 00:16:36.002 12:18:59 unittest -- unit/unittest.sh@296 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info '/home/vagrant/spdk_repo/spdk/app/*' -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info 00:16:38.535 12:19:01 unittest -- unit/unittest.sh@297 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info '/home/vagrant/spdk_repo/spdk/dpdk/*' -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info 00:16:41.064 12:19:04 unittest -- unit/unittest.sh@298 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info '/home/vagrant/spdk_repo/spdk/examples/*' -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info 00:16:44.350 12:19:07 unittest 
-- unit/unittest.sh@299 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info '/home/vagrant/spdk_repo/spdk/lib/vhost/rte_vhost/*' -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info 00:16:46.930 12:19:10 unittest -- unit/unittest.sh@300 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info '/home/vagrant/spdk_repo/spdk/test/*' -o /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info 00:16:49.497 12:19:12 unittest -- unit/unittest.sh@301 -- # rm -f /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_base.info /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_test.info 00:16:49.497 12:19:12 unittest -- unit/unittest.sh@302 -- # genhtml /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info --output-directory /home/vagrant/spdk_repo/spdk/../output/ut_coverage 00:16:49.755 Reading data file /home/vagrant/spdk_repo/spdk/../output/ut_coverage/ut_cov_unit.info 00:16:49.756 Found 320 entries. 00:16:49.756 Found common filename prefix "/home/vagrant/spdk_repo/spdk" 00:16:49.756 Writing .css and .png files. 00:16:49.756 Generating output. 00:16:49.756 Processing file include/linux/virtio_ring.h 00:16:50.013 Processing file include/spdk/bdev_module.h 00:16:50.013 Processing file include/spdk/base64.h 00:16:50.013 Processing file include/spdk/endian.h 00:16:50.014 Processing file include/spdk/mmio.h 00:16:50.014 Processing file include/spdk/util.h 00:16:50.014 Processing file include/spdk/nvmf_transport.h 00:16:50.014 Processing file include/spdk/thread.h 00:16:50.014 Processing file include/spdk/trace.h 00:16:50.014 Processing file include/spdk/nvme_spec.h 00:16:50.014 Processing file include/spdk/nvme.h 00:16:50.014 Processing file include/spdk/histogram_data.h 00:16:50.014 Processing file include/spdk_internal/utf.h 00:16:50.014 Processing file include/spdk_internal/sgl.h 00:16:50.014 Processing file include/spdk_internal/nvme_tcp.h 00:16:50.014 Processing file include/spdk_internal/virtio.h 00:16:50.014 Processing file include/spdk_internal/sock.h 00:16:50.014 Processing file include/spdk_internal/rdma.h 00:16:50.272 Processing file lib/accel/accel_sw.c 00:16:50.272 Processing file lib/accel/accel_rpc.c 00:16:50.272 Processing file lib/accel/accel.c 00:16:50.532 Processing file lib/bdev/bdev_zone.c 00:16:50.532 Processing file lib/bdev/bdev.c 00:16:50.532 Processing file lib/bdev/part.c 00:16:50.532 Processing file lib/bdev/bdev_rpc.c 00:16:50.532 Processing file lib/bdev/scsi_nvme.c 00:16:50.791 Processing file lib/blob/blobstore.h 00:16:50.791 Processing file lib/blob/blobstore.c 00:16:50.791 Processing file lib/blob/zeroes.c 00:16:50.791 Processing file lib/blob/blob_bs_dev.c 00:16:50.791 Processing file lib/blob/request.c 00:16:50.791 Processing file lib/blobfs/blobfs.c 00:16:50.791 Processing file lib/blobfs/tree.c 00:16:51.049 Processing file lib/conf/conf.c 00:16:51.049 Processing file lib/dma/dma.c 00:16:51.306 Processing file lib/env_dpdk/init.c 00:16:51.306 Processing file lib/env_dpdk/pci_idxd.c 00:16:51.306 Processing file lib/env_dpdk/pci_virtio.c 00:16:51.306 Processing file lib/env_dpdk/pci_dpdk_2207.c 
00:16:51.306 Processing file lib/env_dpdk/pci_ioat.c 00:16:51.306 Processing file lib/env_dpdk/pci.c 00:16:51.306 Processing file lib/env_dpdk/env.c 00:16:51.306 Processing file lib/env_dpdk/memory.c 00:16:51.306 Processing file lib/env_dpdk/pci_dpdk_2211.c 00:16:51.306 Processing file lib/env_dpdk/sigbus_handler.c 00:16:51.306 Processing file lib/env_dpdk/threads.c 00:16:51.306 Processing file lib/env_dpdk/pci_event.c 00:16:51.306 Processing file lib/env_dpdk/pci_dpdk.c 00:16:51.306 Processing file lib/env_dpdk/pci_vmd.c 00:16:51.306 Processing file lib/event/reactor.c 00:16:51.306 Processing file lib/event/log_rpc.c 00:16:51.306 Processing file lib/event/scheduler_static.c 00:16:51.306 Processing file lib/event/app_rpc.c 00:16:51.306 Processing file lib/event/app.c 00:16:51.873 Processing file lib/ftl/ftl_sb.c 00:16:51.873 Processing file lib/ftl/ftl_io.h 00:16:51.873 Processing file lib/ftl/ftl_io.c 00:16:51.873 Processing file lib/ftl/ftl_p2l.c 00:16:51.873 Processing file lib/ftl/ftl_band_ops.c 00:16:51.873 Processing file lib/ftl/ftl_rq.c 00:16:51.873 Processing file lib/ftl/ftl_core.c 00:16:51.873 Processing file lib/ftl/ftl_band.c 00:16:51.873 Processing file lib/ftl/ftl_writer.c 00:16:51.873 Processing file lib/ftl/ftl_l2p_cache.c 00:16:51.873 Processing file lib/ftl/ftl_debug.h 00:16:51.873 Processing file lib/ftl/ftl_nv_cache.h 00:16:51.873 Processing file lib/ftl/ftl_debug.c 00:16:51.873 Processing file lib/ftl/ftl_reloc.c 00:16:51.873 Processing file lib/ftl/ftl_l2p.c 00:16:51.873 Processing file lib/ftl/ftl_writer.h 00:16:51.873 Processing file lib/ftl/ftl_l2p_flat.c 00:16:51.873 Processing file lib/ftl/ftl_nv_cache_io.h 00:16:51.873 Processing file lib/ftl/ftl_band.h 00:16:51.873 Processing file lib/ftl/ftl_core.h 00:16:51.873 Processing file lib/ftl/ftl_init.c 00:16:51.873 Processing file lib/ftl/ftl_nv_cache.c 00:16:51.873 Processing file lib/ftl/ftl_layout.c 00:16:51.873 Processing file lib/ftl/ftl_trace.c 00:16:51.873 Processing file lib/ftl/base/ftl_base_bdev.c 00:16:51.873 Processing file lib/ftl/base/ftl_base_dev.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_self_test.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_p2l.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_band.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_misc.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_upgrade.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_recovery.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_startup.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_ioch.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_bdev.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_shutdown.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_md.c 00:16:52.131 Processing file lib/ftl/mngt/ftl_mngt_l2p.c 00:16:52.131 Processing file lib/ftl/nvc/ftl_nvc_dev.c 00:16:52.131 Processing file lib/ftl/nvc/ftl_nvc_bdev_vss.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_band_upgrade.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_chunk_upgrade.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_sb_upgrade.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_p2l_upgrade.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_sb_v5.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_layout_upgrade.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_trim_upgrade.c 00:16:52.390 Processing file lib/ftl/upgrade/ftl_sb_v3.c 00:16:52.648 Processing file lib/ftl/utils/ftl_md.c 00:16:52.648 Processing file 
lib/ftl/utils/ftl_conf.c 00:16:52.648 Processing file lib/ftl/utils/ftl_layout_tracker_bdev.c 00:16:52.648 Processing file lib/ftl/utils/ftl_addr_utils.h 00:16:52.648 Processing file lib/ftl/utils/ftl_property.c 00:16:52.648 Processing file lib/ftl/utils/ftl_bitmap.c 00:16:52.648 Processing file lib/ftl/utils/ftl_property.h 00:16:52.648 Processing file lib/ftl/utils/ftl_mempool.c 00:16:52.648 Processing file lib/ftl/utils/ftl_df.h 00:16:52.648 Processing file lib/idxd/idxd.c 00:16:52.648 Processing file lib/idxd/idxd_user.c 00:16:52.648 Processing file lib/idxd/idxd_internal.h 00:16:52.905 Processing file lib/init/json_config.c 00:16:52.905 Processing file lib/init/subsystem_rpc.c 00:16:52.905 Processing file lib/init/rpc.c 00:16:52.905 Processing file lib/init/subsystem.c 00:16:52.905 Processing file lib/ioat/ioat.c 00:16:52.905 Processing file lib/ioat/ioat_internal.h 00:16:53.471 Processing file lib/iscsi/conn.c 00:16:53.471 Processing file lib/iscsi/iscsi.h 00:16:53.471 Processing file lib/iscsi/portal_grp.c 00:16:53.471 Processing file lib/iscsi/tgt_node.c 00:16:53.471 Processing file lib/iscsi/iscsi.c 00:16:53.471 Processing file lib/iscsi/md5.c 00:16:53.471 Processing file lib/iscsi/task.c 00:16:53.471 Processing file lib/iscsi/init_grp.c 00:16:53.471 Processing file lib/iscsi/task.h 00:16:53.471 Processing file lib/iscsi/param.c 00:16:53.471 Processing file lib/iscsi/iscsi_subsystem.c 00:16:53.471 Processing file lib/iscsi/iscsi_rpc.c 00:16:53.471 Processing file lib/json/json_util.c 00:16:53.471 Processing file lib/json/json_parse.c 00:16:53.471 Processing file lib/json/json_write.c 00:16:53.728 Processing file lib/jsonrpc/jsonrpc_server_tcp.c 00:16:53.728 Processing file lib/jsonrpc/jsonrpc_server.c 00:16:53.728 Processing file lib/jsonrpc/jsonrpc_client_tcp.c 00:16:53.728 Processing file lib/jsonrpc/jsonrpc_client.c 00:16:53.728 Processing file lib/keyring/keyring.c 00:16:53.728 Processing file lib/keyring/keyring_rpc.c 00:16:53.728 Processing file lib/log/log.c 00:16:53.728 Processing file lib/log/log_deprecated.c 00:16:53.728 Processing file lib/log/log_flags.c 00:16:53.728 Processing file lib/lvol/lvol.c 00:16:53.985 Processing file lib/nbd/nbd.c 00:16:53.985 Processing file lib/nbd/nbd_rpc.c 00:16:53.985 Processing file lib/notify/notify_rpc.c 00:16:53.985 Processing file lib/notify/notify.c 00:16:54.917 Processing file lib/nvme/nvme_rdma.c 00:16:54.917 Processing file lib/nvme/nvme_ns_cmd.c 00:16:54.917 Processing file lib/nvme/nvme_discovery.c 00:16:54.917 Processing file lib/nvme/nvme_cuse.c 00:16:54.917 Processing file lib/nvme/nvme_internal.h 00:16:54.917 Processing file lib/nvme/nvme_fabric.c 00:16:54.917 Processing file lib/nvme/nvme_tcp.c 00:16:54.917 Processing file lib/nvme/nvme_ctrlr_cmd.c 00:16:54.917 Processing file lib/nvme/nvme_io_msg.c 00:16:54.917 Processing file lib/nvme/nvme_pcie.c 00:16:54.917 Processing file lib/nvme/nvme.c 00:16:54.917 Processing file lib/nvme/nvme_ns.c 00:16:54.917 Processing file lib/nvme/nvme_qpair.c 00:16:54.917 Processing file lib/nvme/nvme_zns.c 00:16:54.917 Processing file lib/nvme/nvme_pcie_common.c 00:16:54.917 Processing file lib/nvme/nvme_opal.c 00:16:54.917 Processing file lib/nvme/nvme_auth.c 00:16:54.917 Processing file lib/nvme/nvme_poll_group.c 00:16:54.917 Processing file lib/nvme/nvme_transport.c 00:16:54.917 Processing file lib/nvme/nvme_quirks.c 00:16:54.917 Processing file lib/nvme/nvme_ns_ocssd_cmd.c 00:16:54.917 Processing file lib/nvme/nvme_ctrlr.c 00:16:54.917 Processing file lib/nvme/nvme_pcie_internal.h 
00:16:54.917 Processing file lib/nvme/nvme_ctrlr_ocssd_cmd.c 00:16:55.482 Processing file lib/nvmf/nvmf.c 00:16:55.482 Processing file lib/nvmf/nvmf_rpc.c 00:16:55.482 Processing file lib/nvmf/transport.c 00:16:55.482 Processing file lib/nvmf/ctrlr.c 00:16:55.482 Processing file lib/nvmf/ctrlr_discovery.c 00:16:55.482 Processing file lib/nvmf/tcp.c 00:16:55.482 Processing file lib/nvmf/auth.c 00:16:55.482 Processing file lib/nvmf/rdma.c 00:16:55.482 Processing file lib/nvmf/nvmf_internal.h 00:16:55.482 Processing file lib/nvmf/ctrlr_bdev.c 00:16:55.483 Processing file lib/nvmf/subsystem.c 00:16:55.483 Processing file lib/rdma/rdma_verbs.c 00:16:55.483 Processing file lib/rdma/common.c 00:16:55.740 Processing file lib/rpc/rpc.c 00:16:55.740 Processing file lib/scsi/port.c 00:16:55.740 Processing file lib/scsi/scsi_rpc.c 00:16:55.740 Processing file lib/scsi/lun.c 00:16:55.740 Processing file lib/scsi/dev.c 00:16:55.740 Processing file lib/scsi/scsi_bdev.c 00:16:55.740 Processing file lib/scsi/task.c 00:16:55.740 Processing file lib/scsi/scsi.c 00:16:55.740 Processing file lib/scsi/scsi_pr.c 00:16:55.999 Processing file lib/sock/sock_rpc.c 00:16:55.999 Processing file lib/sock/sock.c 00:16:55.999 Processing file lib/thread/iobuf.c 00:16:55.999 Processing file lib/thread/thread.c 00:16:56.257 Processing file lib/trace/trace_rpc.c 00:16:56.257 Processing file lib/trace/trace.c 00:16:56.257 Processing file lib/trace/trace_flags.c 00:16:56.257 Processing file lib/trace_parser/trace.cpp 00:16:56.257 Processing file lib/ut/ut.c 00:16:56.257 Processing file lib/ut_mock/mock.c 00:16:56.824 Processing file lib/util/strerror_tls.c 00:16:56.824 Processing file lib/util/cpuset.c 00:16:56.824 Processing file lib/util/pipe.c 00:16:56.824 Processing file lib/util/bit_array.c 00:16:56.824 Processing file lib/util/base64.c 00:16:56.824 Processing file lib/util/string.c 00:16:56.824 Processing file lib/util/fd.c 00:16:56.824 Processing file lib/util/hexlify.c 00:16:56.824 Processing file lib/util/dif.c 00:16:56.824 Processing file lib/util/fd_group.c 00:16:56.824 Processing file lib/util/crc32c.c 00:16:56.824 Processing file lib/util/crc64.c 00:16:56.824 Processing file lib/util/crc32.c 00:16:56.824 Processing file lib/util/zipf.c 00:16:56.824 Processing file lib/util/crc32_ieee.c 00:16:56.824 Processing file lib/util/crc16.c 00:16:56.824 Processing file lib/util/uuid.c 00:16:56.824 Processing file lib/util/file.c 00:16:56.824 Processing file lib/util/iov.c 00:16:56.824 Processing file lib/util/xor.c 00:16:56.824 Processing file lib/util/math.c 00:16:56.824 Processing file lib/vfio_user/host/vfio_user_pci.c 00:16:56.824 Processing file lib/vfio_user/host/vfio_user.c 00:16:56.824 Processing file lib/vhost/vhost_internal.h 00:16:56.824 Processing file lib/vhost/vhost.c 00:16:56.824 Processing file lib/vhost/rte_vhost_user.c 00:16:56.824 Processing file lib/vhost/vhost_blk.c 00:16:56.824 Processing file lib/vhost/vhost_scsi.c 00:16:56.824 Processing file lib/vhost/vhost_rpc.c 00:16:57.082 Processing file lib/virtio/virtio_vhost_user.c 00:16:57.082 Processing file lib/virtio/virtio_pci.c 00:16:57.082 Processing file lib/virtio/virtio_vfio_user.c 00:16:57.082 Processing file lib/virtio/virtio.c 00:16:57.082 Processing file lib/vmd/led.c 00:16:57.082 Processing file lib/vmd/vmd.c 00:16:57.363 Processing file module/accel/dsa/accel_dsa_rpc.c 00:16:57.363 Processing file module/accel/dsa/accel_dsa.c 00:16:57.363 Processing file module/accel/error/accel_error_rpc.c 00:16:57.363 Processing file 
module/accel/error/accel_error.c 00:16:57.363 Processing file module/accel/iaa/accel_iaa_rpc.c 00:16:57.363 Processing file module/accel/iaa/accel_iaa.c 00:16:57.363 Processing file module/accel/ioat/accel_ioat.c 00:16:57.363 Processing file module/accel/ioat/accel_ioat_rpc.c 00:16:57.622 Processing file module/bdev/aio/bdev_aio.c 00:16:57.622 Processing file module/bdev/aio/bdev_aio_rpc.c 00:16:57.622 Processing file module/bdev/delay/vbdev_delay.c 00:16:57.622 Processing file module/bdev/delay/vbdev_delay_rpc.c 00:16:57.881 Processing file module/bdev/error/vbdev_error_rpc.c 00:16:57.881 Processing file module/bdev/error/vbdev_error.c 00:16:57.881 Processing file module/bdev/ftl/bdev_ftl_rpc.c 00:16:57.881 Processing file module/bdev/ftl/bdev_ftl.c 00:16:57.881 Processing file module/bdev/gpt/gpt.c 00:16:57.881 Processing file module/bdev/gpt/gpt.h 00:16:57.881 Processing file module/bdev/gpt/vbdev_gpt.c 00:16:58.140 Processing file module/bdev/iscsi/bdev_iscsi_rpc.c 00:16:58.140 Processing file module/bdev/iscsi/bdev_iscsi.c 00:16:58.140 Processing file module/bdev/lvol/vbdev_lvol.c 00:16:58.140 Processing file module/bdev/lvol/vbdev_lvol_rpc.c 00:16:58.140 Processing file module/bdev/malloc/bdev_malloc.c 00:16:58.140 Processing file module/bdev/malloc/bdev_malloc_rpc.c 00:16:58.398 Processing file module/bdev/null/bdev_null_rpc.c 00:16:58.398 Processing file module/bdev/null/bdev_null.c 00:16:58.657 Processing file module/bdev/nvme/vbdev_opal.c 00:16:58.657 Processing file module/bdev/nvme/nvme_rpc.c 00:16:58.657 Processing file module/bdev/nvme/bdev_nvme.c 00:16:58.657 Processing file module/bdev/nvme/bdev_nvme_rpc.c 00:16:58.657 Processing file module/bdev/nvme/vbdev_opal_rpc.c 00:16:58.657 Processing file module/bdev/nvme/bdev_nvme_cuse_rpc.c 00:16:58.657 Processing file module/bdev/nvme/bdev_mdns_client.c 00:16:58.657 Processing file module/bdev/passthru/vbdev_passthru.c 00:16:58.657 Processing file module/bdev/passthru/vbdev_passthru_rpc.c 00:16:58.916 Processing file module/bdev/raid/bdev_raid.c 00:16:58.916 Processing file module/bdev/raid/raid0.c 00:16:58.916 Processing file module/bdev/raid/raid1.c 00:16:58.916 Processing file module/bdev/raid/bdev_raid_sb.c 00:16:58.916 Processing file module/bdev/raid/concat.c 00:16:58.916 Processing file module/bdev/raid/bdev_raid_rpc.c 00:16:58.916 Processing file module/bdev/raid/bdev_raid.h 00:16:58.916 Processing file module/bdev/split/vbdev_split.c 00:16:58.916 Processing file module/bdev/split/vbdev_split_rpc.c 00:16:59.175 Processing file module/bdev/virtio/bdev_virtio_scsi.c 00:16:59.175 Processing file module/bdev/virtio/bdev_virtio_rpc.c 00:16:59.175 Processing file module/bdev/virtio/bdev_virtio_blk.c 00:16:59.175 Processing file module/bdev/zone_block/vbdev_zone_block.c 00:16:59.175 Processing file module/bdev/zone_block/vbdev_zone_block_rpc.c 00:16:59.175 Processing file module/blob/bdev/blob_bdev.c 00:16:59.434 Processing file module/blobfs/bdev/blobfs_bdev.c 00:16:59.434 Processing file module/blobfs/bdev/blobfs_bdev_rpc.c 00:16:59.434 Processing file module/env_dpdk/env_dpdk_rpc.c 00:16:59.434 Processing file module/event/subsystems/accel/accel.c 00:16:59.434 Processing file module/event/subsystems/bdev/bdev.c 00:16:59.692 Processing file module/event/subsystems/iobuf/iobuf_rpc.c 00:16:59.692 Processing file module/event/subsystems/iobuf/iobuf.c 00:16:59.692 Processing file module/event/subsystems/iscsi/iscsi.c 00:16:59.692 Processing file module/event/subsystems/keyring/keyring.c 00:16:59.692 Processing file 
module/event/subsystems/nbd/nbd.c 00:16:59.949 Processing file module/event/subsystems/nvmf/nvmf_rpc.c 00:16:59.949 Processing file module/event/subsystems/nvmf/nvmf_tgt.c 00:16:59.949 Processing file module/event/subsystems/scheduler/scheduler.c 00:16:59.949 Processing file module/event/subsystems/scsi/scsi.c 00:16:59.949 Processing file module/event/subsystems/sock/sock.c 00:17:00.207 Processing file module/event/subsystems/vhost_blk/vhost_blk.c 00:17:00.207 Processing file module/event/subsystems/vhost_scsi/vhost_scsi.c 00:17:00.207 Processing file module/event/subsystems/vmd/vmd_rpc.c 00:17:00.207 Processing file module/event/subsystems/vmd/vmd.c 00:17:00.465 Processing file module/keyring/file/keyring.c 00:17:00.465 Processing file module/keyring/file/keyring_rpc.c 00:17:00.465 Processing file module/keyring/linux/keyring.c 00:17:00.465 Processing file module/keyring/linux/keyring_rpc.c 00:17:00.465 Processing file module/scheduler/dpdk_governor/dpdk_governor.c 00:17:00.465 Processing file module/scheduler/dynamic/scheduler_dynamic.c 00:17:00.724 Processing file module/scheduler/gscheduler/gscheduler.c 00:17:00.724 Processing file module/sock/sock_kernel.h 00:17:00.724 Processing file module/sock/posix/posix.c 00:17:00.724 Writing directory view page. 00:17:00.724 Overall coverage rate: 00:17:00.724 lines......: 38.5% (40324 of 104781 lines) 00:17:00.724 functions..: 42.1% (3672 of 8720 functions) 00:17:00.724 00:17:00.724 00:17:00.724 ===================== 00:17:00.724 All unit tests passed 00:17:00.724 ===================== 00:17:00.724 Note: coverage report is here: /home/vagrant/spdk_repo/spdk//home/vagrant/spdk_repo/spdk/../output/ut_coverage 00:17:00.724 12:19:24 unittest -- unit/unittest.sh@305 -- # set +x 00:17:00.724 00:17:00.724 00:17:00.724 00:17:00.724 real 2m34.433s 00:17:00.724 user 2m6.184s 00:17:00.724 sys 0m18.116s 00:17:00.724 12:19:24 unittest -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:00.724 12:19:24 unittest -- common/autotest_common.sh@10 -- # set +x 00:17:00.724 ************************************ 00:17:00.724 END TEST unittest 00:17:00.724 ************************************ 00:17:00.724 12:19:24 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:17:00.724 12:19:24 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:17:00.724 12:19:24 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:17:00.724 12:19:24 -- spdk/autotest.sh@162 -- # timing_enter lib 00:17:00.724 12:19:24 -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:00.724 12:19:24 -- common/autotest_common.sh@10 -- # set +x 00:17:00.724 12:19:24 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:17:00.724 12:19:24 -- spdk/autotest.sh@168 -- # run_test env /home/vagrant/spdk_repo/spdk/test/env/env.sh 00:17:00.724 12:19:24 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:00.724 12:19:24 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:00.724 12:19:24 -- common/autotest_common.sh@10 -- # set +x 00:17:01.044 ************************************ 00:17:01.044 START TEST env 00:17:01.044 ************************************ 00:17:01.044 12:19:24 env -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/env/env.sh 00:17:01.044 * Looking for test storage... 
00:17:01.044 * Found test storage at /home/vagrant/spdk_repo/spdk/test/env 00:17:01.044 12:19:24 env -- env/env.sh@10 -- # run_test env_memory /home/vagrant/spdk_repo/spdk/test/env/memory/memory_ut 00:17:01.044 12:19:24 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:01.044 12:19:24 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:01.044 12:19:24 env -- common/autotest_common.sh@10 -- # set +x 00:17:01.044 ************************************ 00:17:01.044 START TEST env_memory 00:17:01.044 ************************************ 00:17:01.044 12:19:24 env.env_memory -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/env/memory/memory_ut 00:17:01.044 00:17:01.044 00:17:01.044 CUnit - A unit testing framework for C - Version 2.1-3 00:17:01.044 http://cunit.sourceforge.net/ 00:17:01.044 00:17:01.044 00:17:01.044 Suite: memory 00:17:01.044 Test: alloc and free memory map ...[2024-06-07 12:19:24.522694] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:17:01.044 passed 00:17:01.044 Test: mem map translation ...[2024-06-07 12:19:24.552615] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:17:01.044 [2024-06-07 12:19:24.553001] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:17:01.044 [2024-06-07 12:19:24.553310] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:17:01.044 [2024-06-07 12:19:24.553628] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:17:01.044 passed 00:17:01.044 Test: mem map registration ...[2024-06-07 12:19:24.585163] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:17:01.044 [2024-06-07 12:19:24.585518] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:17:01.044 passed 00:17:01.044 Test: mem map adjacent registrations ...passed 00:17:01.044 00:17:01.044 Run Summary: Type Total Ran Passed Failed Inactive 00:17:01.044 suites 1 1 n/a 0 0 00:17:01.044 tests 4 4 4 0 0 00:17:01.044 asserts 152 152 152 0 n/a 00:17:01.044 00:17:01.044 Elapsed time = 0.121 seconds 00:17:01.044 00:17:01.044 real 0m0.149s 00:17:01.044 user 0m0.127s 00:17:01.044 sys 0m0.017s 00:17:01.321 12:19:24 env.env_memory -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:01.321 12:19:24 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:17:01.321 ************************************ 00:17:01.321 END TEST env_memory 00:17:01.321 ************************************ 00:17:01.321 12:19:24 env -- env/env.sh@11 -- # run_test env_vtophys /home/vagrant/spdk_repo/spdk/test/env/vtophys/vtophys 00:17:01.321 12:19:24 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:01.321 12:19:24 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:01.321 12:19:24 env -- common/autotest_common.sh@10 -- # set +x 00:17:01.321 ************************************ 00:17:01.321 START TEST env_vtophys 00:17:01.321 ************************************ 00:17:01.321 12:19:24 
env.env_vtophys -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/env/vtophys/vtophys 00:17:01.321 EAL: lib.eal log level changed from notice to debug 00:17:01.321 EAL: Detected lcore 0 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 1 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 2 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 3 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 4 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 5 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 6 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 7 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 8 as core 0 on socket 0 00:17:01.321 EAL: Detected lcore 9 as core 0 on socket 0 00:17:01.321 EAL: Maximum logical cores by configuration: 128 00:17:01.321 EAL: Detected CPU lcores: 10 00:17:01.321 EAL: Detected NUMA nodes: 1 00:17:01.321 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:17:01.321 EAL: Checking presence of .so 'librte_eal.so.23' 00:17:01.321 EAL: Checking presence of .so 'librte_eal.so' 00:17:01.321 EAL: Detected static linkage of DPDK 00:17:01.321 EAL: No shared files mode enabled, IPC will be disabled 00:17:01.321 EAL: Selected IOVA mode 'PA' 00:17:01.321 EAL: Probing VFIO support... 00:17:01.321 EAL: Module /sys/module/vfio not found! error 2 (No such file or directory) 00:17:01.321 EAL: VFIO modules not loaded, skipping VFIO support... 00:17:01.321 EAL: Ask a virtual area of 0x2e000 bytes 00:17:01.321 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:17:01.321 EAL: Setting up physically contiguous memory... 00:17:01.321 EAL: Setting maximum number of open files to 524288 00:17:01.321 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:17:01.321 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:17:01.321 EAL: Ask a virtual area of 0x61000 bytes 00:17:01.321 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:17:01.321 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:17:01.321 EAL: Ask a virtual area of 0x400000000 bytes 00:17:01.321 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:17:01.321 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:17:01.321 EAL: Ask a virtual area of 0x61000 bytes 00:17:01.321 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:17:01.321 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:17:01.321 EAL: Ask a virtual area of 0x400000000 bytes 00:17:01.321 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:17:01.321 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:17:01.321 EAL: Ask a virtual area of 0x61000 bytes 00:17:01.321 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:17:01.321 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:17:01.321 EAL: Ask a virtual area of 0x400000000 bytes 00:17:01.321 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:17:01.321 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:17:01.321 EAL: Ask a virtual area of 0x61000 bytes 00:17:01.321 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:17:01.321 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:17:01.321 EAL: Ask a virtual area of 0x400000000 bytes 00:17:01.321 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:17:01.321 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:17:01.321 EAL: Hugepages will be freed exactly as allocated. 
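
Before the heap expand/shrink sequence that follows, a note on what the vtophys suite asserts: with the hugepage-backed memseg lists above in place, a DMA-safe allocation must translate to a real physical address. A minimal sketch using the spdk/env.h API (the buffer size and alignment here are illustrative, not taken from the test):

#include "spdk/env.h"
#include <assert.h>

static void
vtophys_smoke(void)
{
	/* Allocation comes from the hugepage heap configured above. */
	void *buf = spdk_dma_malloc(4096, 4096, NULL);
	assert(buf != NULL);

	/* Translation must succeed, i.e. not return the error marker. */
	uint64_t paddr = spdk_vtophys(buf, NULL);
	assert(paddr != SPDK_VTOPHYS_ERROR);

	spdk_dma_free(buf);
}
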
00:17:01.321 EAL: No shared files mode enabled, IPC is disabled 00:17:01.321 EAL: No shared files mode enabled, IPC is disabled 00:17:01.321 EAL: TSC frequency is ~2100000 KHz 00:17:01.321 EAL: Main lcore 0 is ready (tid=7f2a9ce36a40;cpuset=[0]) 00:17:01.321 EAL: Trying to obtain current memory policy. 00:17:01.321 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:01.321 EAL: Restoring previous memory policy: 0 00:17:01.321 EAL: request: mp_malloc_sync 00:17:01.321 EAL: No shared files mode enabled, IPC is disabled 00:17:01.321 EAL: Heap on socket 0 was expanded by 2MB 00:17:01.321 EAL: Module /sys/module/vfio not found! error 2 (No such file or directory) 00:17:01.321 EAL: Mem event callback 'spdk:(nil)' registered 00:17:01.321 EAL: Module /sys/module/vfio_pci not found! error 2 (No such file or directory) 00:17:01.321 00:17:01.321 00:17:01.321 CUnit - A unit testing framework for C - Version 2.1-3 00:17:01.321 http://cunit.sourceforge.net/ 00:17:01.321 00:17:01.321 00:17:01.321 Suite: components_suite 00:17:02.257 Test: vtophys_malloc_test ...passed 00:17:02.257 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:17:02.257 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.257 EAL: Restoring previous memory policy: 4 00:17:02.257 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.257 EAL: request: mp_malloc_sync 00:17:02.257 EAL: No shared files mode enabled, IPC is disabled 00:17:02.257 EAL: Heap on socket 0 was expanded by 4MB 00:17:02.257 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.257 EAL: request: mp_malloc_sync 00:17:02.257 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was shrunk by 4MB 00:17:02.258 EAL: Trying to obtain current memory policy. 00:17:02.258 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.258 EAL: Restoring previous memory policy: 4 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was expanded by 6MB 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was shrunk by 6MB 00:17:02.258 EAL: Trying to obtain current memory policy. 00:17:02.258 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.258 EAL: Restoring previous memory policy: 4 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was expanded by 10MB 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was shrunk by 10MB 00:17:02.258 EAL: Trying to obtain current memory policy. 
00:17:02.258 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.258 EAL: Restoring previous memory policy: 4 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was expanded by 18MB 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was shrunk by 18MB 00:17:02.258 EAL: Trying to obtain current memory policy. 00:17:02.258 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.258 EAL: Restoring previous memory policy: 4 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was expanded by 34MB 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was shrunk by 34MB 00:17:02.258 EAL: Trying to obtain current memory policy. 00:17:02.258 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.258 EAL: Restoring previous memory policy: 4 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was expanded by 66MB 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was shrunk by 66MB 00:17:02.258 EAL: Trying to obtain current memory policy. 00:17:02.258 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.258 EAL: Restoring previous memory policy: 4 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was expanded by 130MB 00:17:02.258 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.258 EAL: request: mp_malloc_sync 00:17:02.258 EAL: No shared files mode enabled, IPC is disabled 00:17:02.258 EAL: Heap on socket 0 was shrunk by 130MB 00:17:02.258 EAL: Trying to obtain current memory policy. 00:17:02.258 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.516 EAL: Restoring previous memory policy: 4 00:17:02.516 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.516 EAL: request: mp_malloc_sync 00:17:02.516 EAL: No shared files mode enabled, IPC is disabled 00:17:02.516 EAL: Heap on socket 0 was expanded by 258MB 00:17:02.516 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.516 EAL: request: mp_malloc_sync 00:17:02.516 EAL: No shared files mode enabled, IPC is disabled 00:17:02.516 EAL: Heap on socket 0 was shrunk by 258MB 00:17:02.516 EAL: Trying to obtain current memory policy. 
00:17:02.516 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:02.774 EAL: Restoring previous memory policy: 4 00:17:02.774 EAL: Calling mem event callback 'spdk:(nil)' 00:17:02.774 EAL: request: mp_malloc_sync 00:17:02.774 EAL: No shared files mode enabled, IPC is disabled 00:17:02.774 EAL: Heap on socket 0 was expanded by 514MB 00:17:03.033 EAL: Calling mem event callback 'spdk:(nil)' 00:17:03.033 EAL: request: mp_malloc_sync 00:17:03.033 EAL: No shared files mode enabled, IPC is disabled 00:17:03.033 EAL: Heap on socket 0 was shrunk by 514MB 00:17:03.033 EAL: Trying to obtain current memory policy. 00:17:03.033 EAL: Setting policy MPOL_PREFERRED for socket 0 00:17:03.600 EAL: Restoring previous memory policy: 4 00:17:03.600 EAL: Calling mem event callback 'spdk:(nil)' 00:17:03.600 EAL: request: mp_malloc_sync 00:17:03.600 EAL: No shared files mode enabled, IPC is disabled 00:17:03.600 EAL: Heap on socket 0 was expanded by 1026MB 00:17:03.858 EAL: Calling mem event callback 'spdk:(nil)' 00:17:04.116 EAL: request: mp_malloc_sync 00:17:04.116 EAL: No shared files mode enabled, IPC is disabled 00:17:04.116 EAL: Heap on socket 0 was shrunk by 1026MB 00:17:04.116 passed 00:17:04.116 00:17:04.116 Run Summary: Type Total Ran Passed Failed Inactive 00:17:04.116 suites 1 1 n/a 0 0 00:17:04.116 tests 2 2 2 0 0 00:17:04.116 asserts 6457 6457 6457 0 n/a 00:17:04.116 00:17:04.116 Elapsed time = 2.710 seconds 00:17:04.116 EAL: Calling mem event callback 'spdk:(nil)' 00:17:04.116 EAL: request: mp_malloc_sync 00:17:04.116 EAL: No shared files mode enabled, IPC is disabled 00:17:04.116 EAL: Heap on socket 0 was shrunk by 2MB 00:17:04.116 EAL: No shared files mode enabled, IPC is disabled 00:17:04.116 EAL: No shared files mode enabled, IPC is disabled 00:17:04.116 EAL: No shared files mode enabled, IPC is disabled 00:17:04.116 00:17:04.116 real 0m3.006s 00:17:04.116 user 0m1.506s 00:17:04.116 sys 0m1.331s 00:17:04.116 12:19:27 env.env_vtophys -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:04.116 12:19:27 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:17:04.116 ************************************ 00:17:04.116 END TEST env_vtophys 00:17:04.116 ************************************ 00:17:04.375 12:19:27 env -- env/env.sh@12 -- # run_test env_pci /home/vagrant/spdk_repo/spdk/test/env/pci/pci_ut 00:17:04.375 12:19:27 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:04.375 12:19:27 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:04.375 12:19:27 env -- common/autotest_common.sh@10 -- # set +x 00:17:04.375 ************************************ 00:17:04.375 START TEST env_pci 00:17:04.375 ************************************ 00:17:04.375 12:19:27 env.env_pci -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/env/pci/pci_ut 00:17:04.375 00:17:04.375 00:17:04.375 CUnit - A unit testing framework for C - Version 2.1-3 00:17:04.375 http://cunit.sourceforge.net/ 00:17:04.375 00:17:04.375 00:17:04.375 Suite: pci 00:17:04.375 Test: pci_hook ...[2024-06-07 12:19:27.863034] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 190543 has claimed it 00:17:04.375 EAL: Cannot find device (10000:00:01.0) 00:17:04.375 EAL: Failed to attach device on primary process 00:17:04.375 passed 00:17:04.375 00:17:04.375 Run Summary: Type Total Ran Passed Failed Inactive 00:17:04.375 suites 1 1 n/a 0 0 00:17:04.375 tests 1 1 1 0 0 
00:17:04.375 asserts 25 25 25 0 n/a 00:17:04.375 00:17:04.375 Elapsed time = 0.013 seconds 00:17:04.375 00:17:04.375 real 0m0.126s 00:17:04.375 user 0m0.061s 00:17:04.375 sys 0m0.060s 00:17:04.375 12:19:27 env.env_pci -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:04.375 12:19:27 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:17:04.375 ************************************ 00:17:04.375 END TEST env_pci 00:17:04.375 ************************************ 00:17:04.375 12:19:27 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:17:04.375 12:19:27 env -- env/env.sh@15 -- # uname 00:17:04.375 12:19:27 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:17:04.375 12:19:27 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:17:04.375 12:19:27 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /home/vagrant/spdk_repo/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:17:04.375 12:19:27 env -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:04.375 12:19:27 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:04.375 12:19:27 env -- common/autotest_common.sh@10 -- # set +x 00:17:04.375 ************************************ 00:17:04.375 START TEST env_dpdk_post_init 00:17:04.375 ************************************ 00:17:04.375 12:19:27 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:17:04.683 EAL: Detected CPU lcores: 10 00:17:04.683 EAL: Detected NUMA nodes: 1 00:17:04.683 EAL: Detected static linkage of DPDK 00:17:04.683 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:17:04.683 EAL: Selected IOVA mode 'PA' 00:17:04.683 TELEMETRY: No legacy callbacks, legacy socket not created 00:17:04.683 EAL: Probe PCI driver: spdk_nvme (1b36:0010) device: 0000:00:10.0 (socket 0) 00:17:04.683 Starting DPDK initialization... 00:17:04.683 Starting SPDK post initialization... 00:17:04.683 SPDK NVMe probe 00:17:04.683 Attaching to 0000:00:10.0 00:17:04.683 Attached to 0000:00:10.0 00:17:04.683 Cleaning up... 
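
The "Attaching to 0000:00:10.0 ... Attached" lines above are SPDK's PCIe NVMe enumeration running after DPDK post-initialization. A stripped-down sketch of that flow with the real spdk/nvme.h entry point (callbacks reduced to the minimum, error handling omitted; the printf is illustrative):

#include "spdk/stdinc.h"
#include "spdk/nvme.h"

/* Accept every controller the enumeration offers. */
static bool
probe_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
	 struct spdk_nvme_ctrlr_opts *opts)
{
	return true;
}

/* Invoked once per attached controller, e.g. 0000:00:10.0. */
static void
attach_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
	  struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
{
	printf("Attached to %s\n", trid->traddr);
}

static void
probe_all_pcie(void)
{
	/* NULL trid probes the local PCIe transport; remove_cb may be NULL. */
	if (spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL) != 0) {
		fprintf(stderr, "spdk_nvme_probe() failed\n");
	}
}
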
00:17:04.683 00:17:04.683 real 0m0.213s 00:17:04.683 user 0m0.046s 00:17:04.683 sys 0m0.067s 00:17:04.683 12:19:28 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:04.683 12:19:28 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:17:04.683 ************************************ 00:17:04.683 END TEST env_dpdk_post_init 00:17:04.683 ************************************ 00:17:04.683 12:19:28 env -- env/env.sh@26 -- # uname 00:17:04.683 12:19:28 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:17:04.683 12:19:28 env -- env/env.sh@29 -- # run_test env_mem_callbacks /home/vagrant/spdk_repo/spdk/test/env/mem_callbacks/mem_callbacks 00:17:04.683 12:19:28 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:04.683 12:19:28 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:04.683 12:19:28 env -- common/autotest_common.sh@10 -- # set +x 00:17:04.967 ************************************ 00:17:04.967 START TEST env_mem_callbacks 00:17:04.967 ************************************ 00:17:04.967 12:19:28 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/env/mem_callbacks/mem_callbacks 00:17:04.967 EAL: Detected CPU lcores: 10 00:17:04.967 EAL: Detected NUMA nodes: 1 00:17:04.967 EAL: Detected static linkage of DPDK 00:17:04.967 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:17:04.967 EAL: Selected IOVA mode 'PA' 00:17:04.967 TELEMETRY: No legacy callbacks, legacy socket not created 00:17:04.967 00:17:04.967 00:17:04.967 CUnit - A unit testing framework for C - Version 2.1-3 00:17:04.967 http://cunit.sourceforge.net/ 00:17:04.967 00:17:04.967 00:17:04.967 Suite: memory 00:17:04.967 Test: test ... 00:17:04.967 register 0x200000200000 2097152 00:17:04.967 malloc 3145728 00:17:04.967 register 0x200000400000 4194304 00:17:04.967 buf 0x200000500000 len 3145728 PASSED 00:17:04.967 malloc 64 00:17:04.967 buf 0x2000004fff40 len 64 PASSED 00:17:04.967 malloc 4194304 00:17:04.967 register 0x200000800000 6291456 00:17:04.967 buf 0x200000a00000 len 4194304 PASSED 00:17:04.967 free 0x200000500000 3145728 00:17:04.967 free 0x2000004fff40 64 00:17:04.967 unregister 0x200000400000 4194304 PASSED 00:17:04.967 free 0x200000a00000 4194304 00:17:04.967 unregister 0x200000800000 6291456 PASSED 00:17:04.967 malloc 8388608 00:17:04.967 register 0x200000400000 10485760 00:17:04.967 buf 0x200000600000 len 8388608 PASSED 00:17:04.967 free 0x200000600000 8388608 00:17:04.967 unregister 0x200000400000 10485760 PASSED 00:17:04.967 passed 00:17:04.967 00:17:04.967 Run Summary: Type Total Ran Passed Failed Inactive 00:17:04.967 suites 1 1 n/a 0 0 00:17:04.967 tests 1 1 1 0 0 00:17:04.967 asserts 15 15 15 0 n/a 00:17:04.967 00:17:04.967 Elapsed time = 0.009 seconds 00:17:04.967 00:17:04.967 real 0m0.213s 00:17:04.967 user 0m0.043s 00:17:04.967 sys 0m0.069s 00:17:04.967 12:19:28 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:04.967 12:19:28 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:17:04.967 ************************************ 00:17:04.967 END TEST env_mem_callbacks 00:17:04.967 ************************************ 00:17:04.967 00:17:04.967 real 0m4.186s 00:17:04.967 user 0m1.953s 00:17:04.967 sys 0m1.845s 00:17:04.967 12:19:28 env -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:04.967 12:19:28 env -- common/autotest_common.sh@10 -- # set +x 00:17:04.967 ************************************ 00:17:04.967 END TEST env 00:17:04.967 
************************************ 00:17:05.225 12:19:28 -- spdk/autotest.sh@169 -- # run_test rpc /home/vagrant/spdk_repo/spdk/test/rpc/rpc.sh 00:17:05.225 12:19:28 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:05.225 12:19:28 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:05.225 12:19:28 -- common/autotest_common.sh@10 -- # set +x 00:17:05.225 ************************************ 00:17:05.225 START TEST rpc 00:17:05.225 ************************************ 00:17:05.225 12:19:28 rpc -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/rpc/rpc.sh 00:17:05.225 * Looking for test storage... 00:17:05.225 * Found test storage at /home/vagrant/spdk_repo/spdk/test/rpc 00:17:05.225 12:19:28 rpc -- rpc/rpc.sh@65 -- # spdk_pid=190675 00:17:05.225 12:19:28 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:17:05.225 12:19:28 rpc -- rpc/rpc.sh@64 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -e bdev 00:17:05.225 12:19:28 rpc -- rpc/rpc.sh@67 -- # waitforlisten 190675 00:17:05.225 12:19:28 rpc -- common/autotest_common.sh@830 -- # '[' -z 190675 ']' 00:17:05.225 12:19:28 rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:05.225 12:19:28 rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:05.225 12:19:28 rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:05.225 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:05.225 12:19:28 rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:05.225 12:19:28 rpc -- common/autotest_common.sh@10 -- # set +x 00:17:05.226 [2024-06-07 12:19:28.768652] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:17:05.226 [2024-06-07 12:19:28.768943] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190675 ] 00:17:05.484 [2024-06-07 12:19:28.912398] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:05.484 [2024-06-07 12:19:29.003591] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:17:05.484 [2024-06-07 12:19:29.003694] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 190675' to capture a snapshot of events at runtime. 00:17:05.484 [2024-06-07 12:19:29.003764] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:17:05.485 [2024-06-07 12:19:29.003801] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:17:05.485 [2024-06-07 12:19:29.003854] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid190675 for offline analysis/debug. 
00:17:05.485 [2024-06-07 12:19:29.003924] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:06.421 12:19:29 rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:06.421 12:19:29 rpc -- common/autotest_common.sh@863 -- # return 0 00:17:06.421 12:19:29 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/test/rpc 00:17:06.421 12:19:29 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/test/rpc 00:17:06.421 12:19:29 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:17:06.421 12:19:29 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:17:06.421 12:19:29 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:06.421 12:19:29 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:06.421 12:19:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:17:06.421 ************************************ 00:17:06.421 START TEST rpc_integrity 00:17:06.421 ************************************ 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:17:06.421 { 00:17:06.421 "name": "Malloc0", 00:17:06.421 "aliases": [ 00:17:06.421 "58f23a91-2537-4819-b736-4e8f8505e0e3" 00:17:06.421 ], 00:17:06.421 "product_name": "Malloc disk", 00:17:06.421 "block_size": 512, 00:17:06.421 "num_blocks": 16384, 00:17:06.421 "uuid": "58f23a91-2537-4819-b736-4e8f8505e0e3", 00:17:06.421 "assigned_rate_limits": { 00:17:06.421 "rw_ios_per_sec": 0, 00:17:06.421 "rw_mbytes_per_sec": 0, 00:17:06.421 "r_mbytes_per_sec": 0, 00:17:06.421 "w_mbytes_per_sec": 0 00:17:06.421 }, 00:17:06.421 "claimed": false, 00:17:06.421 "zoned": false, 00:17:06.421 "supported_io_types": { 00:17:06.421 "read": true, 00:17:06.421 "write": true, 00:17:06.421 "unmap": true, 00:17:06.421 "write_zeroes": 
true, 00:17:06.421 "flush": true, 00:17:06.421 "reset": true, 00:17:06.421 "compare": false, 00:17:06.421 "compare_and_write": false, 00:17:06.421 "abort": true, 00:17:06.421 "nvme_admin": false, 00:17:06.421 "nvme_io": false 00:17:06.421 }, 00:17:06.421 "memory_domains": [ 00:17:06.421 { 00:17:06.421 "dma_device_id": "system", 00:17:06.421 "dma_device_type": 1 00:17:06.421 }, 00:17:06.421 { 00:17:06.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.421 "dma_device_type": 2 00:17:06.421 } 00:17:06.421 ], 00:17:06.421 "driver_specific": {} 00:17:06.421 } 00:17:06.421 ]' 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.421 [2024-06-07 12:19:29.936575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:17:06.421 [2024-06-07 12:19:29.936714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.421 [2024-06-07 12:19:29.936766] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006080 00:17:06.421 [2024-06-07 12:19:29.936809] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.421 [2024-06-07 12:19:29.939410] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.421 [2024-06-07 12:19:29.939501] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:17:06.421 Passthru0 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.421 12:19:29 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.421 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:17:06.421 { 00:17:06.421 "name": "Malloc0", 00:17:06.421 "aliases": [ 00:17:06.421 "58f23a91-2537-4819-b736-4e8f8505e0e3" 00:17:06.421 ], 00:17:06.421 "product_name": "Malloc disk", 00:17:06.421 "block_size": 512, 00:17:06.421 "num_blocks": 16384, 00:17:06.421 "uuid": "58f23a91-2537-4819-b736-4e8f8505e0e3", 00:17:06.421 "assigned_rate_limits": { 00:17:06.421 "rw_ios_per_sec": 0, 00:17:06.421 "rw_mbytes_per_sec": 0, 00:17:06.421 "r_mbytes_per_sec": 0, 00:17:06.421 "w_mbytes_per_sec": 0 00:17:06.421 }, 00:17:06.421 "claimed": true, 00:17:06.421 "claim_type": "exclusive_write", 00:17:06.421 "zoned": false, 00:17:06.421 "supported_io_types": { 00:17:06.421 "read": true, 00:17:06.421 "write": true, 00:17:06.421 "unmap": true, 00:17:06.421 "write_zeroes": true, 00:17:06.421 "flush": true, 00:17:06.421 "reset": true, 00:17:06.421 "compare": false, 00:17:06.421 "compare_and_write": false, 00:17:06.421 "abort": true, 00:17:06.421 "nvme_admin": false, 00:17:06.421 "nvme_io": false 00:17:06.421 }, 00:17:06.421 "memory_domains": [ 00:17:06.421 { 00:17:06.421 "dma_device_id": "system", 00:17:06.421 "dma_device_type": 1 00:17:06.421 }, 00:17:06.421 { 00:17:06.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.421 "dma_device_type": 2 00:17:06.421 } 
00:17:06.421 ], 00:17:06.421 "driver_specific": {} 00:17:06.421 }, 00:17:06.421 { 00:17:06.421 "name": "Passthru0", 00:17:06.421 "aliases": [ 00:17:06.421 "e0d7410b-b066-5934-837f-5fab92c9a214" 00:17:06.421 ], 00:17:06.421 "product_name": "passthru", 00:17:06.421 "block_size": 512, 00:17:06.421 "num_blocks": 16384, 00:17:06.421 "uuid": "e0d7410b-b066-5934-837f-5fab92c9a214", 00:17:06.421 "assigned_rate_limits": { 00:17:06.421 "rw_ios_per_sec": 0, 00:17:06.421 "rw_mbytes_per_sec": 0, 00:17:06.421 "r_mbytes_per_sec": 0, 00:17:06.421 "w_mbytes_per_sec": 0 00:17:06.421 }, 00:17:06.421 "claimed": false, 00:17:06.421 "zoned": false, 00:17:06.421 "supported_io_types": { 00:17:06.421 "read": true, 00:17:06.421 "write": true, 00:17:06.421 "unmap": true, 00:17:06.421 "write_zeroes": true, 00:17:06.421 "flush": true, 00:17:06.421 "reset": true, 00:17:06.421 "compare": false, 00:17:06.421 "compare_and_write": false, 00:17:06.421 "abort": true, 00:17:06.421 "nvme_admin": false, 00:17:06.421 "nvme_io": false 00:17:06.421 }, 00:17:06.421 "memory_domains": [ 00:17:06.421 { 00:17:06.421 "dma_device_id": "system", 00:17:06.421 "dma_device_type": 1 00:17:06.422 }, 00:17:06.422 { 00:17:06.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.422 "dma_device_type": 2 00:17:06.422 } 00:17:06.422 ], 00:17:06.422 "driver_specific": { 00:17:06.422 "passthru": { 00:17:06.422 "name": "Passthru0", 00:17:06.422 "base_bdev_name": "Malloc0" 00:17:06.422 } 00:17:06.422 } 00:17:06.422 } 00:17:06.422 ]' 00:17:06.422 12:19:29 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:17:06.422 12:19:30 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:17:06.422 12:19:30 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.422 12:19:30 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.422 12:19:30 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.422 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.422 12:19:30 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:17:06.422 12:19:30 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:17:06.681 12:19:30 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:17:06.681 00:17:06.681 real 0m0.309s 00:17:06.681 user 0m0.188s 00:17:06.681 sys 0m0.050s 00:17:06.681 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:06.681 12:19:30 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:06.681 ************************************ 00:17:06.681 END TEST rpc_integrity 00:17:06.681 ************************************ 00:17:06.681 12:19:30 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:17:06.681 12:19:30 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:06.681 12:19:30 rpc -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:17:06.681 12:19:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:17:06.681 ************************************ 00:17:06.681 START TEST rpc_plugins 00:17:06.681 ************************************ 00:17:06.681 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # rpc_plugins 00:17:06.681 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:17:06.681 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.681 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:17:06.681 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.681 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:17:06.681 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:17:06.681 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.681 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:17:06.681 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.681 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:17:06.681 { 00:17:06.681 "name": "Malloc1", 00:17:06.681 "aliases": [ 00:17:06.681 "151bb486-8f61-4ae5-bebb-a566ebc397a3" 00:17:06.681 ], 00:17:06.681 "product_name": "Malloc disk", 00:17:06.681 "block_size": 4096, 00:17:06.681 "num_blocks": 256, 00:17:06.681 "uuid": "151bb486-8f61-4ae5-bebb-a566ebc397a3", 00:17:06.681 "assigned_rate_limits": { 00:17:06.681 "rw_ios_per_sec": 0, 00:17:06.681 "rw_mbytes_per_sec": 0, 00:17:06.681 "r_mbytes_per_sec": 0, 00:17:06.681 "w_mbytes_per_sec": 0 00:17:06.681 }, 00:17:06.681 "claimed": false, 00:17:06.681 "zoned": false, 00:17:06.681 "supported_io_types": { 00:17:06.681 "read": true, 00:17:06.681 "write": true, 00:17:06.681 "unmap": true, 00:17:06.681 "write_zeroes": true, 00:17:06.681 "flush": true, 00:17:06.681 "reset": true, 00:17:06.681 "compare": false, 00:17:06.681 "compare_and_write": false, 00:17:06.681 "abort": true, 00:17:06.681 "nvme_admin": false, 00:17:06.681 "nvme_io": false 00:17:06.681 }, 00:17:06.681 "memory_domains": [ 00:17:06.681 { 00:17:06.681 "dma_device_id": "system", 00:17:06.681 "dma_device_type": 1 00:17:06.681 }, 00:17:06.681 { 00:17:06.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.681 "dma_device_type": 2 00:17:06.681 } 00:17:06.681 ], 00:17:06.681 "driver_specific": {} 00:17:06.681 } 00:17:06.681 ]' 00:17:06.681 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:17:06.682 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:17:06.682 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.682 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.682 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:17:06.682 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:17:06.682 12:19:30 rpc.rpc_plugins -- rpc/rpc.sh@36 
-- # '[' 0 == 0 ']' 00:17:06.682 00:17:06.682 real 0m0.123s 00:17:06.682 user 0m0.070s 00:17:06.682 sys 0m0.018s 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:06.682 ************************************ 00:17:06.682 END TEST rpc_plugins 00:17:06.682 ************************************ 00:17:06.682 12:19:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:17:06.682 12:19:30 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:17:06.682 12:19:30 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:06.682 12:19:30 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:06.682 12:19:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:17:06.940 ************************************ 00:17:06.940 START TEST rpc_trace_cmd_test 00:17:06.940 ************************************ 00:17:06.940 12:19:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # rpc_trace_cmd_test 00:17:06.940 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:17:06.940 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:17:06.940 12:19:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:06.940 12:19:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:17:06.941 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid190675", 00:17:06.941 "tpoint_group_mask": "0x8", 00:17:06.941 "iscsi_conn": { 00:17:06.941 "mask": "0x2", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "scsi": { 00:17:06.941 "mask": "0x4", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "bdev": { 00:17:06.941 "mask": "0x8", 00:17:06.941 "tpoint_mask": "0xffffffffffffffff" 00:17:06.941 }, 00:17:06.941 "nvmf_rdma": { 00:17:06.941 "mask": "0x10", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "nvmf_tcp": { 00:17:06.941 "mask": "0x20", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "ftl": { 00:17:06.941 "mask": "0x40", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "blobfs": { 00:17:06.941 "mask": "0x80", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "dsa": { 00:17:06.941 "mask": "0x200", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "thread": { 00:17:06.941 "mask": "0x400", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "nvme_pcie": { 00:17:06.941 "mask": "0x800", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "iaa": { 00:17:06.941 "mask": "0x1000", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "nvme_tcp": { 00:17:06.941 "mask": "0x2000", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "bdev_nvme": { 00:17:06.941 "mask": "0x4000", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 }, 00:17:06.941 "sock": { 00:17:06.941 "mask": "0x8000", 00:17:06.941 "tpoint_mask": "0x0" 00:17:06.941 } 00:17:06.941 }' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 
'has("tpoint_shm_path")' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:17:06.941 00:17:06.941 real 0m0.231s 00:17:06.941 user 0m0.195s 00:17:06.941 sys 0m0.027s 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:06.941 12:19:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.941 ************************************ 00:17:06.941 END TEST rpc_trace_cmd_test 00:17:06.941 ************************************ 00:17:07.200 12:19:30 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:17:07.200 12:19:30 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:17:07.200 12:19:30 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:17:07.200 12:19:30 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:07.200 12:19:30 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:07.200 12:19:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:17:07.200 ************************************ 00:17:07.200 START TEST rpc_daemon_integrity 00:17:07.200 ************************************ 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:17:07.200 { 00:17:07.200 "name": "Malloc2", 00:17:07.200 "aliases": [ 00:17:07.200 "2181b0e4-bc68-418d-9548-72ae10669e3a" 00:17:07.200 ], 00:17:07.200 "product_name": "Malloc disk", 00:17:07.200 "block_size": 512, 00:17:07.200 "num_blocks": 16384, 00:17:07.200 "uuid": "2181b0e4-bc68-418d-9548-72ae10669e3a", 00:17:07.200 "assigned_rate_limits": { 00:17:07.200 "rw_ios_per_sec": 0, 00:17:07.200 
"rw_mbytes_per_sec": 0, 00:17:07.200 "r_mbytes_per_sec": 0, 00:17:07.200 "w_mbytes_per_sec": 0 00:17:07.200 }, 00:17:07.200 "claimed": false, 00:17:07.200 "zoned": false, 00:17:07.200 "supported_io_types": { 00:17:07.200 "read": true, 00:17:07.200 "write": true, 00:17:07.200 "unmap": true, 00:17:07.200 "write_zeroes": true, 00:17:07.200 "flush": true, 00:17:07.200 "reset": true, 00:17:07.200 "compare": false, 00:17:07.200 "compare_and_write": false, 00:17:07.200 "abort": true, 00:17:07.200 "nvme_admin": false, 00:17:07.200 "nvme_io": false 00:17:07.200 }, 00:17:07.200 "memory_domains": [ 00:17:07.200 { 00:17:07.200 "dma_device_id": "system", 00:17:07.200 "dma_device_type": 1 00:17:07.200 }, 00:17:07.200 { 00:17:07.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.200 "dma_device_type": 2 00:17:07.200 } 00:17:07.200 ], 00:17:07.200 "driver_specific": {} 00:17:07.200 } 00:17:07.200 ]' 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.200 [2024-06-07 12:19:30.760319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:17:07.200 [2024-06-07 12:19:30.760417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:07.200 [2024-06-07 12:19:30.760453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:17:07.200 [2024-06-07 12:19:30.760487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:07.200 [2024-06-07 12:19:30.762604] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:07.200 [2024-06-07 12:19:30.762654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:17:07.200 Passthru0 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.200 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.201 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.201 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:17:07.201 { 00:17:07.201 "name": "Malloc2", 00:17:07.201 "aliases": [ 00:17:07.201 "2181b0e4-bc68-418d-9548-72ae10669e3a" 00:17:07.201 ], 00:17:07.201 "product_name": "Malloc disk", 00:17:07.201 "block_size": 512, 00:17:07.201 "num_blocks": 16384, 00:17:07.201 "uuid": "2181b0e4-bc68-418d-9548-72ae10669e3a", 00:17:07.201 "assigned_rate_limits": { 00:17:07.201 "rw_ios_per_sec": 0, 00:17:07.201 "rw_mbytes_per_sec": 0, 00:17:07.201 "r_mbytes_per_sec": 0, 00:17:07.201 "w_mbytes_per_sec": 0 00:17:07.201 }, 00:17:07.201 "claimed": true, 00:17:07.201 "claim_type": "exclusive_write", 00:17:07.201 "zoned": false, 00:17:07.201 "supported_io_types": { 00:17:07.201 "read": true, 00:17:07.201 "write": true, 00:17:07.201 "unmap": true, 00:17:07.201 "write_zeroes": true, 00:17:07.201 "flush": true, 00:17:07.201 "reset": true, 00:17:07.201 "compare": false, 
00:17:07.201 "compare_and_write": false, 00:17:07.201 "abort": true, 00:17:07.201 "nvme_admin": false, 00:17:07.201 "nvme_io": false 00:17:07.201 }, 00:17:07.201 "memory_domains": [ 00:17:07.201 { 00:17:07.201 "dma_device_id": "system", 00:17:07.201 "dma_device_type": 1 00:17:07.201 }, 00:17:07.201 { 00:17:07.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.201 "dma_device_type": 2 00:17:07.201 } 00:17:07.201 ], 00:17:07.201 "driver_specific": {} 00:17:07.201 }, 00:17:07.201 { 00:17:07.201 "name": "Passthru0", 00:17:07.201 "aliases": [ 00:17:07.201 "fc3fbb32-d219-5290-8754-1d7196167331" 00:17:07.201 ], 00:17:07.201 "product_name": "passthru", 00:17:07.201 "block_size": 512, 00:17:07.201 "num_blocks": 16384, 00:17:07.201 "uuid": "fc3fbb32-d219-5290-8754-1d7196167331", 00:17:07.201 "assigned_rate_limits": { 00:17:07.201 "rw_ios_per_sec": 0, 00:17:07.201 "rw_mbytes_per_sec": 0, 00:17:07.201 "r_mbytes_per_sec": 0, 00:17:07.201 "w_mbytes_per_sec": 0 00:17:07.201 }, 00:17:07.201 "claimed": false, 00:17:07.201 "zoned": false, 00:17:07.201 "supported_io_types": { 00:17:07.201 "read": true, 00:17:07.201 "write": true, 00:17:07.201 "unmap": true, 00:17:07.201 "write_zeroes": true, 00:17:07.201 "flush": true, 00:17:07.201 "reset": true, 00:17:07.201 "compare": false, 00:17:07.201 "compare_and_write": false, 00:17:07.201 "abort": true, 00:17:07.201 "nvme_admin": false, 00:17:07.201 "nvme_io": false 00:17:07.201 }, 00:17:07.201 "memory_domains": [ 00:17:07.201 { 00:17:07.201 "dma_device_id": "system", 00:17:07.201 "dma_device_type": 1 00:17:07.201 }, 00:17:07.201 { 00:17:07.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.201 "dma_device_type": 2 00:17:07.201 } 00:17:07.201 ], 00:17:07.201 "driver_specific": { 00:17:07.201 "passthru": { 00:17:07.201 "name": "Passthru0", 00:17:07.201 "base_bdev_name": "Malloc2" 00:17:07.201 } 00:17:07.201 } 00:17:07.201 } 00:17:07.201 ]' 00:17:07.201 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:17:07.459 00:17:07.459 real 0m0.328s 00:17:07.459 user 0m0.218s 
00:17:07.459 sys 0m0.045s 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:07.459 12:19:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:17:07.459 ************************************ 00:17:07.459 END TEST rpc_daemon_integrity 00:17:07.459 ************************************ 00:17:07.459 12:19:30 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:17:07.459 12:19:31 rpc -- rpc/rpc.sh@84 -- # killprocess 190675 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@949 -- # '[' -z 190675 ']' 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@953 -- # kill -0 190675 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@954 -- # uname 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 190675 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:07.459 killing process with pid 190675 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 190675' 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@968 -- # kill 190675 00:17:07.459 12:19:31 rpc -- common/autotest_common.sh@973 -- # wait 190675 00:17:08.027 ************************************ 00:17:08.027 END TEST rpc 00:17:08.027 ************************************ 00:17:08.027 00:17:08.027 real 0m3.025s 00:17:08.027 user 0m3.587s 00:17:08.027 sys 0m0.921s 00:17:08.027 12:19:31 rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:08.027 12:19:31 rpc -- common/autotest_common.sh@10 -- # set +x 00:17:08.286 12:19:31 -- spdk/autotest.sh@170 -- # run_test skip_rpc /home/vagrant/spdk_repo/spdk/test/rpc/skip_rpc.sh 00:17:08.286 12:19:31 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:08.286 12:19:31 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:08.286 12:19:31 -- common/autotest_common.sh@10 -- # set +x 00:17:08.286 ************************************ 00:17:08.286 START TEST skip_rpc 00:17:08.286 ************************************ 00:17:08.286 12:19:31 skip_rpc -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/rpc/skip_rpc.sh 00:17:08.286 * Looking for test storage... 
00:17:08.286 * Found test storage at /home/vagrant/spdk_repo/spdk/test/rpc 00:17:08.286 12:19:31 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:17:08.286 12:19:31 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/home/vagrant/spdk_repo/spdk/test/rpc/log.txt 00:17:08.286 12:19:31 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:17:08.286 12:19:31 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:08.286 12:19:31 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:08.286 12:19:31 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:08.286 ************************************ 00:17:08.286 START TEST skip_rpc 00:17:08.286 ************************************ 00:17:08.286 12:19:31 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # test_skip_rpc 00:17:08.286 12:19:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=190903 00:17:08.286 12:19:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:17:08.286 12:19:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:17:08.286 12:19:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:17:08.286 [2024-06-07 12:19:31.865396] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:17:08.286 [2024-06-07 12:19:31.866487] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190903 ] 00:17:08.545 [2024-06-07 12:19:32.013689] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:08.545 [2024-06-07 12:19:32.105298] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@649 -- # local es=0 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # valid_exec_arg rpc_cmd spdk_get_version 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@637 -- # local arg=rpc_cmd 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # type -t rpc_cmd 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # rpc_cmd spdk_get_version 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # es=1 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 190903 00:17:13.824 12:19:36 
skip_rpc.skip_rpc -- common/autotest_common.sh@949 -- # '[' -z 190903 ']' 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # kill -0 190903 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # uname 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 190903 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 190903' 00:17:13.824 killing process with pid 190903 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # kill 190903 00:17:13.824 12:19:36 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # wait 190903 00:17:14.083 00:17:14.083 real 0m5.665s 00:17:14.083 user 0m5.121s 00:17:14.083 sys 0m0.439s 00:17:14.083 12:19:37 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:14.083 12:19:37 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:14.083 ************************************ 00:17:14.083 END TEST skip_rpc 00:17:14.083 ************************************ 00:17:14.083 12:19:37 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:17:14.083 12:19:37 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:14.083 12:19:37 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:14.083 12:19:37 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:14.083 ************************************ 00:17:14.083 START TEST skip_rpc_with_json 00:17:14.083 ************************************ 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_json 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=190996 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 190996 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@830 -- # '[' -z 190996 ']' 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:14.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:14.083 12:19:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:17:14.083 [2024-06-07 12:19:37.603616] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:17:14.083 [2024-06-07 12:19:37.604177] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190996 ] 00:17:14.340 [2024-06-07 12:19:37.750695] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:14.340 [2024-06-07 12:19:37.843440] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@863 -- # return 0 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:17:15.347 [2024-06-07 12:19:38.646152] nvmf_rpc.c:2558:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:17:15.347 request: 00:17:15.347 { 00:17:15.347 "trtype": "tcp", 00:17:15.347 "method": "nvmf_get_transports", 00:17:15.347 "req_id": 1 00:17:15.347 } 00:17:15.347 Got JSON-RPC error response 00:17:15.347 response: 00:17:15.347 { 00:17:15.347 "code": -19, 00:17:15.347 "message": "No such device" 00:17:15.347 } 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:17:15.347 [2024-06-07 12:19:38.662300] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:15.347 12:19:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:17:15.347 { 00:17:15.347 "subsystems": [ 00:17:15.347 { 00:17:15.347 "subsystem": "scheduler", 00:17:15.347 "config": [ 00:17:15.347 { 00:17:15.347 "method": "framework_set_scheduler", 00:17:15.347 "params": { 00:17:15.347 "name": "static" 00:17:15.347 } 00:17:15.347 } 00:17:15.347 ] 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "subsystem": "vmd", 00:17:15.347 "config": [] 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "subsystem": "sock", 00:17:15.347 "config": [ 00:17:15.347 { 00:17:15.347 "method": "sock_set_default_impl", 00:17:15.347 "params": { 00:17:15.347 "impl_name": "posix" 00:17:15.347 } 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "method": "sock_impl_set_options", 00:17:15.347 "params": { 00:17:15.347 "impl_name": "ssl", 00:17:15.347 "recv_buf_size": 4096, 00:17:15.347 "send_buf_size": 4096, 00:17:15.347 "enable_recv_pipe": true, 00:17:15.347 "enable_quickack": false, 00:17:15.347 "enable_placement_id": 0, 00:17:15.347 
"enable_zerocopy_send_server": true, 00:17:15.347 "enable_zerocopy_send_client": false, 00:17:15.347 "zerocopy_threshold": 0, 00:17:15.347 "tls_version": 0, 00:17:15.347 "enable_ktls": false 00:17:15.347 } 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "method": "sock_impl_set_options", 00:17:15.347 "params": { 00:17:15.347 "impl_name": "posix", 00:17:15.347 "recv_buf_size": 2097152, 00:17:15.347 "send_buf_size": 2097152, 00:17:15.347 "enable_recv_pipe": true, 00:17:15.347 "enable_quickack": false, 00:17:15.347 "enable_placement_id": 0, 00:17:15.347 "enable_zerocopy_send_server": true, 00:17:15.347 "enable_zerocopy_send_client": false, 00:17:15.347 "zerocopy_threshold": 0, 00:17:15.347 "tls_version": 0, 00:17:15.347 "enable_ktls": false 00:17:15.347 } 00:17:15.347 } 00:17:15.347 ] 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "subsystem": "iobuf", 00:17:15.347 "config": [ 00:17:15.347 { 00:17:15.347 "method": "iobuf_set_options", 00:17:15.347 "params": { 00:17:15.347 "small_pool_count": 8192, 00:17:15.347 "large_pool_count": 1024, 00:17:15.347 "small_bufsize": 8192, 00:17:15.347 "large_bufsize": 135168 00:17:15.347 } 00:17:15.347 } 00:17:15.347 ] 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "subsystem": "keyring", 00:17:15.347 "config": [] 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "subsystem": "accel", 00:17:15.347 "config": [ 00:17:15.347 { 00:17:15.347 "method": "accel_set_options", 00:17:15.347 "params": { 00:17:15.347 "small_cache_size": 128, 00:17:15.347 "large_cache_size": 16, 00:17:15.347 "task_count": 2048, 00:17:15.347 "sequence_count": 2048, 00:17:15.347 "buf_count": 2048 00:17:15.347 } 00:17:15.347 } 00:17:15.347 ] 00:17:15.347 }, 00:17:15.347 { 00:17:15.347 "subsystem": "bdev", 00:17:15.347 "config": [ 00:17:15.347 { 00:17:15.347 "method": "bdev_set_options", 00:17:15.347 "params": { 00:17:15.347 "bdev_io_pool_size": 65535, 00:17:15.347 "bdev_io_cache_size": 256, 00:17:15.347 "bdev_auto_examine": true, 00:17:15.347 "iobuf_small_cache_size": 128, 00:17:15.347 "iobuf_large_cache_size": 16 00:17:15.347 } 00:17:15.347 }, 00:17:15.348 { 00:17:15.348 "method": "bdev_raid_set_options", 00:17:15.348 "params": { 00:17:15.348 "process_window_size_kb": 1024 00:17:15.348 } 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "method": "bdev_nvme_set_options", 00:17:15.348 "params": { 00:17:15.348 "action_on_timeout": "none", 00:17:15.348 "timeout_us": 0, 00:17:15.348 "timeout_admin_us": 0, 00:17:15.348 "keep_alive_timeout_ms": 10000, 00:17:15.348 "arbitration_burst": 0, 00:17:15.348 "low_priority_weight": 0, 00:17:15.348 "medium_priority_weight": 0, 00:17:15.348 "high_priority_weight": 0, 00:17:15.348 "nvme_adminq_poll_period_us": 10000, 00:17:15.348 "nvme_ioq_poll_period_us": 0, 00:17:15.348 "io_queue_requests": 0, 00:17:15.348 "delay_cmd_submit": true, 00:17:15.348 "transport_retry_count": 4, 00:17:15.348 "bdev_retry_count": 3, 00:17:15.348 "transport_ack_timeout": 0, 00:17:15.348 "ctrlr_loss_timeout_sec": 0, 00:17:15.348 "reconnect_delay_sec": 0, 00:17:15.348 "fast_io_fail_timeout_sec": 0, 00:17:15.348 "disable_auto_failback": false, 00:17:15.348 "generate_uuids": false, 00:17:15.348 "transport_tos": 0, 00:17:15.348 "nvme_error_stat": false, 00:17:15.348 "rdma_srq_size": 0, 00:17:15.348 "io_path_stat": false, 00:17:15.348 "allow_accel_sequence": false, 00:17:15.348 "rdma_max_cq_size": 0, 00:17:15.348 "rdma_cm_event_timeout_ms": 0, 00:17:15.348 "dhchap_digests": [ 00:17:15.348 "sha256", 00:17:15.348 "sha384", 00:17:15.348 "sha512" 00:17:15.348 ], 00:17:15.348 "dhchap_dhgroups": [ 00:17:15.348 "null", 
00:17:15.348 "ffdhe2048", 00:17:15.348 "ffdhe3072", 00:17:15.348 "ffdhe4096", 00:17:15.348 "ffdhe6144", 00:17:15.348 "ffdhe8192" 00:17:15.348 ] 00:17:15.348 } 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "method": "bdev_nvme_set_hotplug", 00:17:15.348 "params": { 00:17:15.348 "period_us": 100000, 00:17:15.348 "enable": false 00:17:15.348 } 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "method": "bdev_iscsi_set_options", 00:17:15.348 "params": { 00:17:15.348 "timeout_sec": 30 00:17:15.348 } 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "method": "bdev_wait_for_examine" 00:17:15.348 } 00:17:15.348 ] 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "subsystem": "nvmf", 00:17:15.348 "config": [ 00:17:15.348 { 00:17:15.348 "method": "nvmf_set_config", 00:17:15.348 "params": { 00:17:15.348 "discovery_filter": "match_any", 00:17:15.348 "admin_cmd_passthru": { 00:17:15.348 "identify_ctrlr": false 00:17:15.348 } 00:17:15.348 } 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "method": "nvmf_set_max_subsystems", 00:17:15.348 "params": { 00:17:15.348 "max_subsystems": 1024 00:17:15.348 } 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "method": "nvmf_set_crdt", 00:17:15.348 "params": { 00:17:15.348 "crdt1": 0, 00:17:15.348 "crdt2": 0, 00:17:15.348 "crdt3": 0 00:17:15.348 } 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "method": "nvmf_create_transport", 00:17:15.348 "params": { 00:17:15.348 "trtype": "TCP", 00:17:15.348 "max_queue_depth": 128, 00:17:15.348 "max_io_qpairs_per_ctrlr": 127, 00:17:15.348 "in_capsule_data_size": 4096, 00:17:15.348 "max_io_size": 131072, 00:17:15.348 "io_unit_size": 131072, 00:17:15.348 "max_aq_depth": 128, 00:17:15.348 "num_shared_buffers": 511, 00:17:15.348 "buf_cache_size": 4294967295, 00:17:15.348 "dif_insert_or_strip": false, 00:17:15.348 "zcopy": false, 00:17:15.348 "c2h_success": true, 00:17:15.348 "sock_priority": 0, 00:17:15.348 "abort_timeout_sec": 1, 00:17:15.348 "ack_timeout": 0, 00:17:15.348 "data_wr_pool_size": 0 00:17:15.348 } 00:17:15.348 } 00:17:15.348 ] 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "subsystem": "nbd", 00:17:15.348 "config": [] 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "subsystem": "vhost_blk", 00:17:15.348 "config": [] 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "subsystem": "scsi", 00:17:15.348 "config": null 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "subsystem": "iscsi", 00:17:15.348 "config": [ 00:17:15.348 { 00:17:15.348 "method": "iscsi_set_options", 00:17:15.348 "params": { 00:17:15.348 "node_base": "iqn.2016-06.io.spdk", 00:17:15.348 "max_sessions": 128, 00:17:15.348 "max_connections_per_session": 2, 00:17:15.348 "max_queue_depth": 64, 00:17:15.348 "default_time2wait": 2, 00:17:15.348 "default_time2retain": 20, 00:17:15.348 "first_burst_length": 8192, 00:17:15.348 "immediate_data": true, 00:17:15.348 "allow_duplicated_isid": false, 00:17:15.348 "error_recovery_level": 0, 00:17:15.348 "nop_timeout": 60, 00:17:15.348 "nop_in_interval": 30, 00:17:15.348 "disable_chap": false, 00:17:15.348 "require_chap": false, 00:17:15.348 "mutual_chap": false, 00:17:15.348 "chap_group": 0, 00:17:15.348 "max_large_datain_per_connection": 64, 00:17:15.348 "max_r2t_per_connection": 4, 00:17:15.348 "pdu_pool_size": 36864, 00:17:15.348 "immediate_data_pool_size": 16384, 00:17:15.348 "data_out_pool_size": 2048 00:17:15.348 } 00:17:15.348 } 00:17:15.348 ] 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "subsystem": "vhost_scsi", 00:17:15.348 "config": [] 00:17:15.348 } 00:17:15.348 ] 00:17:15.348 } 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap 
- SIGINT SIGTERM EXIT 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 190996 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 190996 ']' 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 190996 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 190996 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 190996' 00:17:15.348 killing process with pid 190996 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 190996 00:17:15.348 12:19:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 190996 00:17:15.913 12:19:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=191036 00:17:15.913 12:19:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:17:15.913 12:19:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 191036 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 191036 ']' 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 191036 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 191036 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 191036' 00:17:21.179 killing process with pid 191036 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 191036 00:17:21.179 12:19:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 191036 00:17:21.746 12:19:45 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /home/vagrant/spdk_repo/spdk/test/rpc/log.txt 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /home/vagrant/spdk_repo/spdk/test/rpc/log.txt 00:17:21.747 00:17:21.747 real 0m7.568s 00:17:21.747 user 0m6.963s 00:17:21.747 sys 0m0.998s 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:17:21.747 ************************************ 00:17:21.747 END TEST skip_rpc_with_json 00:17:21.747 
************************************ 00:17:21.747 12:19:45 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:17:21.747 12:19:45 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:21.747 12:19:45 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:21.747 12:19:45 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:21.747 ************************************ 00:17:21.747 START TEST skip_rpc_with_delay 00:17:21.747 ************************************ 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_delay 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@649 -- # local es=0 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt ]] 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:17:21.747 [2024-06-07 12:19:45.236882] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:17:21.747 [2024-06-07 12:19:45.237569] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # es=1 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:17:21.747 00:17:21.747 real 0m0.088s 00:17:21.747 user 0m0.040s 00:17:21.747 sys 0m0.045s 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:21.747 12:19:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:17:21.747 ************************************ 00:17:21.747 END TEST skip_rpc_with_delay 00:17:21.747 ************************************ 00:17:21.747 12:19:45 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:17:21.747 12:19:45 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:17:21.747 12:19:45 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:17:21.747 12:19:45 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:21.747 12:19:45 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:21.747 12:19:45 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:21.747 ************************************ 00:17:21.747 START TEST exit_on_failed_rpc_init 00:17:21.747 ************************************ 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # test_exit_on_failed_rpc_init 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=191158 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 191158 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@830 -- # '[' -z 191158 ']' 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:21.747 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:21.747 12:19:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:17:22.005 [2024-06-07 12:19:45.390617] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:17:22.005 [2024-06-07 12:19:45.391158] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191158 ] 00:17:22.005 [2024-06-07 12:19:45.543560] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:22.005 [2024-06-07 12:19:45.633380] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@863 -- # return 0 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x2 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@649 -- # local es=0 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x2 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt ]] 00:17:22.940 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x2 00:17:22.940 [2024-06-07 12:19:46.357456] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:17:22.940 [2024-06-07 12:19:46.357984] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191181 ] 00:17:22.940 [2024-06-07 12:19:46.505048] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:23.224 [2024-06-07 12:19:46.596221] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:17:23.224 [2024-06-07 12:19:46.596642] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
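What fails here is not the second target itself but its RPC listener: pid 191158 already owns the default socket /var/tmp/spdk.sock, so rpc.c in the 0x2 instance reports it in use and the app stops. A sketch of the collision, with flags taken from the trace (spdk_tgt abbreviates the full build/bin path above):

    spdk_tgt -m 0x1 &        # first instance claims /var/tmp/spdk.sock
    pid=$!
    # ...wait until [ -S /var/tmp/spdk.sock ] before continuing...
    spdk_tgt -m 0x2          # fails: "RPC Unix domain socket path ... in use"
    # a second instance needs its own socket, e.g. spdk_tgt -r /var/tmp/other.sock
    kill -SIGINT "$pid"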
00:17:23.224 [2024-06-07 12:19:46.596794] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:17:23.224 [2024-06-07 12:19:46.596937] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # es=234 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # es=106 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # case "$es" in 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@669 -- # es=1 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 191158 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@949 -- # '[' -z 191158 ']' 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # kill -0 191158 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # uname 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 191158 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # echo 'killing process with pid 191158' 00:17:23.224 killing process with pid 191158 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # kill 191158 00:17:23.224 12:19:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # wait 191158 00:17:24.160 00:17:24.160 real 0m2.097s 00:17:24.160 user 0m2.177s 00:17:24.160 sys 0m0.631s 00:17:24.160 12:19:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:24.160 12:19:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:17:24.160 ************************************ 00:17:24.160 END TEST exit_on_failed_rpc_init 00:17:24.160 ************************************ 00:17:24.160 12:19:47 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:17:24.160 00:17:24.160 real 0m15.795s 00:17:24.160 user 0m14.436s 00:17:24.160 sys 0m2.355s 00:17:24.160 12:19:47 skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:24.160 12:19:47 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:24.160 ************************************ 00:17:24.160 END TEST skip_rpc 00:17:24.160 ************************************ 00:17:24.160 12:19:47 -- spdk/autotest.sh@171 -- # run_test rpc_client /home/vagrant/spdk_repo/spdk/test/rpc_client/rpc_client.sh 00:17:24.160 12:19:47 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:24.160 12:19:47 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:24.160 12:19:47 -- common/autotest_common.sh@10 -- # set +x 
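Every suite in this log is driven through the same run_test helper, which prints the START/END banners and the real/user/sys triplets seen above. Its shape can be reconstructed from the trace; this is a sketch, not the actual autotest_common.sh implementation:

    run_test() {
        local name=$1; shift
        echo "START TEST $name"
        time "$@"            # produces the real/user/sys lines in the log
        local rc=$?
        echo "END TEST $name"
        return $rc
    }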
00:17:24.160 ************************************ 00:17:24.160 START TEST rpc_client 00:17:24.160 ************************************ 00:17:24.160 12:19:47 rpc_client -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/rpc_client/rpc_client.sh 00:17:24.160 * Looking for test storage... 00:17:24.160 * Found test storage at /home/vagrant/spdk_repo/spdk/test/rpc_client 00:17:24.160 12:19:47 rpc_client -- rpc_client/rpc_client.sh@10 -- # /home/vagrant/spdk_repo/spdk/test/rpc_client/rpc_client_test 00:17:24.160 OK 00:17:24.160 12:19:47 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:17:24.160 00:17:24.160 real 0m0.142s 00:17:24.160 user 0m0.064s 00:17:24.160 sys 0m0.086s 00:17:24.160 12:19:47 rpc_client -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:24.160 12:19:47 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:17:24.160 ************************************ 00:17:24.160 END TEST rpc_client 00:17:24.160 ************************************ 00:17:24.160 12:19:47 -- spdk/autotest.sh@172 -- # run_test json_config /home/vagrant/spdk_repo/spdk/test/json_config/json_config.sh 00:17:24.160 12:19:47 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:24.160 12:19:47 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:24.160 12:19:47 -- common/autotest_common.sh@10 -- # set +x 00:17:24.160 ************************************ 00:17:24.160 START TEST json_config 00:17:24.160 ************************************ 00:17:24.160 12:19:47 json_config -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/json_config/json_config.sh 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@8 -- # source /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@7 -- # uname -s 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:de0b6253-4c30-4fab-8e86-8d05558b7c6b 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=de0b6253-4c30-4fab-8e86-8d05558b7c6b 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@45 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:17:24.419 12:19:47 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:17:24.419 12:19:47 json_config -- 
scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:17:24.419 12:19:47 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:17:24.419 12:19:47 json_config -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:24.419 12:19:47 json_config -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:24.419 12:19:47 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:24.419 12:19:47 json_config -- paths/export.sh@5 -- # export PATH 00:17:24.419 12:19:47 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@47 -- # : 0 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:17:24.419 12:19:47 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/json_config/common.sh 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:17:24.419 12:19:47 
json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/home/vagrant/spdk_repo/spdk/spdk_tgt_config.json' ['initiator']='/home/vagrant/spdk_repo/spdk/spdk_initiator_config.json') 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:17:24.419 INFO: JSON configuration test init 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:24.419 12:19:47 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:17:24.419 12:19:47 json_config -- json_config/common.sh@9 -- # local app=target 00:17:24.419 12:19:47 json_config -- json_config/common.sh@10 -- # shift 00:17:24.419 12:19:47 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:17:24.419 12:19:47 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:17:24.419 12:19:47 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:17:24.419 12:19:47 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:17:24.419 12:19:47 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:17:24.419 12:19:47 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=191320 00:17:24.419 12:19:47 json_config -- json_config/common.sh@21 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:17:24.419 12:19:47 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:17:24.419 Waiting for target to run... 00:17:24.419 12:19:47 json_config -- json_config/common.sh@25 -- # waitforlisten 191320 /var/tmp/spdk_tgt.sock 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@830 -- # '[' -z 191320 ']' 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:17:24.419 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
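waitforlisten blocks until the freshly started target both stays alive and exposes its RPC socket; max_retries=100 in the trace bounds the polling. A rough stand-in, assuming socket presence is a good enough readiness signal (the real helper in autotest_common.sh does more, such as confirming the socket accepts an RPC):

    waitforlisten_sketch() {
        local pid=$1 sock=${2:-/var/tmp/spdk_tgt.sock} i
        for (( i = 0; i < 100; i++ )); do
            kill -0 "$pid" 2>/dev/null || return 1   # target died during startup
            [ -S "$sock" ] && return 0               # socket node exists: listening
            sleep 0.1
        done
        return 1
    }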
00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:24.419 12:19:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:24.419 [2024-06-07 12:19:47.907983] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:17:24.419 [2024-06-07 12:19:47.908719] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191320 ] 00:17:24.985 [2024-06-07 12:19:48.493375] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:24.985 [2024-06-07 12:19:48.549341] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:25.552 12:19:48 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:25.552 12:19:48 json_config -- common/autotest_common.sh@863 -- # return 0 00:17:25.552 12:19:48 json_config -- json_config/common.sh@26 -- # echo '' 00:17:25.552 00:17:25.552 12:19:48 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:17:25.552 12:19:48 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:17:25.552 12:19:48 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:25.552 12:19:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:25.552 12:19:48 json_config -- json_config/json_config.sh@95 -- # [[ 0 -eq 1 ]] 00:17:25.552 12:19:48 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:17:25.552 12:19:48 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:25.552 12:19:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:25.552 12:19:48 json_config -- json_config/json_config.sh@273 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:17:25.552 12:19:48 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:17:25.552 12:19:48 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:17:25.811 12:19:49 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:17:25.811 12:19:49 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:17:25.811 12:19:49 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:25.811 12:19:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:25.811 12:19:49 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:17:25.811 12:19:49 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:17:25.811 12:19:49 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:17:25.811 12:19:49 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:17:25.811 12:19:49 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:17:25.811 12:19:49 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@48 -- # local get_types 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:17:26.070 
12:19:49 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:17:26.070 12:19:49 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:26.070 12:19:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@55 -- # return 0 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:17:26.070 12:19:49 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:26.070 12:19:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:17:26.070 12:19:49 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:17:26.070 12:19:49 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:17:26.328 12:19:49 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:17:26.328 12:19:49 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:26.328 12:19:49 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:26.328 12:19:49 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:17:26.328 12:19:49 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:17:26.328 12:19:49 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:17:26.328 12:19:49 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:17:26.587 Nvme0n1p0 Nvme0n1p1 00:17:26.587 12:19:50 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:17:26.587 12:19:50 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:17:26.846 [2024-06-07 12:19:50.366986] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:17:26.846 [2024-06-07 12:19:50.367386] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:17:26.846 00:17:26.846 12:19:50 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:17:26.846 12:19:50 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:17:27.104 Malloc3 
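Each tgt_rpc line above expands to scripts/rpc.py pointed at /var/tmp/spdk_tgt.sock, and the test builds its bdev inventory one call at a time. The calls so far, written out directly (arguments copied from the trace):

    rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
    $rpc bdev_split_create Nvme0n1 2            # yields Nvme0n1p0 Nvme0n1p1
    $rpc bdev_split_create Malloc0 3            # yields Malloc0p0 Malloc0p1 Malloc0p2
    $rpc bdev_malloc_create 8 4096 --name Malloc3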
00:17:27.104 12:19:50 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:17:27.104 12:19:50 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:17:27.363 [2024-06-07 12:19:50.859172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:17:27.363 [2024-06-07 12:19:50.859801] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:27.363 [2024-06-07 12:19:50.860082] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006f80 00:17:27.363 [2024-06-07 12:19:50.860298] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:27.363 [2024-06-07 12:19:50.863022] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:27.363 [2024-06-07 12:19:50.863286] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:17:27.363 PTBdevFromMalloc3 00:17:27.363 12:19:50 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:17:27.363 12:19:50 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:17:27.620 Null0 00:17:27.620 12:19:51 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:17:27.621 12:19:51 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:17:27.878 Malloc0 00:17:27.878 12:19:51 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:17:27.878 12:19:51 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:17:28.136 Malloc1 00:17:28.136 12:19:51 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:17:28.136 12:19:51 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:17:28.394 102400+0 records in 00:17:28.394 102400+0 records out 00:17:28.394 104857600 bytes (105 MB, 100 MiB) copied, 0.319996 s, 328 MB/s 00:17:28.394 12:19:51 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:17:28.394 12:19:51 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:17:28.652 aio_disk 00:17:28.652 12:19:52 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:17:28.652 12:19:52 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:17:28.652 12:19:52 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:17:28.910 b5808331-8b80-42ff-8814-f8a86f582e8e 00:17:29.168 12:19:52 json_config -- 
json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:17:29.168 12:19:52 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:17:29.168 12:19:52 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:17:29.425 12:19:52 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:17:29.425 12:19:52 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:17:29.682 12:19:53 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:17:29.682 12:19:53 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:17:29.940 12:19:53 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:17:29.940 12:19:53 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@157 -- # [[ 0 -eq 1 ]] 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:6086f182-db8e-4fcb-b25f-0367611fe46d bdev_register:8030b8f0-9227-4457-868a-33d0011b0d2b bdev_register:79b60c0c-b8b5-4fbe-a71c-4f052f2e9ffa bdev_register:b5be9755-38dd-46fb-9486-2af6188dfb73 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@71 -- # sort 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:6086f182-db8e-4fcb-b25f-0367611fe46d bdev_register:8030b8f0-9227-4457-868a-33d0011b0d2b bdev_register:79b60c0c-b8b5-4fbe-a71c-4f052f2e9ffa bdev_register:b5be9755-38dd-46fb-9486-2af6188dfb73 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@72 -- # sort 00:17:30.199 
12:19:53 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:17:30.199 12:19:53 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:17:30.199 12:19:53 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 
-- # echo bdev_register:Malloc1 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:6086f182-db8e-4fcb-b25f-0367611fe46d 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:8030b8f0-9227-4457-868a-33d0011b0d2b 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:79b60c0c-b8b5-4fbe-a71c-4f052f2e9ffa 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:b5be9755-38dd-46fb-9486-2af6188dfb73 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:17:30.458 12:19:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:6086f182-db8e-4fcb-b25f-0367611fe46d bdev_register:79b60c0c-b8b5-4fbe-a71c-4f052f2e9ffa bdev_register:8030b8f0-9227-4457-868a-33d0011b0d2b bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 bdev_register:aio_disk bdev_register:b5be9755-38dd-46fb-9486-2af6188dfb73 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\0\8\6\f\1\8\2\-\d\b\8\e\-\4\f\c\b\-\b\2\5\f\-\0\3\6\7\6\1\1\f\e\4\6\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\9\b\6\0\c\0\c\-\b\8\b\5\-\4\f\b\e\-\a\7\1\c\-\4\f\0\5\2\f\2\e\9\f\f\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\0\3\0\b\8\f\0\-\9\2\2\7\-\4\4\5\7\-\8\6\8\a\-\3\3\d\0\0\1\1\b\0\d\2\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\5\b\e\9\7\5\5\-\3\8\d\d\-\4\6\f\b\-\9\4\8\6\-\2\a\f\6\1\8\8\d\f\b\7\3 ]] 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@86 -- # cat 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:6086f182-db8e-4fcb-b25f-0367611fe46d bdev_register:79b60c0c-b8b5-4fbe-a71c-4f052f2e9ffa 
bdev_register:8030b8f0-9227-4457-868a-33d0011b0d2b bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 bdev_register:aio_disk bdev_register:b5be9755-38dd-46fb-9486-2af6188dfb73 00:17:30.459 Expected events matched: 00:17:30.459 bdev_register:6086f182-db8e-4fcb-b25f-0367611fe46d 00:17:30.459 bdev_register:79b60c0c-b8b5-4fbe-a71c-4f052f2e9ffa 00:17:30.459 bdev_register:8030b8f0-9227-4457-868a-33d0011b0d2b 00:17:30.459 bdev_register:Malloc0 00:17:30.459 bdev_register:Malloc0p0 00:17:30.459 bdev_register:Malloc0p1 00:17:30.459 bdev_register:Malloc0p2 00:17:30.459 bdev_register:Malloc1 00:17:30.459 bdev_register:Malloc3 00:17:30.459 bdev_register:Null0 00:17:30.459 bdev_register:Nvme0n1 00:17:30.459 bdev_register:Nvme0n1p0 00:17:30.459 bdev_register:Nvme0n1p1 00:17:30.459 bdev_register:PTBdevFromMalloc3 00:17:30.459 bdev_register:aio_disk 00:17:30.459 bdev_register:b5be9755-38dd-46fb-9486-2af6188dfb73 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:17:30.459 12:19:53 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:30.459 12:19:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:17:30.459 12:19:53 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:30.459 12:19:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:17:30.459 12:19:53 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:17:30.459 12:19:53 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:17:30.717 MallocBdevForConfigChangeCheck 00:17:30.717 12:19:54 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:17:30.717 12:19:54 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:30.717 12:19:54 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:30.977 12:19:54 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:17:30.977 12:19:54 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:17:31.236 12:19:54 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:17:31.236 INFO: shutting down applications... 
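The "Expected events matched" block is the end of a sorted-list comparison: the test records every bdev_register notification it expects while building the configuration, then reads the target's notification log back and compares the two. The read side, as it appears in the trace:

    # Flatten each notification to "type:ctx:id", starting from event id 0, then
    # sort; the expected list is sorted the same way before the comparison.
    /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock \
        notify_get_notifications -i 0 \
        | jq -r '.[] | "\(.type):\(.ctx):\(.id)"' \
        | sort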
00:17:31.236 12:19:54 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:17:31.236 12:19:54 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:17:31.236 12:19:54 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:17:31.236 12:19:54 json_config -- json_config/json_config.sh@333 -- # /home/vagrant/spdk_repo/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:17:31.495 [2024-06-07 12:19:54.936838] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:17:31.495 Calling clear_vhost_scsi_subsystem 00:17:31.495 Calling clear_iscsi_subsystem 00:17:31.495 Calling clear_vhost_blk_subsystem 00:17:31.495 Calling clear_nbd_subsystem 00:17:31.495 Calling clear_nvmf_subsystem 00:17:31.495 Calling clear_bdev_subsystem 00:17:31.495 12:19:55 json_config -- json_config/json_config.sh@337 -- # local config_filter=/home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py 00:17:31.495 12:19:55 json_config -- json_config/json_config.sh@343 -- # count=100 00:17:31.495 12:19:55 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:17:31.495 12:19:55 json_config -- json_config/json_config.sh@345 -- # /home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:17:31.495 12:19:55 json_config -- json_config/json_config.sh@345 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:17:31.495 12:19:55 json_config -- json_config/json_config.sh@345 -- # /home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py -method check_empty 00:17:32.062 12:19:55 json_config -- json_config/json_config.sh@345 -- # break 00:17:32.062 12:19:55 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:17:32.062 12:19:55 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:17:32.062 12:19:55 json_config -- json_config/common.sh@31 -- # local app=target 00:17:32.062 12:19:55 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:17:32.062 12:19:55 json_config -- json_config/common.sh@35 -- # [[ -n 191320 ]] 00:17:32.062 12:19:55 json_config -- json_config/common.sh@38 -- # kill -SIGINT 191320 00:17:32.062 12:19:55 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:17:32.062 12:19:55 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:17:32.062 12:19:55 json_config -- json_config/common.sh@41 -- # kill -0 191320 00:17:32.062 12:19:55 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:17:32.627 12:19:56 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:17:32.627 12:19:56 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:17:32.627 12:19:56 json_config -- json_config/common.sh@41 -- # kill -0 191320 00:17:32.627 12:19:56 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:17:32.627 12:19:56 json_config -- json_config/common.sh@43 -- # break 00:17:32.627 12:19:56 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:17:32.628 12:19:56 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:17:32.628 SPDK target shutdown done 00:17:32.628 12:19:56 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:17:32.628 INFO: relaunching applications... 
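Shutdown is cooperative: one SIGINT, then up to 30 polls at half-second intervals until kill -0 stops succeeding. The loop, reconstructed from the i / kill -0 / sleep 0.5 lines above:

    kill -SIGINT "$pid"
    for (( i = 0; i < 30; i++ )); do
        kill -0 "$pid" 2>/dev/null || break   # gone: "SPDK target shutdown done"
        sleep 0.5
    done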
00:17:32.628 12:19:56 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json 00:17:32.628 12:19:56 json_config -- json_config/common.sh@9 -- # local app=target 00:17:32.628 12:19:56 json_config -- json_config/common.sh@10 -- # shift 00:17:32.628 12:19:56 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:17:32.628 12:19:56 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:17:32.628 12:19:56 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:17:32.628 12:19:56 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:17:32.628 12:19:56 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:17:32.628 12:19:56 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=191569 00:17:32.628 12:19:56 json_config -- json_config/common.sh@21 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json 00:17:32.628 Waiting for target to run... 00:17:32.628 12:19:56 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:17:32.628 12:19:56 json_config -- json_config/common.sh@25 -- # waitforlisten 191569 /var/tmp/spdk_tgt.sock 00:17:32.628 12:19:56 json_config -- common/autotest_common.sh@830 -- # '[' -z 191569 ']' 00:17:32.628 12:19:56 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:17:32.628 12:19:56 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:32.628 12:19:56 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:17:32.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:17:32.628 12:19:56 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:32.628 12:19:56 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:32.628 [2024-06-07 12:19:56.106746] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
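The relaunch is the round-trip half of the check: the JSON saved from the first target is handed back on the command line, so the new instance (pid 191569) must rebuild the identical configuration from it. The invocation, from the trace:

    /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 \
        -r /var/tmp/spdk_tgt.sock \
        --json /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json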
00:17:32.628 [2024-06-07 12:19:56.107729] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191569 ] 00:17:33.195 [2024-06-07 12:19:56.671524] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:33.195 [2024-06-07 12:19:56.720374] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:33.468 [2024-06-07 12:19:56.876046] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:17:33.468 [2024-06-07 12:19:56.876436] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:17:33.468 [2024-06-07 12:19:56.883987] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:17:33.468 [2024-06-07 12:19:56.884256] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:17:33.468 [2024-06-07 12:19:56.892051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:17:33.468 [2024-06-07 12:19:56.892369] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:17:33.468 [2024-06-07 12:19:56.892544] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:17:33.468 [2024-06-07 12:19:56.974689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:17:33.468 [2024-06-07 12:19:56.975104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.468 [2024-06-07 12:19:56.975203] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:17:33.468 [2024-06-07 12:19:56.975517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.468 [2024-06-07 12:19:56.976138] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.468 [2024-06-07 12:19:56.976330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:17:33.729 12:19:57 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:33.729 12:19:57 json_config -- common/autotest_common.sh@863 -- # return 0 00:17:33.729 00:17:33.729 12:19:57 json_config -- json_config/common.sh@26 -- # echo '' 00:17:33.729 12:19:57 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:17:33.729 12:19:57 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:17:33.729 INFO: Checking if target configuration is the same... 00:17:33.729 12:19:57 json_config -- json_config/json_config.sh@378 -- # /home/vagrant/spdk_repo/spdk/test/json_config/json_diff.sh /dev/fd/62 /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json 00:17:33.729 12:19:57 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:17:33.729 12:19:57 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:17:33.729 + '[' 2 -ne 2 ']' 00:17:33.729 +++ dirname /home/vagrant/spdk_repo/spdk/test/json_config/json_diff.sh 00:17:33.729 ++ readlink -f /home/vagrant/spdk_repo/spdk/test/json_config/../.. 
00:17:33.729 + rootdir=/home/vagrant/spdk_repo/spdk 00:17:33.729 +++ basename /dev/fd/62 00:17:33.729 ++ mktemp /tmp/62.XXX 00:17:33.729 + tmp_file_1=/tmp/62.k60 00:17:33.729 +++ basename /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json 00:17:33.729 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:17:33.730 + tmp_file_2=/tmp/spdk_tgt_config.json.UOC 00:17:33.730 + ret=0 00:17:33.730 + /home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py -method sort 00:17:33.988 + /home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py -method sort 00:17:33.988 + diff -u /tmp/62.k60 /tmp/spdk_tgt_config.json.UOC 00:17:33.988 + echo 'INFO: JSON config files are the same' 00:17:33.988 INFO: JSON config files are the same 00:17:33.989 + rm /tmp/62.k60 /tmp/spdk_tgt_config.json.UOC 00:17:33.989 + exit 0 00:17:33.989 12:19:57 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:17:33.989 INFO: changing configuration and checking if this can be detected... 00:17:33.989 12:19:57 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:17:33.989 12:19:57 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:17:33.989 12:19:57 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:17:34.557 12:19:57 json_config -- json_config/json_config.sh@387 -- # /home/vagrant/spdk_repo/spdk/test/json_config/json_diff.sh /dev/fd/62 /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json 00:17:34.557 12:19:57 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:17:34.557 12:19:57 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:17:34.557 + '[' 2 -ne 2 ']' 00:17:34.557 +++ dirname /home/vagrant/spdk_repo/spdk/test/json_config/json_diff.sh 00:17:34.557 ++ readlink -f /home/vagrant/spdk_repo/spdk/test/json_config/../.. 00:17:34.557 + rootdir=/home/vagrant/spdk_repo/spdk 00:17:34.557 +++ basename /dev/fd/62 00:17:34.557 ++ mktemp /tmp/62.XXX 00:17:34.557 + tmp_file_1=/tmp/62.S4q 00:17:34.557 +++ basename /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json 00:17:34.557 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:17:34.557 + tmp_file_2=/tmp/spdk_tgt_config.json.Djr 00:17:34.557 + ret=0 00:17:34.557 + /home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py -method sort 00:17:34.816 + /home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py -method sort 00:17:34.816 + diff -u /tmp/62.S4q /tmp/spdk_tgt_config.json.Djr 00:17:34.816 + ret=1 00:17:34.816 + echo '=== Start of file: /tmp/62.S4q ===' 00:17:34.816 + cat /tmp/62.S4q 00:17:34.816 + echo '=== End of file: /tmp/62.S4q ===' 00:17:34.816 + echo '' 00:17:34.816 + echo '=== Start of file: /tmp/spdk_tgt_config.json.Djr ===' 00:17:34.816 + cat /tmp/spdk_tgt_config.json.Djr 00:17:34.816 + echo '=== End of file: /tmp/spdk_tgt_config.json.Djr ===' 00:17:34.816 + echo '' 00:17:34.816 + rm /tmp/62.S4q /tmp/spdk_tgt_config.json.Djr 00:17:34.816 + exit 1 00:17:34.816 INFO: configuration change detected. 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 
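Both comparisons normalize before diffing: the live configuration (save_config over the socket) and the on-disk JSON each pass through config_filter.py -method sort, so ordering differences cannot produce a false mismatch. Deleting MallocBdevForConfigChangeCheck is what turns ret=0 into ret=1. A condensed sketch of json_diff.sh's core, assuming the filter reads stdin as the redirection-free xtrace lines suggest:

    filter=/home/vagrant/spdk_repo/spdk/test/json_config/config_filter.py
    /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config \
        | $filter -method sort > /tmp/live.sorted
    $filter -method sort < /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json > /tmp/disk.sorted
    diff -u /tmp/live.sorted /tmp/disk.sorted   # empty output: configs are the same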
00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:17:34.816 12:19:58 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:34.816 12:19:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@317 -- # [[ -n 191569 ]] 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:17:34.816 12:19:58 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:34.816 12:19:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:17:34.816 12:19:58 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:17:34.816 12:19:58 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:17:35.076 12:19:58 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:17:35.076 12:19:58 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:17:35.336 12:19:58 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:17:35.336 12:19:58 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:17:35.594 12:19:59 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:17:35.594 12:19:59 json_config -- json_config/common.sh@57 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:17:35.853 12:19:59 json_config -- json_config/json_config.sh@193 -- # uname -s 00:17:35.853 12:19:59 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:17:35.853 12:19:59 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:17:35.853 12:19:59 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:17:35.853 12:19:59 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:35.853 12:19:59 json_config -- json_config/json_config.sh@323 -- # killprocess 191569 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@949 -- # '[' -z 191569 ']' 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@953 -- # kill -0 191569 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@954 -- # uname 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 191569 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:35.853 killing process with 
pid 191569 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@967 -- # echo 'killing process with pid 191569' 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@968 -- # kill 191569 00:17:35.853 12:19:59 json_config -- common/autotest_common.sh@973 -- # wait 191569 00:17:36.421 12:19:59 json_config -- json_config/json_config.sh@326 -- # rm -f /home/vagrant/spdk_repo/spdk/spdk_initiator_config.json /home/vagrant/spdk_repo/spdk/spdk_tgt_config.json 00:17:36.421 12:19:59 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:17:36.421 12:19:59 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:36.421 12:19:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:36.421 12:19:59 json_config -- json_config/json_config.sh@328 -- # return 0 00:17:36.421 12:19:59 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:17:36.421 INFO: Success 00:17:36.421 00:17:36.421 real 0m12.214s 00:17:36.421 user 0m18.250s 00:17:36.421 sys 0m3.044s 00:17:36.421 12:19:59 json_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:36.421 12:19:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:17:36.421 ************************************ 00:17:36.421 END TEST json_config 00:17:36.421 ************************************ 00:17:36.421 12:20:00 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /home/vagrant/spdk_repo/spdk/test/json_config/json_config_extra_key.sh 00:17:36.421 12:20:00 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:36.421 12:20:00 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:36.421 12:20:00 -- common/autotest_common.sh@10 -- # set +x 00:17:36.421 ************************************ 00:17:36.421 START TEST json_config_extra_key 00:17:36.421 ************************************ 00:17:36.421 12:20:00 json_config_extra_key -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/json_config/json_config_extra_key.sh 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:de0b6253-4c30-4fab-8e86-8d05558b7c6b 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=de0b6253-4c30-4fab-8e86-8d05558b7c6b 00:17:36.681 
12:20:00 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@45 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:17:36.681 12:20:00 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:17:36.681 12:20:00 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:17:36.681 12:20:00 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:17:36.681 12:20:00 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:36.681 12:20:00 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:36.681 12:20:00 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:36.681 12:20:00 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:17:36.681 12:20:00 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:17:36.681 12:20:00 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source 
/home/vagrant/spdk_repo/spdk/test/json_config/common.sh 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/home/vagrant/spdk_repo/spdk/test/json_config/extra_key.json') 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:17:36.681 INFO: launching applications... 00:17:36.681 12:20:00 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /home/vagrant/spdk_repo/spdk/test/json_config/extra_key.json 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=191739 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:17:36.681 Waiting for target to run... 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@21 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /home/vagrant/spdk_repo/spdk/test/json_config/extra_key.json 00:17:36.681 12:20:00 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 191739 /var/tmp/spdk_tgt.sock 00:17:36.681 12:20:00 json_config_extra_key -- common/autotest_common.sh@830 -- # '[' -z 191739 ']' 00:17:36.681 12:20:00 json_config_extra_key -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:17:36.681 12:20:00 json_config_extra_key -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:36.681 12:20:00 json_config_extra_key -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:17:36.681 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
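waitforlisten blocks until the freshly launched spdk_tgt is actually serving RPC on /var/tmp/spdk_tgt.sock. A rough sketch of what that wait looks like (illustrative only; the real helper in autotest_common.sh also retries the RPC call itself, and the 0.1 s poll interval here is an assumption):

# Illustrative readiness poll, modeled on the waitforlisten calls in this log.
waitforlisten_sketch() {
    local pid=$1 rpc_addr=${2:-/var/tmp/spdk_tgt.sock} max_retries=100 i
    echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
    for ((i = 0; i < max_retries; i++)); do
        kill -0 "$pid" 2>/dev/null || return 1   # target died during startup
        [[ -S $rpc_addr ]] && return 0           # RPC socket is up
        sleep 0.1
    done
    return 1
}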
00:17:36.681 12:20:00 json_config_extra_key -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:36.681 12:20:00 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:17:36.681 [2024-06-07 12:20:00.181046] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:17:36.681 [2024-06-07 12:20:00.181667] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191739 ] 00:17:37.248 [2024-06-07 12:20:00.752585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:37.248 [2024-06-07 12:20:00.801559] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:37.813 12:20:01 json_config_extra_key -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:37.813 12:20:01 json_config_extra_key -- common/autotest_common.sh@863 -- # return 0 00:17:37.813 12:20:01 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:17:37.813 00:17:37.813 12:20:01 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:17:37.813 INFO: shutting down applications... 00:17:37.813 12:20:01 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:17:37.813 12:20:01 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:17:37.813 12:20:01 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:17:37.813 12:20:01 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 191739 ]] 00:17:37.814 12:20:01 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 191739 00:17:37.814 12:20:01 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:17:37.814 12:20:01 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:17:37.814 12:20:01 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 191739 00:17:37.814 12:20:01 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:17:38.380 12:20:01 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:17:38.380 12:20:01 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:17:38.380 12:20:01 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 191739 00:17:38.380 12:20:01 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:17:38.638 12:20:02 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:17:38.638 12:20:02 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:17:38.638 12:20:02 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 191739 00:17:38.638 12:20:02 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:17:38.638 12:20:02 json_config_extra_key -- json_config/common.sh@43 -- # break 00:17:38.638 12:20:02 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:17:38.638 12:20:02 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:17:38.638 SPDK target shutdown done 00:17:38.638 12:20:02 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:17:38.638 Success 00:17:38.638 00:17:38.638 real 0m2.201s 00:17:38.638 user 0m1.661s 00:17:38.638 sys 0m0.688s 00:17:38.638 12:20:02 json_config_extra_key -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:38.638 12:20:02 json_config_extra_key -- 
common/autotest_common.sh@10 -- # set +x 00:17:38.638 ************************************ 00:17:38.638 END TEST json_config_extra_key 00:17:38.638 ************************************ 00:17:38.896 12:20:02 -- spdk/autotest.sh@174 -- # run_test alias_rpc /home/vagrant/spdk_repo/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:17:38.896 12:20:02 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:38.896 12:20:02 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:38.896 12:20:02 -- common/autotest_common.sh@10 -- # set +x 00:17:38.896 ************************************ 00:17:38.896 START TEST alias_rpc 00:17:38.896 ************************************ 00:17:38.896 12:20:02 alias_rpc -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:17:38.896 * Looking for test storage... 00:17:38.896 * Found test storage at /home/vagrant/spdk_repo/spdk/test/json_config/alias_rpc 00:17:38.896 12:20:02 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:17:38.896 12:20:02 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=191821 00:17:38.896 12:20:02 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:38.896 12:20:02 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 191821 00:17:38.896 12:20:02 alias_rpc -- common/autotest_common.sh@830 -- # '[' -z 191821 ']' 00:17:38.896 12:20:02 alias_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:38.896 12:20:02 alias_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:38.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:38.896 12:20:02 alias_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:38.896 12:20:02 alias_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:38.896 12:20:02 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:38.896 [2024-06-07 12:20:02.450079] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
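The shutdown sequence used by the json_config tests above is worth spelling out: send SIGINT, then poll the pid with kill -0 up to 30 times at half-second intervals, exactly as the common.sh trace shows. In outline (variable names are illustrative):

# Paraphrase of the json_config/common.sh shutdown loop traced above.
app_pid=191739
kill -SIGINT "$app_pid"              # ask spdk_tgt to exit cleanly
for ((i = 0; i < 30; i++)); do
    if ! kill -0 "$app_pid" 2>/dev/null; then
        break                        # process is gone
    fi
    sleep 0.5                        # poll twice a second, up to 15 s total
done
echo 'SPDK target shutdown done'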
00:17:38.896 [2024-06-07 12:20:02.450843] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191821 ] 00:17:39.154 [2024-06-07 12:20:02.586144] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:39.154 [2024-06-07 12:20:02.677678] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:39.412 12:20:03 alias_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:39.412 12:20:03 alias_rpc -- common/autotest_common.sh@863 -- # return 0 00:17:39.412 12:20:03 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py load_config -i 00:17:39.695 12:20:03 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 191821 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@949 -- # '[' -z 191821 ']' 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@953 -- # kill -0 191821 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@954 -- # uname 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 191821 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:39.695 killing process with pid 191821 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 191821' 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@968 -- # kill 191821 00:17:39.695 12:20:03 alias_rpc -- common/autotest_common.sh@973 -- # wait 191821 00:17:40.629 00:17:40.629 real 0m1.671s 00:17:40.629 user 0m1.550s 00:17:40.629 sys 0m0.573s 00:17:40.629 12:20:03 alias_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:40.629 12:20:03 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:17:40.629 ************************************ 00:17:40.629 END TEST alias_rpc 00:17:40.629 ************************************ 00:17:40.629 12:20:04 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:17:40.629 12:20:04 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /home/vagrant/spdk_repo/spdk/test/spdkcli/tcp.sh 00:17:40.629 12:20:04 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:40.629 12:20:04 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:40.629 12:20:04 -- common/autotest_common.sh@10 -- # set +x 00:17:40.629 ************************************ 00:17:40.629 START TEST spdkcli_tcp 00:17:40.629 ************************************ 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/spdkcli/tcp.sh 00:17:40.629 * Looking for test storage... 
00:17:40.629 * Found test storage at /home/vagrant/spdk_repo/spdk/test/spdkcli 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/spdkcli/common.sh 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/home/vagrant/spdk_repo/spdk/test/spdkcli/spdkcli_job.py 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/home/vagrant/spdk_repo/spdk/test/json_config/clear_config.py 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@723 -- # xtrace_disable 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=191906 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 191906 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@830 -- # '[' -z 191906 ']' 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:40.629 12:20:04 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:40.629 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:40.629 12:20:04 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:17:40.629 [2024-06-07 12:20:04.192154] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:17:40.629 [2024-06-07 12:20:04.192457] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191906 ] 00:17:40.888 [2024-06-07 12:20:04.344572] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:17:40.888 [2024-06-07 12:20:04.437732] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:40.888 [2024-06-07 12:20:04.437732] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:17:41.822 12:20:05 spdkcli_tcp -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:41.822 12:20:05 spdkcli_tcp -- common/autotest_common.sh@863 -- # return 0 00:17:41.822 12:20:05 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=191928 00:17:41.822 12:20:05 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:17:41.822 12:20:05 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:17:42.082 [ 00:17:42.082 "spdk_get_version", 00:17:42.082 "rpc_get_methods", 00:17:42.082 "keyring_get_keys", 00:17:42.082 "trace_get_info", 00:17:42.082 "trace_get_tpoint_group_mask", 00:17:42.082 "trace_disable_tpoint_group", 00:17:42.082 "trace_enable_tpoint_group", 00:17:42.082 "trace_clear_tpoint_mask", 00:17:42.082 "trace_set_tpoint_mask", 00:17:42.082 "framework_get_pci_devices", 00:17:42.082 "framework_get_config", 00:17:42.082 "framework_get_subsystems", 00:17:42.082 "iobuf_get_stats", 00:17:42.082 "iobuf_set_options", 00:17:42.082 "sock_get_default_impl", 00:17:42.082 "sock_set_default_impl", 00:17:42.082 "sock_impl_set_options", 00:17:42.082 "sock_impl_get_options", 00:17:42.082 "vmd_rescan", 00:17:42.082 "vmd_remove_device", 00:17:42.082 "vmd_enable", 00:17:42.082 "accel_get_stats", 00:17:42.082 "accel_set_options", 00:17:42.082 "accel_set_driver", 00:17:42.082 "accel_crypto_key_destroy", 00:17:42.082 "accel_crypto_keys_get", 00:17:42.082 "accel_crypto_key_create", 00:17:42.082 "accel_assign_opc", 00:17:42.082 "accel_get_module_info", 00:17:42.082 "accel_get_opc_assignments", 00:17:42.082 "notify_get_notifications", 00:17:42.082 "notify_get_types", 00:17:42.082 "bdev_get_histogram", 00:17:42.082 "bdev_enable_histogram", 00:17:42.082 "bdev_set_qos_limit", 00:17:42.082 "bdev_set_qd_sampling_period", 00:17:42.082 "bdev_get_bdevs", 00:17:42.082 "bdev_reset_iostat", 00:17:42.082 "bdev_get_iostat", 00:17:42.082 "bdev_examine", 00:17:42.082 "bdev_wait_for_examine", 00:17:42.082 "bdev_set_options", 00:17:42.082 "scsi_get_devices", 00:17:42.082 "thread_set_cpumask", 00:17:42.082 "framework_get_scheduler", 00:17:42.082 "framework_set_scheduler", 00:17:42.082 "framework_get_reactors", 00:17:42.082 "thread_get_io_channels", 00:17:42.082 "thread_get_pollers", 00:17:42.082 "thread_get_stats", 00:17:42.082 "framework_monitor_context_switch", 00:17:42.082 "spdk_kill_instance", 00:17:42.082 "log_enable_timestamps", 00:17:42.082 "log_get_flags", 00:17:42.082 "log_clear_flag", 00:17:42.082 "log_set_flag", 00:17:42.082 "log_get_level", 00:17:42.082 "log_set_level", 00:17:42.082 "log_get_print_level", 00:17:42.082 "log_set_print_level", 00:17:42.082 "framework_enable_cpumask_locks", 00:17:42.082 "framework_disable_cpumask_locks", 00:17:42.082 "framework_wait_init", 00:17:42.082 "framework_start_init", 00:17:42.082 "virtio_blk_create_transport", 00:17:42.082 "virtio_blk_get_transports", 00:17:42.083 
"vhost_controller_set_coalescing", 00:17:42.083 "vhost_get_controllers", 00:17:42.083 "vhost_delete_controller", 00:17:42.083 "vhost_create_blk_controller", 00:17:42.083 "vhost_scsi_controller_remove_target", 00:17:42.083 "vhost_scsi_controller_add_target", 00:17:42.083 "vhost_start_scsi_controller", 00:17:42.083 "vhost_create_scsi_controller", 00:17:42.083 "nbd_get_disks", 00:17:42.083 "nbd_stop_disk", 00:17:42.083 "nbd_start_disk", 00:17:42.083 "env_dpdk_get_mem_stats", 00:17:42.083 "nvmf_stop_mdns_prr", 00:17:42.083 "nvmf_publish_mdns_prr", 00:17:42.083 "nvmf_subsystem_get_listeners", 00:17:42.083 "nvmf_subsystem_get_qpairs", 00:17:42.083 "nvmf_subsystem_get_controllers", 00:17:42.083 "nvmf_get_stats", 00:17:42.083 "nvmf_get_transports", 00:17:42.083 "nvmf_create_transport", 00:17:42.083 "nvmf_get_targets", 00:17:42.083 "nvmf_delete_target", 00:17:42.083 "nvmf_create_target", 00:17:42.083 "nvmf_subsystem_allow_any_host", 00:17:42.083 "nvmf_subsystem_remove_host", 00:17:42.083 "nvmf_subsystem_add_host", 00:17:42.083 "nvmf_ns_remove_host", 00:17:42.083 "nvmf_ns_add_host", 00:17:42.083 "nvmf_subsystem_remove_ns", 00:17:42.083 "nvmf_subsystem_add_ns", 00:17:42.083 "nvmf_subsystem_listener_set_ana_state", 00:17:42.083 "nvmf_discovery_get_referrals", 00:17:42.083 "nvmf_discovery_remove_referral", 00:17:42.083 "nvmf_discovery_add_referral", 00:17:42.083 "nvmf_subsystem_remove_listener", 00:17:42.083 "nvmf_subsystem_add_listener", 00:17:42.083 "nvmf_delete_subsystem", 00:17:42.083 "nvmf_create_subsystem", 00:17:42.083 "nvmf_get_subsystems", 00:17:42.083 "nvmf_set_crdt", 00:17:42.083 "nvmf_set_config", 00:17:42.083 "nvmf_set_max_subsystems", 00:17:42.083 "iscsi_get_histogram", 00:17:42.083 "iscsi_enable_histogram", 00:17:42.083 "iscsi_set_options", 00:17:42.083 "iscsi_get_auth_groups", 00:17:42.083 "iscsi_auth_group_remove_secret", 00:17:42.083 "iscsi_auth_group_add_secret", 00:17:42.083 "iscsi_delete_auth_group", 00:17:42.083 "iscsi_create_auth_group", 00:17:42.083 "iscsi_set_discovery_auth", 00:17:42.083 "iscsi_get_options", 00:17:42.083 "iscsi_target_node_request_logout", 00:17:42.083 "iscsi_target_node_set_redirect", 00:17:42.083 "iscsi_target_node_set_auth", 00:17:42.083 "iscsi_target_node_add_lun", 00:17:42.083 "iscsi_get_stats", 00:17:42.083 "iscsi_get_connections", 00:17:42.083 "iscsi_portal_group_set_auth", 00:17:42.083 "iscsi_start_portal_group", 00:17:42.083 "iscsi_delete_portal_group", 00:17:42.083 "iscsi_create_portal_group", 00:17:42.083 "iscsi_get_portal_groups", 00:17:42.083 "iscsi_delete_target_node", 00:17:42.083 "iscsi_target_node_remove_pg_ig_maps", 00:17:42.083 "iscsi_target_node_add_pg_ig_maps", 00:17:42.083 "iscsi_create_target_node", 00:17:42.083 "iscsi_get_target_nodes", 00:17:42.083 "iscsi_delete_initiator_group", 00:17:42.083 "iscsi_initiator_group_remove_initiators", 00:17:42.083 "iscsi_initiator_group_add_initiators", 00:17:42.083 "iscsi_create_initiator_group", 00:17:42.083 "iscsi_get_initiator_groups", 00:17:42.083 "keyring_linux_set_options", 00:17:42.083 "keyring_file_remove_key", 00:17:42.083 "keyring_file_add_key", 00:17:42.083 "iaa_scan_accel_module", 00:17:42.083 "dsa_scan_accel_module", 00:17:42.083 "ioat_scan_accel_module", 00:17:42.083 "accel_error_inject_error", 00:17:42.083 "bdev_iscsi_delete", 00:17:42.083 "bdev_iscsi_create", 00:17:42.083 "bdev_iscsi_set_options", 00:17:42.083 "bdev_virtio_attach_controller", 00:17:42.083 "bdev_virtio_scsi_get_devices", 00:17:42.083 "bdev_virtio_detach_controller", 00:17:42.083 "bdev_virtio_blk_set_hotplug", 
00:17:42.083 "bdev_ftl_set_property", 00:17:42.083 "bdev_ftl_get_properties", 00:17:42.083 "bdev_ftl_get_stats", 00:17:42.083 "bdev_ftl_unmap", 00:17:42.083 "bdev_ftl_unload", 00:17:42.083 "bdev_ftl_delete", 00:17:42.083 "bdev_ftl_load", 00:17:42.083 "bdev_ftl_create", 00:17:42.083 "bdev_aio_delete", 00:17:42.083 "bdev_aio_rescan", 00:17:42.083 "bdev_aio_create", 00:17:42.083 "blobfs_create", 00:17:42.083 "blobfs_detect", 00:17:42.083 "blobfs_set_cache_size", 00:17:42.083 "bdev_zone_block_delete", 00:17:42.083 "bdev_zone_block_create", 00:17:42.083 "bdev_delay_delete", 00:17:42.083 "bdev_delay_create", 00:17:42.083 "bdev_delay_update_latency", 00:17:42.083 "bdev_split_delete", 00:17:42.083 "bdev_split_create", 00:17:42.083 "bdev_error_inject_error", 00:17:42.083 "bdev_error_delete", 00:17:42.083 "bdev_error_create", 00:17:42.083 "bdev_raid_set_options", 00:17:42.083 "bdev_raid_remove_base_bdev", 00:17:42.083 "bdev_raid_add_base_bdev", 00:17:42.083 "bdev_raid_delete", 00:17:42.083 "bdev_raid_create", 00:17:42.083 "bdev_raid_get_bdevs", 00:17:42.083 "bdev_lvol_set_parent_bdev", 00:17:42.083 "bdev_lvol_set_parent", 00:17:42.083 "bdev_lvol_check_shallow_copy", 00:17:42.083 "bdev_lvol_start_shallow_copy", 00:17:42.083 "bdev_lvol_grow_lvstore", 00:17:42.083 "bdev_lvol_get_lvols", 00:17:42.083 "bdev_lvol_get_lvstores", 00:17:42.083 "bdev_lvol_delete", 00:17:42.083 "bdev_lvol_set_read_only", 00:17:42.083 "bdev_lvol_resize", 00:17:42.083 "bdev_lvol_decouple_parent", 00:17:42.083 "bdev_lvol_inflate", 00:17:42.083 "bdev_lvol_rename", 00:17:42.083 "bdev_lvol_clone_bdev", 00:17:42.083 "bdev_lvol_clone", 00:17:42.083 "bdev_lvol_snapshot", 00:17:42.083 "bdev_lvol_create", 00:17:42.083 "bdev_lvol_delete_lvstore", 00:17:42.083 "bdev_lvol_rename_lvstore", 00:17:42.083 "bdev_lvol_create_lvstore", 00:17:42.083 "bdev_passthru_delete", 00:17:42.083 "bdev_passthru_create", 00:17:42.083 "bdev_nvme_cuse_unregister", 00:17:42.083 "bdev_nvme_cuse_register", 00:17:42.083 "bdev_opal_new_user", 00:17:42.083 "bdev_opal_set_lock_state", 00:17:42.083 "bdev_opal_delete", 00:17:42.083 "bdev_opal_get_info", 00:17:42.083 "bdev_opal_create", 00:17:42.083 "bdev_nvme_opal_revert", 00:17:42.083 "bdev_nvme_opal_init", 00:17:42.083 "bdev_nvme_send_cmd", 00:17:42.083 "bdev_nvme_get_path_iostat", 00:17:42.083 "bdev_nvme_get_mdns_discovery_info", 00:17:42.083 "bdev_nvme_stop_mdns_discovery", 00:17:42.083 "bdev_nvme_start_mdns_discovery", 00:17:42.083 "bdev_nvme_set_multipath_policy", 00:17:42.083 "bdev_nvme_set_preferred_path", 00:17:42.083 "bdev_nvme_get_io_paths", 00:17:42.083 "bdev_nvme_remove_error_injection", 00:17:42.083 "bdev_nvme_add_error_injection", 00:17:42.083 "bdev_nvme_get_discovery_info", 00:17:42.083 "bdev_nvme_stop_discovery", 00:17:42.083 "bdev_nvme_start_discovery", 00:17:42.083 "bdev_nvme_get_controller_health_info", 00:17:42.083 "bdev_nvme_disable_controller", 00:17:42.083 "bdev_nvme_enable_controller", 00:17:42.083 "bdev_nvme_reset_controller", 00:17:42.083 "bdev_nvme_get_transport_statistics", 00:17:42.083 "bdev_nvme_apply_firmware", 00:17:42.083 "bdev_nvme_detach_controller", 00:17:42.083 "bdev_nvme_get_controllers", 00:17:42.083 "bdev_nvme_attach_controller", 00:17:42.083 "bdev_nvme_set_hotplug", 00:17:42.083 "bdev_nvme_set_options", 00:17:42.083 "bdev_null_resize", 00:17:42.083 "bdev_null_delete", 00:17:42.083 "bdev_null_create", 00:17:42.083 "bdev_malloc_delete", 00:17:42.083 "bdev_malloc_create" 00:17:42.083 ] 00:17:42.083 12:20:05 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
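The rpc_get_methods listing above was fetched over TCP rather than the UNIX socket directly: the test bridges 127.0.0.1:9998 to /var/tmp/spdk.sock with socat and points rpc.py at the TCP side. The wiring, as seen in the trace (the trap-based cleanup is a sketch):

# Bridge the RPC UNIX socket to TCP, then talk to it over TCP.
socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &   # 127.0.0.1:9998 -> /var/tmp/spdk.sock
socat_pid=$!
trap 'kill "$socat_pid"' EXIT                             # sketch: tear the bridge down on exit
# -r 100: connection retries, -t 2: per-request timeout in seconds
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods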
00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@729 -- # xtrace_disable 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:17:42.083 12:20:05 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:17:42.083 12:20:05 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 191906 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@949 -- # '[' -z 191906 ']' 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@953 -- # kill -0 191906 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@954 -- # uname 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 191906 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:42.083 killing process with pid 191906 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@967 -- # echo 'killing process with pid 191906' 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@968 -- # kill 191906 00:17:42.083 12:20:05 spdkcli_tcp -- common/autotest_common.sh@973 -- # wait 191906 00:17:42.650 00:17:42.650 real 0m2.175s 00:17:42.650 user 0m3.832s 00:17:42.650 sys 0m0.725s 00:17:42.650 12:20:06 spdkcli_tcp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:42.650 12:20:06 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:17:42.650 ************************************ 00:17:42.650 END TEST spdkcli_tcp 00:17:42.650 ************************************ 00:17:42.650 12:20:06 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /home/vagrant/spdk_repo/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:17:42.650 12:20:06 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:42.650 12:20:06 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:42.650 12:20:06 -- common/autotest_common.sh@10 -- # set +x 00:17:42.650 ************************************ 00:17:42.650 START TEST dpdk_mem_utility 00:17:42.650 ************************************ 00:17:42.650 12:20:06 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:17:42.909 * Looking for test storage... 00:17:42.909 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dpdk_memory_utility 00:17:42.909 12:20:06 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py 00:17:42.909 12:20:06 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=192008 00:17:42.909 12:20:06 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:42.909 12:20:06 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 192008 00:17:42.909 12:20:06 dpdk_mem_utility -- common/autotest_common.sh@830 -- # '[' -z 192008 ']' 00:17:42.909 12:20:06 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:42.909 12:20:06 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:42.909 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
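The dpdk_mem_utility suite that starts here exercises a two-step inspection: ask the target to write a memory dump, then post-process it with dpdk_mem_info.py. A sketch of those steps (the jq extraction of the dump filename is illustrative, and the script is assumed to read /tmp/spdk_mem_dump.txt by default, as the trace suggests):

# Step 1: have the target dump its DPDK memory state.
dump=$(/home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats | jq -r .filename)
echo "$dump"                                                  # /tmp/spdk_mem_dump.txt
# Step 2: summarize, then drill into heap 0 as the test does below.
/home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py         # heap / mempool / memzone summary
/home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py -m 0    # detailed per-element listing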
00:17:42.909 12:20:06 dpdk_mem_utility -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:42.909 12:20:06 dpdk_mem_utility -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:42.909 12:20:06 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:17:42.909 [2024-06-07 12:20:06.402682] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:17:42.909 [2024-06-07 12:20:06.403103] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192008 ] 00:17:43.167 [2024-06-07 12:20:06.558573] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:43.167 [2024-06-07 12:20:06.649588] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:43.735 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:43.735 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@863 -- # return 0 00:17:43.735 12:20:07 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:17:43.735 12:20:07 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:17:43.735 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:43.735 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:17:43.735 { 00:17:43.735 "filename": "/tmp/spdk_mem_dump.txt" 00:17:43.735 } 00:17:43.735 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:43.735 12:20:07 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py 00:17:43.995 DPDK memory size 814.000000 MiB in 1 heap(s) 00:17:43.995 1 heaps totaling size 814.000000 MiB 00:17:43.995 size: 814.000000 MiB heap id: 0 00:17:43.995 end heaps---------- 00:17:43.995 8 mempools totaling size 598.116089 MiB 00:17:43.995 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:17:43.995 size: 158.602051 MiB name: PDU_data_out_Pool 00:17:43.995 size: 84.521057 MiB name: bdev_io_192008 00:17:43.995 size: 51.011292 MiB name: evtpool_192008 00:17:43.995 size: 50.003479 MiB name: msgpool_192008 00:17:43.995 size: 21.763794 MiB name: PDU_Pool 00:17:43.995 size: 19.513306 MiB name: SCSI_TASK_Pool 00:17:43.995 size: 0.026123 MiB name: Session_Pool 00:17:43.995 end mempools------- 00:17:43.995 6 memzones totaling size 4.142822 MiB 00:17:43.995 size: 1.000366 MiB name: RG_ring_0_192008 00:17:43.995 size: 1.000366 MiB name: RG_ring_1_192008 00:17:43.995 size: 1.000366 MiB name: RG_ring_4_192008 00:17:43.995 size: 1.000366 MiB name: RG_ring_5_192008 00:17:43.995 size: 0.125366 MiB name: RG_ring_2_192008 00:17:43.995 size: 0.015991 MiB name: RG_ring_3_192008 00:17:43.995 end memzones------- 00:17:43.995 12:20:07 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py -m 0 00:17:43.995 heap id: 0 total size: 814.000000 MiB number of busy elements: 227 number of free elements: 15 00:17:43.995 list of free elements. 
size: 12.485291 MiB 00:17:43.995 element at address: 0x200000400000 with size: 1.999512 MiB 00:17:43.995 element at address: 0x200018e00000 with size: 0.999878 MiB 00:17:43.995 element at address: 0x200019000000 with size: 0.999878 MiB 00:17:43.995 element at address: 0x200003e00000 with size: 0.996277 MiB 00:17:43.995 element at address: 0x200031c00000 with size: 0.994446 MiB 00:17:43.995 element at address: 0x200013800000 with size: 0.978699 MiB 00:17:43.995 element at address: 0x200007000000 with size: 0.959839 MiB 00:17:43.995 element at address: 0x200019200000 with size: 0.936584 MiB 00:17:43.995 element at address: 0x200000200000 with size: 0.836853 MiB 00:17:43.995 element at address: 0x20001aa00000 with size: 0.567871 MiB 00:17:43.995 element at address: 0x20000b200000 with size: 0.489624 MiB 00:17:43.995 element at address: 0x200000800000 with size: 0.487061 MiB 00:17:43.995 element at address: 0x200019400000 with size: 0.485657 MiB 00:17:43.995 element at address: 0x200027e00000 with size: 0.402161 MiB 00:17:43.995 element at address: 0x200003a00000 with size: 0.350952 MiB 00:17:43.995 list of standard malloc elements. size: 199.252136 MiB 00:17:43.995 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:17:43.995 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:17:43.995 element at address: 0x200018efff80 with size: 1.000122 MiB 00:17:43.995 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:17:43.995 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:17:43.995 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:17:43.995 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:17:43.995 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:17:43.995 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:17:43.995 element at address: 0x2000002d63c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6480 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6540 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6600 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d66c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6780 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6840 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6900 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d69c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6a80 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6b40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6c00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6cc0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6d80 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6e40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6f00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d6fc0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d71c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7280 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7340 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7400 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d74c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7580 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7640 with size: 0.000183 MiB 
00:17:43.995 element at address: 0x2000002d7700 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d77c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7880 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7940 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7a00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:17:43.995 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a59d80 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a59e40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a59f00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a59fc0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a080 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a140 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a200 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a2c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a380 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a440 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a500 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a5c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a680 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a740 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a800 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a8c0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5a980 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5aa40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5ab00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5abc0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5ac80 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5ad40 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5ae00 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5aec0 with size: 0.000183 MiB 00:17:43.995 element at address: 0x200003a5af80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200003adb300 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200003adb500 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200003affa80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200003affb40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:17:43.996 element at 
address: 0x20000b27d580 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b27d700 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b27d7c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b27d880 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b27d940 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91600 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92680 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92800 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92b00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93100 
with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa934c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93c40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94000 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:17:43.996 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200027e66f40 with size: 0.000183 MiB 00:17:43.996 element at address: 0x200027e67000 with size: 0.000183 MiB 
00:17:43.996 element at address: 0x200027e6dc00 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6de00 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6dec0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6df80 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e040 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e100 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e1c0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e280 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e340 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e400 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e4c0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e580 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e640 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e700 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e7c0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e880 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6e940 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6ea00 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6eac0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6eb80 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6ec40 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6ed00 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6edc0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6ee80 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6ef40 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f000 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f0c0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f180 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f240 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f300 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f3c0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f480 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f540 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f600 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f6c0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f780 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f840 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f900 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6f9c0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6fa80 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6fb40 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6fc00 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6fcc0 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6fd80 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6fe40 with size: 0.000183 MiB
00:17:43.996 element at address: 0x200027e6ff00 with size: 0.000183 MiB
00:17:43.996 list of memzone associated elements. size: 602.262573 MiB
00:17:43.997 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:17:43.997 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:17:43.997 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:17:43.997 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:17:43.997 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:17:43.997 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_192008_0
00:17:43.997 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:17:43.997 associated memzone info: size: 48.002930 MiB name: MP_evtpool_192008_0
00:17:43.997 element at address: 0x200003fff380 with size: 48.003052 MiB
00:17:43.997 associated memzone info: size: 48.002930 MiB name: MP_msgpool_192008_0
00:17:43.997 element at address: 0x2000195be940 with size: 20.255554 MiB
00:17:43.997 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:17:43.997 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:17:43.997 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:17:43.997 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:17:43.997 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_192008
00:17:43.997 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:17:43.997 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_192008
00:17:43.997 element at address: 0x2000002d7d00 with size: 1.008118 MiB
00:17:43.997 associated memzone info: size: 1.007996 MiB name: MP_evtpool_192008
00:17:43.997 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:17:43.997 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:17:43.997 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:17:43.997 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:17:43.997 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:17:43.997 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:17:43.997 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:17:43.997 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:17:43.997 element at address: 0x200003eff180 with size: 1.000488 MiB
00:17:43.997 associated memzone info: size: 1.000366 MiB name: RG_ring_0_192008
00:17:43.997 element at address: 0x200003affc00 with size: 1.000488 MiB
00:17:43.997 associated memzone info: size: 1.000366 MiB name: RG_ring_1_192008
00:17:43.997 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:17:43.997 associated memzone info: size: 1.000366 MiB name: RG_ring_4_192008
00:17:43.997 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:17:43.997 associated memzone info: size: 1.000366 MiB name: RG_ring_5_192008
00:17:43.997 element at address: 0x200003a5b100 with size: 0.500488 MiB
00:17:43.997 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_192008
00:17:43.997 element at address: 0x20000b27db80 with size: 0.500488 MiB
00:17:43.997 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:17:43.997 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:17:43.997 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:17:43.997 element at address: 0x20001947c540 with size: 0.250488 MiB
00:17:43.997 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:17:43.997 element at address: 0x200003adf880 with size: 0.125488 MiB
00:17:43.997 associated memzone info: size: 0.125366 MiB name: RG_ring_2_192008
00:17:43.997 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:17:43.997 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:17:43.997 element at address: 0x200027e670c0 with size: 0.023743 MiB
00:17:43.997 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:17:43.997 element at address: 0x200003adb5c0 with size: 0.016113 MiB
00:17:43.997 associated memzone info: size: 0.015991 MiB name: RG_ring_3_192008
00:17:43.997 element at address: 0x200027e6d200 with size: 0.002441 MiB
00:17:43.997 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:17:43.997 element at address: 0x2000002d7080 with size: 0.000305 MiB
00:17:43.997 associated memzone info: size: 0.000183 MiB name: MP_msgpool_192008
00:17:43.997 element at address: 0x200003adb3c0 with size: 0.000305 MiB
00:17:43.997 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_192008
00:17:43.997 element at address: 0x200027e6dcc0 with size: 0.000305 MiB
00:17:43.997 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:17:43.997 12:20:07 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:17:43.997 12:20:07 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 192008
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@949 -- # '[' -z 192008 ']'
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@953 -- # kill -0 192008
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@954 -- # uname
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 192008
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:17:43.997 12:20:07 dpdk_mem_utility -- common/autotest_common.sh@967 -- # echo 'killing process with pid 192008'
killing process with pid 192008
12:20:07 dpdk_mem_utility -- common/autotest_common.sh@968 -- # kill 192008
12:20:07 dpdk_mem_utility -- common/autotest_common.sh@973 -- # wait 192008
00:17:44.563
00:17:44.563 real 0m1.877s
00:17:44.563 user 0m1.806s
00:17:44.563 sys 0m0.593s
00:17:44.563 12:20:08 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # xtrace_disable
00:17:44.563 12:20:08 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:17:44.563 ************************************
00:17:44.563 END TEST dpdk_mem_utility
00:17:44.563 ************************************
00:17:44.563 12:20:08 -- spdk/autotest.sh@181 -- # run_test event /home/vagrant/spdk_repo/spdk/test/event/event.sh
00:17:44.563 12:20:08 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:17:44.563 12:20:08 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:17:44.563 12:20:08 -- common/autotest_common.sh@10 -- # set +x
00:17:44.822 ************************************
00:17:44.822 START TEST event
************************************
00:17:44.822 12:20:08 event -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/event/event.sh
* Looking for test storage...
00:17:44.822 * Found test storage at /home/vagrant/spdk_repo/spdk/test/event 00:17:44.822 12:20:08 event -- event/event.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/nbd_common.sh 00:17:44.822 12:20:08 event -- bdev/nbd_common.sh@6 -- # set -e 00:17:44.822 12:20:08 event -- event/event.sh@45 -- # run_test event_perf /home/vagrant/spdk_repo/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:17:44.822 12:20:08 event -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:17:44.822 12:20:08 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:44.822 12:20:08 event -- common/autotest_common.sh@10 -- # set +x 00:17:44.822 ************************************ 00:17:44.822 START TEST event_perf 00:17:44.822 ************************************ 00:17:44.822 12:20:08 event.event_perf -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:17:44.822 Running I/O for 1 seconds...[2024-06-07 12:20:08.346969] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:17:44.822 [2024-06-07 12:20:08.347177] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192101 ] 00:17:45.080 [2024-06-07 12:20:08.504589] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:17:45.080 [2024-06-07 12:20:08.596997] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:17:45.080 [2024-06-07 12:20:08.597141] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:17:45.080 [2024-06-07 12:20:08.597315] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:45.080 [2024-06-07 12:20:08.597321] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:17:46.467 Running I/O for 1 seconds... 00:17:46.467 lcore 0: 257587 00:17:46.467 lcore 1: 257585 00:17:46.467 lcore 2: 257587 00:17:46.467 lcore 3: 257587 00:17:46.467 done. 00:17:46.467 00:17:46.467 real 0m1.427s 00:17:46.467 user 0m4.183s 00:17:46.467 sys 0m0.122s 00:17:46.467 12:20:09 event.event_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:46.467 12:20:09 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:17:46.467 ************************************ 00:17:46.467 END TEST event_perf 00:17:46.467 ************************************ 00:17:46.467 12:20:09 event -- event/event.sh@46 -- # run_test event_reactor /home/vagrant/spdk_repo/spdk/test/event/reactor/reactor -t 1 00:17:46.467 12:20:09 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:17:46.467 12:20:09 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:46.467 12:20:09 event -- common/autotest_common.sh@10 -- # set +x 00:17:46.467 ************************************ 00:17:46.467 START TEST event_reactor 00:17:46.467 ************************************ 00:17:46.467 12:20:09 event.event_reactor -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/event/reactor/reactor -t 1 00:17:46.467 [2024-06-07 12:20:09.844373] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:17:46.467 [2024-06-07 12:20:09.844643] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192148 ] 00:17:46.467 [2024-06-07 12:20:09.995732] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:46.726 [2024-06-07 12:20:10.112243] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:47.662 test_start 00:17:47.662 oneshot 00:17:47.662 tick 100 00:17:47.662 tick 100 00:17:47.662 tick 250 00:17:47.662 tick 100 00:17:47.662 tick 100 00:17:47.662 tick 100 00:17:47.662 tick 250 00:17:47.662 tick 500 00:17:47.662 tick 100 00:17:47.662 tick 100 00:17:47.662 tick 250 00:17:47.662 tick 100 00:17:47.662 tick 100 00:17:47.662 test_end 00:17:47.662 00:17:47.662 real 0m1.445s 00:17:47.662 user 0m1.215s 00:17:47.662 sys 0m0.122s 00:17:47.662 12:20:11 event.event_reactor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:47.662 12:20:11 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:17:47.662 ************************************ 00:17:47.662 END TEST event_reactor 00:17:47.662 ************************************ 00:17:47.920 12:20:11 event -- event/event.sh@47 -- # run_test event_reactor_perf /home/vagrant/spdk_repo/spdk/test/event/reactor_perf/reactor_perf -t 1 00:17:47.920 12:20:11 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:17:47.920 12:20:11 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:47.920 12:20:11 event -- common/autotest_common.sh@10 -- # set +x 00:17:47.920 ************************************ 00:17:47.920 START TEST event_reactor_perf 00:17:47.920 ************************************ 00:17:47.920 12:20:11 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/event/reactor_perf/reactor_perf -t 1 00:17:47.920 [2024-06-07 12:20:11.366865] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:17:47.920 [2024-06-07 12:20:11.367160] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192186 ] 00:17:47.920 [2024-06-07 12:20:11.516249] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:48.178 [2024-06-07 12:20:11.604786] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:49.132 test_start 00:17:49.132 test_end 00:17:49.132 Performance: 689386 events per second 00:17:49.132 00:17:49.132 real 0m1.422s 00:17:49.132 user 0m1.193s 00:17:49.132 sys 0m0.117s 00:17:49.132 12:20:12 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:49.132 12:20:12 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:17:49.132 ************************************ 00:17:49.132 END TEST event_reactor_perf 00:17:49.132 ************************************ 00:17:49.413 12:20:12 event -- event/event.sh@49 -- # uname -s 00:17:49.413 12:20:12 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:17:49.413 12:20:12 event -- event/event.sh@50 -- # run_test event_scheduler /home/vagrant/spdk_repo/spdk/test/event/scheduler/scheduler.sh 00:17:49.413 12:20:12 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:49.413 12:20:12 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:49.413 12:20:12 event -- common/autotest_common.sh@10 -- # set +x 00:17:49.413 ************************************ 00:17:49.413 START TEST event_scheduler 00:17:49.413 ************************************ 00:17:49.413 12:20:12 event.event_scheduler -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/event/scheduler/scheduler.sh 00:17:49.413 * Looking for test storage... 00:17:49.413 * Found test storage at /home/vagrant/spdk_repo/spdk/test/event/scheduler 00:17:49.413 12:20:12 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:17:49.413 12:20:12 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=192264 00:17:49.413 12:20:12 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:17:49.413 12:20:12 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 192264 00:17:49.413 12:20:12 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /home/vagrant/spdk_repo/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:17:49.413 12:20:12 event.event_scheduler -- common/autotest_common.sh@830 -- # '[' -z 192264 ']' 00:17:49.413 12:20:12 event.event_scheduler -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:49.413 12:20:12 event.event_scheduler -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:49.413 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:49.413 12:20:12 event.event_scheduler -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:49.413 12:20:12 event.event_scheduler -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:49.413 12:20:12 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:17:49.413 [2024-06-07 12:20:12.968550] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:17:49.413 [2024-06-07 12:20:12.969373] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192264 ] 00:17:49.670 [2024-06-07 12:20:13.139924] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:17:49.670 [2024-06-07 12:20:13.236836] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:49.670 [2024-06-07 12:20:13.236946] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:17:49.670 [2024-06-07 12:20:13.237098] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:17:49.670 [2024-06-07 12:20:13.237113] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:17:50.604 12:20:13 event.event_scheduler -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:50.604 12:20:13 event.event_scheduler -- common/autotest_common.sh@863 -- # return 0 00:17:50.604 12:20:13 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:17:50.604 12:20:13 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:13 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 POWER: Env isn't set yet! 00:17:50.604 POWER: Attempting to initialise ACPI cpufreq power management... 00:17:50.604 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:17:50.604 POWER: Cannot set governor of lcore 0 to userspace 00:17:50.604 POWER: Attempting to initialise PSTAT power management... 00:17:50.604 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:17:50.604 POWER: Cannot set governor of lcore 0 to performance 00:17:50.604 POWER: Attempting to initialise CPPC power management... 00:17:50.604 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:17:50.604 POWER: Cannot set governor of lcore 0 to userspace 00:17:50.604 POWER: Attempting to initialise VM power management... 00:17:50.604 GUEST_CHANNEL: Unable to to connect to '/dev/virtio-ports/virtio.serial.port.poweragent.0' with error No such file or directory 00:17:50.604 POWER: Unable to set Power Management Environment for lcore 0 00:17:50.604 [2024-06-07 12:20:13.973405] dpdk_governor.c: 88:_init_core: *ERROR*: Failed to initialize on core0 00:17:50.604 [2024-06-07 12:20:13.973554] dpdk_governor.c: 118:_init: *ERROR*: Failed to initialize on core0 00:17:50.604 [2024-06-07 12:20:13.973648] scheduler_dynamic.c: 238:init: *NOTICE*: Unable to initialize dpdk governor 00:17:50.604 [2024-06-07 12:20:13.973795] scheduler_dynamic.c: 382:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:17:50.604 [2024-06-07 12:20:13.973912] scheduler_dynamic.c: 384:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:17:50.604 [2024-06-07 12:20:13.974005] scheduler_dynamic.c: 386:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:17:50.604 12:20:13 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.604 12:20:13 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:17:50.604 12:20:13 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:13 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 [2024-06-07 12:20:14.100083] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
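
The POWER messages above are expected inside a VM: before starting the dynamic scheduler, the DPDK power library probes each backend (ACPI cpufreq, PSTAT, CPPC, then the VM guest channel) by opening the per-core scaling_governor file in sysfs, and every attempt fails because that file does not exist on this guest, so the test proceeds without a governor and only the scheduler's load/core/busy limits (20/80/95) take effect. A minimal bash probe for the same condition; the helper name is illustrative, the sysfs path is the one quoted in the log:

    can_set_governor() {
        # The power library needs a writable scaling_governor file per core.
        local gov="/sys/devices/system/cpu/cpu$1/cpufreq/scaling_governor"
        if [ -w "$gov" ]; then
            echo "cpu$1: governor is $(cat "$gov")"
        else
            echo "cpu$1: cannot set governor ($gov not writable)" >&2
            return 1
        fi
    }
    can_set_governor 0 || echo 'dynamic scheduler will run without a dpdk governor'
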
00:17:50.604 12:20:14 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.604 12:20:14 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:17:50.604 12:20:14 event.event_scheduler -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:50.604 12:20:14 event.event_scheduler -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:50.604 12:20:14 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 ************************************ 00:17:50.604 START TEST scheduler_create_thread 00:17:50.604 ************************************ 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # scheduler_create_thread 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 2 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 3 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 4 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 5 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 6 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.604 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.604 7 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.605 8 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.605 9 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.605 10 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:50.605 12:20:14 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:52.505 12:20:15 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:52.505 12:20:15 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:17:52.505 12:20:15 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:17:52.505 12:20:15 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:17:52.505 12:20:15 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:53.440 12:20:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:17:53.440 00:17:53.440 real 0m2.613s 00:17:53.440 user 0m0.014s 00:17:53.440 sys 0m0.007s 00:17:53.440 ************************************ 00:17:53.440 END TEST scheduler_create_thread 00:17:53.440 ************************************ 00:17:53.440 12:20:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:53.440 12:20:16 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:17:53.440 12:20:16 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:17:53.440 12:20:16 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 192264 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@949 -- # '[' -z 192264 ']' 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@953 -- # kill -0 192264 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@954 -- # uname 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 192264 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@955 -- # process_name=reactor_2 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' reactor_2 = sudo ']' 00:17:53.440 killing process with pid 192264 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@967 -- # echo 'killing process with pid 192264' 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@968 -- # kill 192264 00:17:53.440 12:20:16 event.event_scheduler -- common/autotest_common.sh@973 -- # wait 192264 00:17:53.698 [2024-06-07 12:20:17.209243] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
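
Every suite in this log tears its SPDK process down through the same killprocess helper from autotest_common.sh, and the trace above shows each of its checks inline: the '[' -z ... ']' guard on the pid, kill -0, uname, the ps comm= lookup, the sudo refusal, then kill and wait. A hedged reconstruction of that flow (the real helper in test/common/autotest_common.sh has more branches, and wait assumes the pid is a child of the calling shell):

    killprocess() {
        local pid=$1
        [ -z "$pid" ] && return 1                  # no pid captured: nothing to kill
        kill -0 "$pid" 2>/dev/null || return 1     # process already gone
        if [ "$(uname)" = Linux ]; then
            # refuse to signal a sudo wrapper; here the name resolves to reactor_2
            [ "$(ps --no-headers -o comm= "$pid")" = sudo ] && return 1
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"    # reap it so its RPC socket and hugepages are released
    }
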
00:17:53.957 00:17:53.957 real 0m4.774s 00:17:53.957 user 0m8.752s 00:17:53.957 sys 0m0.518s 00:17:53.957 12:20:17 event.event_scheduler -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:53.957 ************************************ 00:17:53.957 END TEST event_scheduler 00:17:53.957 ************************************ 00:17:53.957 12:20:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:17:54.216 12:20:17 event -- event/event.sh@51 -- # modprobe -n nbd 00:17:54.216 12:20:17 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:17:54.216 12:20:17 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:17:54.216 12:20:17 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:54.216 12:20:17 event -- common/autotest_common.sh@10 -- # set +x 00:17:54.216 ************************************ 00:17:54.216 START TEST app_repeat 00:17:54.216 ************************************ 00:17:54.216 12:20:17 event.app_repeat -- common/autotest_common.sh@1124 -- # app_repeat_test 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@19 -- # repeat_pid=192375 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@18 -- # /home/vagrant/spdk_repo/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:17:54.216 Process app_repeat pid: 192375 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 192375' 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:17:54.216 spdk_app_start Round 0 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:17:54.216 12:20:17 event.app_repeat -- event/event.sh@25 -- # waitforlisten 192375 /var/tmp/spdk-nbd.sock 00:17:54.216 12:20:17 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 192375 ']' 00:17:54.216 12:20:17 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:17:54.216 12:20:17 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:54.216 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:17:54.216 12:20:17 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:17:54.216 12:20:17 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:54.216 12:20:17 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:17:54.216 [2024-06-07 12:20:17.705792] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:17:54.216 [2024-06-07 12:20:17.706558] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192375 ] 00:17:54.216 [2024-06-07 12:20:17.852093] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:17:54.475 [2024-06-07 12:20:17.942286] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:54.475 [2024-06-07 12:20:17.942290] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:17:55.100 12:20:18 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:55.100 12:20:18 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:17:55.100 12:20:18 event.app_repeat -- event/event.sh@27 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:17:55.359 Malloc0 00:17:55.359 12:20:18 event.app_repeat -- event/event.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:17:55.616 Malloc1 00:17:55.616 12:20:19 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:17:55.616 12:20:19 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:17:55.617 12:20:19 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:17:55.875 /dev/nbd0 00:17:55.875 12:20:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:17:55.875 12:20:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:17:55.875 12:20:19 event.app_repeat -- 
common/autotest_common.sh@872 -- # break 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:17:55.875 1+0 records in 00:17:55.875 1+0 records out 00:17:55.875 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027594 s, 14.8 MB/s 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:17:55.875 12:20:19 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:17:55.875 12:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:17:55.875 12:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:17:55.875 12:20:19 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:17:56.133 /dev/nbd1 00:17:56.133 12:20:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:17:56.133 12:20:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:17:56.133 1+0 records in 00:17:56.133 1+0 records out 00:17:56.133 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000402939 s, 10.2 MB/s 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:17:56.133 12:20:19 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:17:56.133 12:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:17:56.133 12:20:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:17:56.133 12:20:19 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:17:56.133 12:20:19 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:17:56.133 
12:20:19 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:17:56.391 12:20:19 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:17:56.391 { 00:17:56.391 "nbd_device": "/dev/nbd0", 00:17:56.391 "bdev_name": "Malloc0" 00:17:56.392 }, 00:17:56.392 { 00:17:56.392 "nbd_device": "/dev/nbd1", 00:17:56.392 "bdev_name": "Malloc1" 00:17:56.392 } 00:17:56.392 ]' 00:17:56.392 12:20:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:17:56.392 { 00:17:56.392 "nbd_device": "/dev/nbd0", 00:17:56.392 "bdev_name": "Malloc0" 00:17:56.392 }, 00:17:56.392 { 00:17:56.392 "nbd_device": "/dev/nbd1", 00:17:56.392 "bdev_name": "Malloc1" 00:17:56.392 } 00:17:56.392 ]' 00:17:56.392 12:20:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:17:56.651 /dev/nbd1' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:17:56.651 /dev/nbd1' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest bs=4096 count=256 00:17:56.651 256+0 records in 00:17:56.651 256+0 records out 00:17:56.651 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0063852 s, 164 MB/s 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:17:56.651 256+0 records in 00:17:56.651 256+0 records out 00:17:56.651 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211159 s, 49.7 MB/s 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:17:56.651 256+0 records in 00:17:56.651 256+0 records out 00:17:56.651 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0284508 s, 36.9 MB/s 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:17:56.651 12:20:20 
event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd0 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd1 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:17:56.651 12:20:20 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:17:56.910 12:20:20 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:17:56.911 12:20:20 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:17:57.169 12:20:20 event.app_repeat -- 
bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:17:57.169 12:20:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:17:57.428 12:20:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:17:57.428 12:20:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:17:57.428 12:20:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:17:57.735 12:20:21 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:17:57.735 12:20:21 event.app_repeat -- event/event.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:17:57.735 12:20:21 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:17:58.302 [2024-06-07 12:20:21.684550] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:17:58.302 [2024-06-07 12:20:21.775496] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:17:58.302 [2024-06-07 12:20:21.775500] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:58.302 [2024-06-07 12:20:21.854975] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:17:58.302 [2024-06-07 12:20:21.857632] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:18:00.832 12:20:24 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:18:00.832 12:20:24 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:18:00.832 spdk_app_start Round 1 00:18:00.832 12:20:24 event.app_repeat -- event/event.sh@25 -- # waitforlisten 192375 /var/tmp/spdk-nbd.sock 00:18:00.832 12:20:24 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 192375 ']' 00:18:00.832 12:20:24 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:18:00.832 12:20:24 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:00.832 12:20:24 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:18:00.832 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
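
The round that just completed boils down to the nbd_rpc_data_verify flow: export Malloc0 and Malloc1 as /dev/nbd0 and /dev/nbd1, push the same 1 MiB of random data through both devices, and compare it back byte for byte. Stripped to the commands actually traced above (a mktemp path stands in for the repo-local nbdrandtest file):

    tmp=$(mktemp)   # the suite uses test/event/nbdrandtest
    dd if=/dev/urandom of="$tmp" bs=4096 count=256
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct
        cmp -b -n 1M "$tmp" "$nbd"   # any mismatch fails the round
    done
    rm "$tmp"
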
00:18:00.832 12:20:24 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:00.832 12:20:24 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:18:01.089 12:20:24 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:01.089 12:20:24 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:18:01.089 12:20:24 event.app_repeat -- event/event.sh@27 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:18:01.347 Malloc0 00:18:01.606 12:20:25 event.app_repeat -- event/event.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:18:01.865 Malloc1 00:18:01.865 12:20:25 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:01.865 12:20:25 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:18:02.123 /dev/nbd0 00:18:02.123 12:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:02.123 12:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:18:02.123 1+0 records in 00:18:02.123 1+0 records out 
00:18:02.123 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310177 s, 13.2 MB/s 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:18:02.123 12:20:25 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:18:02.123 12:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:02.123 12:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:02.123 12:20:25 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:18:02.382 /dev/nbd1 00:18:02.382 12:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:02.382 12:20:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:18:02.382 1+0 records in 00:18:02.382 1+0 records out 00:18:02.382 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000563985 s, 7.3 MB/s 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:18:02.382 12:20:25 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:18:02.382 12:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:02.382 12:20:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:02.382 12:20:25 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:18:02.382 12:20:25 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:02.382 12:20:25 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:18:02.641 { 00:18:02.641 "nbd_device": "/dev/nbd0", 00:18:02.641 "bdev_name": "Malloc0" 00:18:02.641 }, 00:18:02.641 { 00:18:02.641 "nbd_device": "/dev/nbd1", 00:18:02.641 "bdev_name": "Malloc1" 00:18:02.641 } 
00:18:02.641 ]' 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:18:02.641 { 00:18:02.641 "nbd_device": "/dev/nbd0", 00:18:02.641 "bdev_name": "Malloc0" 00:18:02.641 }, 00:18:02.641 { 00:18:02.641 "nbd_device": "/dev/nbd1", 00:18:02.641 "bdev_name": "Malloc1" 00:18:02.641 } 00:18:02.641 ]' 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:18:02.641 /dev/nbd1' 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:18:02.641 /dev/nbd1' 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest bs=4096 count=256 00:18:02.641 256+0 records in 00:18:02.641 256+0 records out 00:18:02.641 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00828049 s, 127 MB/s 00:18:02.641 12:20:26 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:18:02.642 12:20:26 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:18:02.642 256+0 records in 00:18:02.642 256+0 records out 00:18:02.642 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0232891 s, 45.0 MB/s 00:18:02.642 12:20:26 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:18:02.642 12:20:26 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:18:02.901 256+0 records in 00:18:02.901 256+0 records out 00:18:02.901 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0268516 s, 39.1 MB/s 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:18:02.901 12:20:26 event.app_repeat -- 
bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd0 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd1 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:02.901 12:20:26 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:03.159 12:20:26 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:03.418 12:20:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@64 
-- # jq -r '.[] | .nbd_device' 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:18:03.676 12:20:27 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:18:03.676 12:20:27 event.app_repeat -- event/event.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:18:03.935 12:20:27 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:18:04.193 [2024-06-07 12:20:27.788676] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:18:04.452 [2024-06-07 12:20:27.879763] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:18:04.452 [2024-06-07 12:20:27.879765] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:04.452 [2024-06-07 12:20:27.960702] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:18:04.452 [2024-06-07 12:20:27.961109] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:18:06.985 12:20:30 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:18:06.985 12:20:30 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:18:06.985 spdk_app_start Round 2 00:18:06.985 12:20:30 event.app_repeat -- event/event.sh@25 -- # waitforlisten 192375 /var/tmp/spdk-nbd.sock 00:18:06.985 12:20:30 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 192375 ']' 00:18:06.985 12:20:30 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:18:06.985 12:20:30 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:06.985 12:20:30 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:18:06.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
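[Annotation] The stretch above closes one full app_repeat iteration: nbd_get_disks comes back empty (count=0), spdk_kill_instance SIGTERM is sent over the RPC socket, the script sleeps three seconds, and waitforlisten then blocks until the relaunched iteration answers on the same socket. A minimal sketch of that stop/wait handshake, using the rpc.py and socket paths shown in the log; polling rpc_get_methods as the readiness probe is an assumption, not something this output confirms:

    #!/bin/bash
    RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-nbd.sock

    # Ask the running iteration to shut itself down cleanly.
    "$RPC" -s "$SOCK" spdk_kill_instance SIGTERM
    sleep 3

    # Block until the next iteration listens again; any cheap RPC
    # (rpc_get_methods here, by assumption) doubles as a liveness probe.
    until "$RPC" -s "$SOCK" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done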
00:18:06.985 12:20:30 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:06.985 12:20:30 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:18:07.244 12:20:30 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:07.244 12:20:30 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:18:07.244 12:20:30 event.app_repeat -- event/event.sh@27 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:18:07.502 Malloc0 00:18:07.503 12:20:31 event.app_repeat -- event/event.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:18:07.761 Malloc1 00:18:07.761 12:20:31 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:07.761 12:20:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:18:08.020 /dev/nbd0 00:18:08.020 12:20:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:08.020 12:20:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:18:08.020 1+0 records in 00:18:08.020 1+0 records out 
00:18:08.020 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000382676 s, 10.7 MB/s 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:18:08.020 12:20:31 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:18:08.020 12:20:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:08.020 12:20:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:08.020 12:20:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:18:08.278 /dev/nbd1 00:18:08.278 12:20:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:08.278 12:20:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:08.278 12:20:31 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:18:08.537 12:20:31 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:18:08.538 1+0 records in 00:18:08.538 1+0 records out 00:18:08.538 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0005686 s, 7.2 MB/s 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:18:08.538 12:20:31 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:18:08.538 12:20:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:08.538 12:20:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:08.538 12:20:31 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:18:08.538 12:20:31 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:08.538 12:20:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:18:08.797 { 00:18:08.797 "nbd_device": "/dev/nbd0", 00:18:08.797 "bdev_name": "Malloc0" 00:18:08.797 }, 00:18:08.797 { 00:18:08.797 "nbd_device": "/dev/nbd1", 00:18:08.797 "bdev_name": "Malloc1" 00:18:08.797 } 00:18:08.797 
]' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:18:08.797 { 00:18:08.797 "nbd_device": "/dev/nbd0", 00:18:08.797 "bdev_name": "Malloc0" 00:18:08.797 }, 00:18:08.797 { 00:18:08.797 "nbd_device": "/dev/nbd1", 00:18:08.797 "bdev_name": "Malloc1" 00:18:08.797 } 00:18:08.797 ]' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:18:08.797 /dev/nbd1' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:18:08.797 /dev/nbd1' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest bs=4096 count=256 00:18:08.797 256+0 records in 00:18:08.797 256+0 records out 00:18:08.797 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00702695 s, 149 MB/s 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:18:08.797 256+0 records in 00:18:08.797 256+0 records out 00:18:08.797 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0260261 s, 40.3 MB/s 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:18:08.797 256+0 records in 00:18:08.797 256+0 records out 00:18:08.797 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198235 s, 52.9 MB/s 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i 
in "${nbd_list[@]}" 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd0 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd1 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:08.797 12:20:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:09.371 12:20:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:18:09.673 12:20:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 
00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:18:09.930 12:20:33 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:18:09.930 12:20:33 event.app_repeat -- event/event.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:18:10.188 12:20:33 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:18:10.446 [2024-06-07 12:20:34.053539] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:18:10.703 [2024-06-07 12:20:34.147593] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:18:10.703 [2024-06-07 12:20:34.147597] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:10.703 [2024-06-07 12:20:34.230331] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:18:10.703 [2024-06-07 12:20:34.230618] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:18:13.244 12:20:36 event.app_repeat -- event/event.sh@38 -- # waitforlisten 192375 /var/tmp/spdk-nbd.sock 00:18:13.244 12:20:36 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 192375 ']' 00:18:13.244 12:20:36 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:18:13.244 12:20:36 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:13.244 12:20:36 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:18:13.244 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
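[Annotation] Attach and detach both gate on /proc/partitions: waitfornbd spins until the nbdX entry appears and then proves it readable with one direct 4 KiB read, while waitfornbd_exit, logged during the teardown just above, spins until the entry disappears after nbd_stop_disk. A sketch of the exit wait with the same 20-probe budget visible in the xtrace; the delay between probes is an assumption, since the log does not show one:

    #!/bin/bash
    # Wait for an NBD device to vanish from /proc/partitions after
    # nbd_stop_disk; give up after 20 probes.
    waitfornbd_exit() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions || return 0  # gone
            sleep 0.1  # assumed back-off; not shown in the log
        done
        return 1
    }

    waitfornbd_exit nbd0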
00:18:13.244 12:20:36 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:13.244 12:20:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:18:13.503 12:20:36 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:13.503 12:20:36 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:18:13.503 12:20:36 event.app_repeat -- event/event.sh@39 -- # killprocess 192375 00:18:13.503 12:20:36 event.app_repeat -- common/autotest_common.sh@949 -- # '[' -z 192375 ']' 00:18:13.503 12:20:36 event.app_repeat -- common/autotest_common.sh@953 -- # kill -0 192375 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@954 -- # uname 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 192375 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 192375' 00:18:13.503 killing process with pid 192375 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@968 -- # kill 192375 00:18:13.503 12:20:37 event.app_repeat -- common/autotest_common.sh@973 -- # wait 192375 00:18:13.760 spdk_app_start is called in Round 0. 00:18:13.760 Shutdown signal received, stop current app iteration 00:18:13.760 Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 reinitialization... 00:18:13.760 spdk_app_start is called in Round 1. 00:18:13.760 Shutdown signal received, stop current app iteration 00:18:13.760 Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 reinitialization... 00:18:13.760 spdk_app_start is called in Round 2. 00:18:13.760 Shutdown signal received, stop current app iteration 00:18:13.760 Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 reinitialization... 00:18:13.760 spdk_app_start is called in Round 3. 00:18:13.760 Shutdown signal received, stop current app iteration 00:18:13.760 12:20:37 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:18:13.760 12:20:37 event.app_repeat -- event/event.sh@42 -- # return 0 00:18:13.760 00:18:13.760 real 0m19.693s 00:18:13.760 user 0m42.784s 00:18:13.760 sys 0m4.188s 00:18:13.760 12:20:37 event.app_repeat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:13.760 12:20:37 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:18:13.760 ************************************ 00:18:13.760 END TEST app_repeat 00:18:13.760 ************************************ 00:18:14.018 12:20:37 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:18:14.018 12:20:37 event -- event/event.sh@55 -- # run_test cpu_locks /home/vagrant/spdk_repo/spdk/test/event/cpu_locks.sh 00:18:14.018 12:20:37 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:14.018 12:20:37 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:14.018 12:20:37 event -- common/autotest_common.sh@10 -- # set +x 00:18:14.018 ************************************ 00:18:14.018 START TEST cpu_locks 00:18:14.018 ************************************ 00:18:14.018 12:20:37 event.cpu_locks -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/event/cpu_locks.sh 00:18:14.018 * Looking for test storage... 
00:18:14.018 * Found test storage at /home/vagrant/spdk_repo/spdk/test/event 00:18:14.018 12:20:37 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:18:14.018 12:20:37 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:18:14.018 12:20:37 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:18:14.018 12:20:37 event.cpu_locks -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:18:14.018 12:20:37 event.cpu_locks -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:14.018 12:20:37 event.cpu_locks -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:14.018 12:20:37 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:14.018 ************************************ 00:18:14.018 START TEST default_locks 00:18:14.018 ************************************ 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- common/autotest_common.sh@1124 -- # default_locks 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=192853 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 192853 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- common/autotest_common.sh@830 -- # '[' -z 192853 ']' 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:14.018 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:14.018 12:20:37 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:18:14.018 [2024-06-07 12:20:37.584443] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:18:14.018 [2024-06-07 12:20:37.585648] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192853 ] 00:18:14.277 [2024-06-07 12:20:37.723674] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:14.277 [2024-06-07 12:20:37.822157] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:15.233 12:20:38 event.cpu_locks.default_locks -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:15.233 12:20:38 event.cpu_locks.default_locks -- common/autotest_common.sh@863 -- # return 0 00:18:15.233 12:20:38 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 192853 00:18:15.233 12:20:38 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 192853 00:18:15.233 12:20:38 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:18:15.491 12:20:39 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 192853 00:18:15.491 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@949 -- # '[' -z 192853 ']' 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@953 -- # kill -0 192853 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # uname 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 192853 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@967 -- # echo 'killing process with pid 192853' 00:18:15.492 killing process with pid 192853 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@968 -- # kill 192853 00:18:15.492 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # wait 192853 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 192853 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@649 -- # local es=0 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@651 -- # valid_exec_arg waitforlisten 192853 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@637 -- # local arg=waitforlisten 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@641 -- # type -t waitforlisten 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # waitforlisten 192853 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@830 -- # '[' -z 192853 ']' 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:16.424 12:20:39 
event.cpu_locks.default_locks -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:16.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:18:16.424 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 845: kill: (192853) - No such process 00:18:16.424 ERROR: process (pid: 192853) is no longer running 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@863 -- # return 1 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # es=1 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:18:16.424 00:18:16.424 real 0m2.200s 00:18:16.424 user 0m2.130s 00:18:16.424 sys 0m0.791s 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:16.424 12:20:39 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:18:16.424 ************************************ 00:18:16.424 END TEST default_locks 00:18:16.424 ************************************ 00:18:16.424 12:20:39 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:18:16.424 12:20:39 event.cpu_locks -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:16.424 12:20:39 event.cpu_locks -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:16.424 12:20:39 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:16.424 ************************************ 00:18:16.424 START TEST default_locks_via_rpc 00:18:16.424 ************************************ 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1124 -- # default_locks_via_rpc 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=192913 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 192913 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@830 -- # '[' -z 192913 ']' 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@837 -- # 
echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:16.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:16.424 12:20:39 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:16.424 [2024-06-07 12:20:39.845370] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:16.424 [2024-06-07 12:20:39.845996] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192913 ] 00:18:16.424 [2024-06-07 12:20:39.987826] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:16.682 [2024-06-07 12:20:40.093076] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:17.247 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:17.248 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@863 -- # return 0 00:18:17.248 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:18:17.248 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:18:17.248 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:17.248 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:18:17.248 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 192913 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 192913 00:18:17.506 12:20:40 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 192913 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@949 -- # '[' -z 192913 ']' 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@953 -- # kill -0 192913 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # uname 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 192913 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 192913' 00:18:18.071 killing process with pid 192913 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@968 -- # kill 192913 00:18:18.071 12:20:41 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # wait 192913 00:18:18.636 00:18:18.636 real 0m2.356s 00:18:18.636 user 0m2.384s 00:18:18.636 sys 0m0.771s 00:18:18.636 12:20:42 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:18.636 12:20:42 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:18.636 ************************************ 00:18:18.636 END TEST default_locks_via_rpc 00:18:18.636 ************************************ 00:18:18.636 12:20:42 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:18:18.636 12:20:42 event.cpu_locks -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:18.636 12:20:42 event.cpu_locks -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:18.636 12:20:42 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:18.636 ************************************ 00:18:18.636 START TEST non_locking_app_on_locked_coremask 00:18:18.636 ************************************ 00:18:18.636 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1124 -- # non_locking_app_on_locked_coremask 00:18:18.636 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=192973 00:18:18.636 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:18:18.636 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 192973 /var/tmp/spdk.sock 00:18:18.636 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@830 -- # '[' -z 192973 ']' 00:18:18.637 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:18.637 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:18.637 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:18.637 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:18.637 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:18.637 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:18.637 [2024-06-07 12:20:42.266100] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:18:18.637 [2024-06-07 12:20:42.267044] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192973 ] 00:18:18.894 [2024-06-07 12:20:42.415465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:18.894 [2024-06-07 12:20:42.522647] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:19.458 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:19.458 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@863 -- # return 0 00:18:19.458 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=192985 00:18:19.458 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:18:19.458 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 192985 /var/tmp/spdk2.sock 00:18:19.458 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@830 -- # '[' -z 192985 ']' 00:18:19.459 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk2.sock 00:18:19.459 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:19.459 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:18:19.459 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:18:19.459 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:19.459 12:20:42 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:19.459 [2024-06-07 12:20:42.929999] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:19.459 [2024-06-07 12:20:42.930825] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192985 ] 00:18:19.459 [2024-06-07 12:20:43.081665] app.c: 906:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
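[Annotation] non_locking_app_on_locked_coremask pairs two targets on one core mask: the first spdk_tgt claims the core 0 lock as usual, and the second comes up with --disable-cpumask-locks plus its own RPC socket, which is why the log prints 'CPU core locks deactivated.' instead of a claim error. A minimal reproduction of that launch pattern, using the binary path from the log; readiness waits and cleanup are omitted for brevity:

    #!/bin/bash
    TGT=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt

    # First instance takes the per-core lock for core 0.
    "$TGT" -m 0x1 &
    pid1=$!

    # Second instance shares core 0 by skipping the lock claim, and uses
    # a second RPC socket so the two daemons do not collide.
    "$TGT" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
    pid2=$!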
00:18:19.459 [2024-06-07 12:20:43.081823] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:19.715 [2024-06-07 12:20:43.304600] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:20.647 12:20:44 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:20.647 12:20:44 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@863 -- # return 0 00:18:20.647 12:20:44 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 192973 00:18:20.647 12:20:44 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 192973 00:18:20.647 12:20:44 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 192973 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@949 -- # '[' -z 192973 ']' 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@953 -- # kill -0 192973 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # uname 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 192973 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@967 -- # echo 'killing process with pid 192973' 00:18:21.578 killing process with pid 192973 00:18:21.578 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # kill 192973 00:18:21.579 12:20:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # wait 192973 00:18:22.953 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 192985 00:18:22.953 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@949 -- # '[' -z 192985 ']' 00:18:22.953 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@953 -- # kill -0 192985 00:18:22.954 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # uname 00:18:22.954 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:22.954 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 192985 00:18:22.954 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:22.954 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:22.954 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@967 -- # echo 'killing process with pid 192985' 00:18:22.954 killing process with pid 192985 00:18:22.954 
12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # kill 192985 00:18:22.954 12:20:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # wait 192985 00:18:23.579 00:18:23.579 real 0m4.817s 00:18:23.579 user 0m5.012s 00:18:23.579 sys 0m1.485s 00:18:23.579 12:20:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:23.579 12:20:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:23.579 ************************************ 00:18:23.579 END TEST non_locking_app_on_locked_coremask 00:18:23.579 ************************************ 00:18:23.579 12:20:47 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:18:23.579 12:20:47 event.cpu_locks -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:23.579 12:20:47 event.cpu_locks -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:23.579 12:20:47 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:23.579 ************************************ 00:18:23.579 START TEST locking_app_on_unlocked_coremask 00:18:23.579 ************************************ 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1124 -- # locking_app_on_unlocked_coremask 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=193073 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 193073 /var/tmp/spdk.sock 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@830 -- # '[' -z 193073 ']' 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:23.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:23.579 12:20:47 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:23.579 [2024-06-07 12:20:47.148076] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:23.579 [2024-06-07 12:20:47.148807] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193073 ] 00:18:23.836 [2024-06-07 12:20:47.312675] app.c: 906:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
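[Annotation] Every pass/fail decision in these lock tests reduces to locks_exist, which asks lslocks whether the target pid holds an spdk_cpu_lock file lock. A sketch of that check, assembled from the lslocks/grep pair that recurs throughout the log; the lock file's on-disk path never appears here, so none is assumed:

    #!/bin/bash
    # Return 0 if the given pid holds an SPDK per-core lock file.
    locks_exist() {
        local pid=$1
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }

    locks_exist 193073 && echo "pid 193073 holds a core lock"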
00:18:23.836 [2024-06-07 12:20:47.312974] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:23.836 [2024-06-07 12:20:47.407612] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:24.768 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@863 -- # return 0 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=193094 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 193094 /var/tmp/spdk2.sock 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@830 -- # '[' -z 193094 ']' 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk2.sock 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:18:24.769 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:24.769 12:20:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:24.769 [2024-06-07 12:20:48.230988] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:18:24.769 [2024-06-07 12:20:48.231520] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193094 ] 00:18:24.769 [2024-06-07 12:20:48.368215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:25.026 [2024-06-07 12:20:48.554902] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:25.961 12:20:49 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:25.961 12:20:49 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@863 -- # return 0 00:18:25.961 12:20:49 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 193094 00:18:25.961 12:20:49 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 193094 00:18:25.961 12:20:49 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:18:26.527 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 193073 00:18:26.527 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@949 -- # '[' -z 193073 ']' 00:18:26.527 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@953 -- # kill -0 193073 00:18:26.527 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # uname 00:18:26.527 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:26.527 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 193073 00:18:26.796 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:26.796 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:26.796 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@967 -- # echo 'killing process with pid 193073' 00:18:26.796 killing process with pid 193073 00:18:26.796 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # kill 193073 00:18:26.796 12:20:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # wait 193073 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 193094 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@949 -- # '[' -z 193094 ']' 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@953 -- # kill -0 193094 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # uname 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 193094 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- 
common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@967 -- # echo 'killing process with pid 193094' 00:18:28.245 killing process with pid 193094 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # kill 193094 00:18:28.245 12:20:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # wait 193094 00:18:28.504 00:18:28.504 real 0m5.012s 00:18:28.504 user 0m5.202s 00:18:28.504 sys 0m1.523s 00:18:28.504 12:20:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:28.504 12:20:52 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:28.504 ************************************ 00:18:28.504 END TEST locking_app_on_unlocked_coremask 00:18:28.504 ************************************ 00:18:28.764 12:20:52 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:18:28.764 12:20:52 event.cpu_locks -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:28.764 12:20:52 event.cpu_locks -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:28.764 12:20:52 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:28.764 ************************************ 00:18:28.764 START TEST locking_app_on_locked_coremask 00:18:28.764 ************************************ 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1124 -- # locking_app_on_locked_coremask 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=193180 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 193180 /var/tmp/spdk.sock 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@830 -- # '[' -z 193180 ']' 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:28.764 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:28.764 12:20:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:28.764 [2024-06-07 12:20:52.199213] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:18:28.764 [2024-06-07 12:20:52.200017] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193180 ] 00:18:28.764 [2024-06-07 12:20:52.331668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:29.022 [2024-06-07 12:20:52.424624] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@863 -- # return 0 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=193201 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 193201 /var/tmp/spdk2.sock 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@649 -- # local es=0 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@651 -- # valid_exec_arg waitforlisten 193201 /var/tmp/spdk2.sock 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@637 -- # local arg=waitforlisten 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@641 -- # type -t waitforlisten 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # waitforlisten 193201 /var/tmp/spdk2.sock 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@830 -- # '[' -z 193201 ']' 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk2.sock 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:18:29.591 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:29.591 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:29.591 [2024-06-07 12:20:53.202725] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
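The second spdk_tgt launched above reuses cpumask 0x1 on a separate RPC socket, so it is expected to lose the race for core 0. Running the equivalent by hand (paths as in the trace) should reproduce the failure the trace shows next:

/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock
# expected outcome while pid 193180 holds core 0:
#   claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 193180 has claimed it.
#   spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting.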
00:18:29.591 [2024-06-07 12:20:53.203329] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193201 ] 00:18:29.850 [2024-06-07 12:20:53.357299] app.c: 771:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 193180 has claimed it. 00:18:29.850 [2024-06-07 12:20:53.357441] app.c: 902:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:18:30.418 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 845: kill: (193201) - No such process 00:18:30.418 ERROR: process (pid: 193201) is no longer running 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@863 -- # return 1 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # es=1 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 193180 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 193180 00:18:30.418 12:20:53 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 193180 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@949 -- # '[' -z 193180 ']' 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@953 -- # kill -0 193180 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # uname 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 193180 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@967 -- # echo 'killing process with pid 193180' 00:18:30.986 killing process with pid 193180 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # kill 193180 00:18:30.986 12:20:54 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # wait 193180 00:18:31.553 00:18:31.553 real 0m2.927s 00:18:31.553 user 0m3.131s 00:18:31.553 sys 0m0.871s 00:18:31.553 12:20:55 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:31.553 12:20:55 event.cpu_locks.locking_app_on_locked_coremask -- 
common/autotest_common.sh@10 -- # set +x 00:18:31.553 ************************************ 00:18:31.553 END TEST locking_app_on_locked_coremask 00:18:31.553 ************************************ 00:18:31.553 12:20:55 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:18:31.553 12:20:55 event.cpu_locks -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:31.553 12:20:55 event.cpu_locks -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:31.553 12:20:55 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:31.553 ************************************ 00:18:31.553 START TEST locking_overlapped_coremask 00:18:31.553 ************************************ 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1124 -- # locking_overlapped_coremask 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=193258 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 193258 /var/tmp/spdk.sock 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x7 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@830 -- # '[' -z 193258 ']' 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:31.553 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:31.553 12:20:55 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:31.812 [2024-06-07 12:20:55.202964] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
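For the overlapped test, the first target takes cpumask 0x7 (cores 0-2) and the second will take 0x1c (cores 2-4); the collision the test relies on is just the bitwise AND of the two masks:

printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))   # -> 0x4, i.e. bit 2 set: both masks claim core 2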
00:18:31.812 [2024-06-07 12:20:55.203525] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193258 ] 00:18:31.812 [2024-06-07 12:20:55.365104] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:18:32.072 [2024-06-07 12:20:55.464207] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:18:32.072 [2024-06-07 12:20:55.464385] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:32.072 [2024-06-07 12:20:55.464385] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@863 -- # return 0 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=193281 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 193281 /var/tmp/spdk2.sock 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@649 -- # local es=0 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@651 -- # valid_exec_arg waitforlisten 193281 /var/tmp/spdk2.sock 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@637 -- # local arg=waitforlisten 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@641 -- # type -t waitforlisten 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # waitforlisten 193281 /var/tmp/spdk2.sock 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@830 -- # '[' -z 193281 ']' 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk2.sock 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:18:32.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:32.639 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:32.639 [2024-06-07 12:20:56.282754] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:18:32.639 [2024-06-07 12:20:56.283277] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193281 ] 00:18:32.930 [2024-06-07 12:20:56.448693] app.c: 771:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 193258 has claimed it. 00:18:32.930 [2024-06-07 12:20:56.448792] app.c: 902:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:18:33.552 ERROR: process (pid: 193281) is no longer running 00:18:33.552 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 845: kill: (193281) - No such process 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@863 -- # return 1 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # es=1 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 193258 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@949 -- # '[' -z 193258 ']' 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@953 -- # kill -0 193258 00:18:33.552 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # uname 00:18:33.553 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:33.553 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 193258 00:18:33.553 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:33.553 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:33.553 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@967 -- # echo 'killing process with pid 193258' 00:18:33.553 killing process with pid 193258 00:18:33.553 12:20:56 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@968 -- # kill 193258 00:18:33.553 12:20:56 event.cpu_locks.locking_overlapped_coremask -- 
common/autotest_common.sh@973 -- # wait 193258 00:18:34.120 00:18:34.120 real 0m2.445s 00:18:34.120 user 0m6.437s 00:18:34.120 sys 0m0.653s 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:18:34.120 ************************************ 00:18:34.120 END TEST locking_overlapped_coremask 00:18:34.120 ************************************ 00:18:34.120 12:20:57 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:18:34.120 12:20:57 event.cpu_locks -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:34.120 12:20:57 event.cpu_locks -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:34.120 12:20:57 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:34.120 ************************************ 00:18:34.120 START TEST locking_overlapped_coremask_via_rpc 00:18:34.120 ************************************ 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1124 -- # locking_overlapped_coremask_via_rpc 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=193326 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 193326 /var/tmp/spdk.sock 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@830 -- # '[' -z 193326 ']' 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:34.120 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:34.120 12:20:57 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:34.120 [2024-06-07 12:20:57.721287] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:34.120 [2024-06-07 12:20:57.721812] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193326 ] 00:18:34.379 [2024-06-07 12:20:57.884798] app.c: 906:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
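Unlike the previous cases, both targets in the via_rpc variant start with --disable-cpumask-locks, which is why the trace reports "CPU core locks deactivated." instead of a startup collision; locking is deferred until the test enables it over JSON-RPC. Illustrative launch flags, as seen in the trace:

spdk_tgt -m 0x7  --disable-cpumask-locks                         # no core locks taken at startup
spdk_tgt -m 0x1c --disable-cpumask-locks -r /var/tmp/spdk2.sock  # overlaps on core 2, still starts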
00:18:34.379 [2024-06-07 12:20:57.885047] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:18:34.379 [2024-06-07 12:20:57.977866] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:18:34.379 [2024-06-07 12:20:57.978000] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:34.379 [2024-06-07 12:20:57.977998] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@863 -- # return 0 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=193349 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 193349 /var/tmp/spdk2.sock 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@830 -- # '[' -z 193349 ']' 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk2.sock 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:18:35.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:35.314 12:20:58 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:35.314 [2024-06-07 12:20:58.731220] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:35.314 [2024-06-07 12:20:58.732254] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193349 ] 00:18:35.314 [2024-06-07 12:20:58.903775] app.c: 906:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:18:35.314 [2024-06-07 12:20:58.903852] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:18:35.573 [2024-06-07 12:20:59.086817] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:18:35.573 [2024-06-07 12:20:59.097382] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:18:35.573 [2024-06-07 12:20:59.097385] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 4 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@863 -- # return 0 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@649 -- # local es=0 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@651 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@637 -- # local arg=rpc_cmd 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@641 -- # type -t rpc_cmd 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:36.168 [2024-06-07 12:20:59.802301] app.c: 771:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 193326 has claimed it. 
00:18:36.168 request: 00:18:36.168 { 00:18:36.168 "method": "framework_enable_cpumask_locks", 00:18:36.168 "req_id": 1 00:18:36.168 } 00:18:36.168 Got JSON-RPC error response 00:18:36.168 response: 00:18:36.168 { 00:18:36.168 "code": -32603, 00:18:36.168 "message": "Failed to claim CPU core: 2" 00:18:36.168 } 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # es=1 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 193326 /var/tmp/spdk.sock 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@830 -- # '[' -z 193326 ']' 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:36.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:36.168 12:20:59 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@863 -- # return 0 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 193349 /var/tmp/spdk2.sock 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@830 -- # '[' -z 193349 ']' 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk2.sock 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:18:36.733 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
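The JSON-RPC exchange above can be reproduced by hand against the second target's socket; the rpc.py path below is the usual SPDK location and is an assumption, not part of the trace:

/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
# With pid 193326 already holding the core-2 lock file, this returns the
# error shown above: code -32603, "Failed to claim CPU core: 2".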
00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:36.733 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@863 -- # return 0 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:18:36.990 00:18:36.990 real 0m2.778s 00:18:36.990 user 0m1.459s 00:18:36.990 sys 0m0.244s 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:36.990 12:21:00 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:18:36.990 ************************************ 00:18:36.990 END TEST locking_overlapped_coremask_via_rpc 00:18:36.990 ************************************ 00:18:36.990 12:21:00 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:18:36.990 12:21:00 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 193326 ]] 00:18:36.990 12:21:00 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 193326 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@949 -- # '[' -z 193326 ']' 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@953 -- # kill -0 193326 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@954 -- # uname 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 193326 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:36.990 12:21:00 event.cpu_locks -- common/autotest_common.sh@967 -- # echo 'killing process with pid 193326' 00:18:36.991 killing process with pid 193326 00:18:36.991 12:21:00 event.cpu_locks -- common/autotest_common.sh@968 -- # kill 193326 00:18:36.991 12:21:00 event.cpu_locks -- common/autotest_common.sh@973 -- # wait 193326 00:18:37.925 12:21:01 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 193349 ]] 00:18:37.925 12:21:01 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 193349 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@949 -- # '[' -z 193349 ']' 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@953 -- # kill -0 193349 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@954 -- # uname 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 
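check_remaining_locks, traced here and in the previous test, compares the lock files actually on disk with the set expected for cpumask 0x7. A sketch reconstructed from the traced expansions (event/cpu_locks.sh@36-38):

check_remaining_locks() {
    locks=(/var/tmp/spdk_cpu_lock_*)                    # what is really present
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})  # cores 0-2 for mask 0x7
    [[ "${locks[*]}" == "${locks_expected[*]}" ]]
}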
00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 193349 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@955 -- # process_name=reactor_2 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' reactor_2 = sudo ']' 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@967 -- # echo 'killing process with pid 193349' 00:18:37.925 killing process with pid 193349 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@968 -- # kill 193349 00:18:37.925 12:21:01 event.cpu_locks -- common/autotest_common.sh@973 -- # wait 193349 00:18:38.492 12:21:01 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:18:38.492 12:21:01 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:18:38.492 12:21:01 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 193326 ]] 00:18:38.492 12:21:01 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 193326 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@949 -- # '[' -z 193326 ']' 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@953 -- # kill -0 193326 00:18:38.492 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 953: kill: (193326) - No such process 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@976 -- # echo 'Process with pid 193326 is not found' 00:18:38.492 Process with pid 193326 is not found 00:18:38.492 12:21:01 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 193349 ]] 00:18:38.492 12:21:01 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 193349 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@949 -- # '[' -z 193349 ']' 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@953 -- # kill -0 193349 00:18:38.492 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 953: kill: (193349) - No such process 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@976 -- # echo 'Process with pid 193349 is not found' 00:18:38.492 Process with pid 193349 is not found 00:18:38.492 12:21:01 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:18:38.492 00:18:38.492 real 0m24.520s 00:18:38.492 user 0m40.817s 00:18:38.492 sys 0m7.594s 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:38.492 12:21:01 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:18:38.492 ************************************ 00:18:38.492 END TEST cpu_locks 00:18:38.492 ************************************ 00:18:38.492 00:18:38.492 real 0m53.786s 00:18:38.492 user 1m39.115s 00:18:38.492 sys 0m12.989s 00:18:38.492 12:21:01 event -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:38.492 12:21:01 event -- common/autotest_common.sh@10 -- # set +x 00:18:38.492 ************************************ 00:18:38.492 END TEST event 00:18:38.492 ************************************ 00:18:38.492 12:21:02 -- spdk/autotest.sh@182 -- # run_test thread /home/vagrant/spdk_repo/spdk/test/thread/thread.sh 00:18:38.492 12:21:02 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:38.492 12:21:02 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:38.492 12:21:02 -- common/autotest_common.sh@10 -- # set +x 00:18:38.492 ************************************ 00:18:38.492 START TEST thread 00:18:38.492 ************************************ 00:18:38.492 12:21:02 thread -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/thread/thread.sh 00:18:38.782 * Looking for test 
storage... 00:18:38.782 * Found test storage at /home/vagrant/spdk_repo/spdk/test/thread 00:18:38.782 12:21:02 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:18:38.782 12:21:02 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:18:38.782 12:21:02 thread -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:38.782 12:21:02 thread -- common/autotest_common.sh@10 -- # set +x 00:18:38.782 ************************************ 00:18:38.782 START TEST thread_poller_perf 00:18:38.782 ************************************ 00:18:38.782 12:21:02 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:18:38.782 [2024-06-07 12:21:02.193586] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:38.782 [2024-06-07 12:21:02.193992] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193495 ] 00:18:38.782 [2024-06-07 12:21:02.333257] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:39.054 [2024-06-07 12:21:02.424359] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:39.054 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:18:39.989 ====================================== 00:18:39.989 busy:2107663730 (cyc) 00:18:39.989 total_run_count: 1341000 00:18:39.989 tsc_hz: 2100000000 (cyc) 00:18:39.989 ====================================== 00:18:39.989 poller_cost: 1571 (cyc), 748 (nsec) 00:18:39.989 00:18:39.989 real 0m1.405s 00:18:39.989 user 0m1.185s 00:18:39.989 sys 0m0.110s 00:18:39.989 12:21:03 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:39.989 12:21:03 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:18:39.989 ************************************ 00:18:39.989 END TEST thread_poller_perf 00:18:39.989 ************************************ 00:18:39.989 12:21:03 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:18:39.989 12:21:03 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:18:39.989 12:21:03 thread -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:39.989 12:21:03 thread -- common/autotest_common.sh@10 -- # set +x 00:18:40.248 ************************************ 00:18:40.248 START TEST thread_poller_perf 00:18:40.248 ************************************ 00:18:40.248 12:21:03 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:18:40.248 [2024-06-07 12:21:03.671855] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:18:40.248 [2024-06-07 12:21:03.672578] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193533 ] 00:18:40.248 [2024-06-07 12:21:03.832832] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:40.507 [2024-06-07 12:21:03.923267] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:40.507 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:18:41.479 ====================================== 00:18:41.479 busy:2103099158 (cyc) 00:18:41.479 total_run_count: 15807000 00:18:41.479 tsc_hz: 2100000000 (cyc) 00:18:41.479 ====================================== 00:18:41.479 poller_cost: 133 (cyc), 63 (nsec) 00:18:41.479 00:18:41.479 real 0m1.443s 00:18:41.479 user 0m1.204s 00:18:41.479 sys 0m0.127s 00:18:41.479 12:21:05 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:41.479 12:21:05 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:18:41.479 ************************************ 00:18:41.479 END TEST thread_poller_perf 00:18:41.479 ************************************ 00:18:41.754 12:21:05 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:18:41.754 12:21:05 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /home/vagrant/spdk_repo/spdk/test/thread/lock/spdk_lock 00:18:41.754 12:21:05 thread -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:41.754 12:21:05 thread -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:41.754 12:21:05 thread -- common/autotest_common.sh@10 -- # set +x 00:18:41.754 ************************************ 00:18:41.754 START TEST thread_spdk_lock 00:18:41.754 ************************************ 00:18:41.754 12:21:05 thread.thread_spdk_lock -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/thread/lock/spdk_lock 00:18:41.754 [2024-06-07 12:21:05.182816] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
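The poller_cost figures in the two poller_perf runs above are plain averages over the reported TSC counters; checking the arithmetic:

2107663730 cyc / 1341000 polls  ≈ 1571 cyc/poll; 1571 cyc / 2.1 GHz ≈ 748 nsec   (1 µs period)
2103099158 cyc / 15807000 polls ≈  133 cyc/poll;  133 cyc / 2.1 GHz ≈  63 nsec   (0 µs period)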
00:18:41.754 [2024-06-07 12:21:05.183094] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193581 ] 00:18:41.754 [2024-06-07 12:21:05.330581] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:18:42.014 [2024-06-07 12:21:05.445219] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:18:42.014 [2024-06-07 12:21:05.445253] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:42.581 [2024-06-07 12:21:05.930600] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c: 961:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:18:42.581 [2024-06-07 12:21:05.930709] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3072:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:18:42.581 [2024-06-07 12:21:05.930756] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:3027:sspin_stacks_print: *ERROR*: spinlock 0x6d12c0 00:18:42.581 [2024-06-07 12:21:05.932095] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c: 856:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:18:42.581 [2024-06-07 12:21:05.932198] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c:1022:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:18:42.581 [2024-06-07 12:21:05.932255] /home/vagrant/spdk_repo/spdk/lib/thread/thread.c: 856:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:18:42.581 Starting test contend 00:18:42.581 Worker Delay Wait us Hold us Total us 00:18:42.581 0 3 183728 182921 366650 00:18:42.581 1 5 100806 283262 384068 00:18:42.581 PASS test contend 00:18:42.581 Starting test hold_by_poller 00:18:42.581 PASS test hold_by_poller 00:18:42.581 Starting test hold_by_message 00:18:42.581 PASS test hold_by_message 00:18:42.581 /home/vagrant/spdk_repo/spdk/test/thread/lock/spdk_lock summary: 00:18:42.581 100014 assertions passed 00:18:42.581 0 assertions failed 00:18:42.581 00:18:42.581 real 0m0.928s 00:18:42.581 user 0m1.183s 00:18:42.581 sys 0m0.123s 00:18:42.581 12:21:06 thread.thread_spdk_lock -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:42.581 12:21:06 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:18:42.581 ************************************ 00:18:42.581 END TEST thread_spdk_lock 00:18:42.581 ************************************ 00:18:42.581 00:18:42.581 real 0m4.074s 00:18:42.581 user 0m3.675s 00:18:42.581 sys 0m0.554s 00:18:42.581 12:21:06 thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:42.581 12:21:06 thread -- common/autotest_common.sh@10 -- # set +x 00:18:42.581 ************************************ 00:18:42.581 END TEST thread 00:18:42.581 ************************************ 00:18:42.581 12:21:06 -- spdk/autotest.sh@183 -- # run_test accel /home/vagrant/spdk_repo/spdk/test/accel/accel.sh 00:18:42.581 12:21:06 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:18:42.581 12:21:06 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:42.581 12:21:06 -- common/autotest_common.sh@10 -- # set +x 00:18:42.582 
************************************ 00:18:42.582 START TEST accel 00:18:42.582 ************************************ 00:18:42.582 12:21:06 accel -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/accel/accel.sh 00:18:42.840 * Looking for test storage... 00:18:42.840 * Found test storage at /home/vagrant/spdk_repo/spdk/test/accel 00:18:42.840 12:21:06 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:18:42.840 12:21:06 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:18:42.840 12:21:06 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:18:42.840 12:21:06 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=193661 00:18:42.840 12:21:06 accel -- accel/accel.sh@63 -- # waitforlisten 193661 00:18:42.840 12:21:06 accel -- common/autotest_common.sh@830 -- # '[' -z 193661 ']' 00:18:42.840 12:21:06 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:42.840 12:21:06 accel -- accel/accel.sh@61 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:18:42.840 12:21:06 accel -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:42.840 12:21:06 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:42.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:42.840 12:21:06 accel -- accel/accel.sh@61 -- # build_accel_config 00:18:42.840 12:21:06 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:42.840 12:21:06 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:42.840 12:21:06 accel -- common/autotest_common.sh@10 -- # set +x 00:18:42.840 12:21:06 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:42.840 12:21:06 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:42.840 12:21:06 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:42.840 12:21:06 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:42.840 12:21:06 accel -- accel/accel.sh@40 -- # local IFS=, 00:18:42.840 12:21:06 accel -- accel/accel.sh@41 -- # jq -r . 00:18:42.840 [2024-06-07 12:21:06.327592] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:42.840 [2024-06-07 12:21:06.327930] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193661 ] 00:18:42.840 [2024-06-07 12:21:06.472598] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:43.098 [2024-06-07 12:21:06.566019] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:44.034 12:21:07 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:44.034 12:21:07 accel -- common/autotest_common.sh@863 -- # return 0 00:18:44.034 12:21:07 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:18:44.034 12:21:07 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:18:44.034 12:21:07 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:18:44.034 12:21:07 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:18:44.034 12:21:07 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:18:44.034 12:21:07 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:18:44.034 12:21:07 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:18:44.034 12:21:07 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:18:44.034 12:21:07 accel -- common/autotest_common.sh@10 -- # set +x 00:18:44.034 12:21:07 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.034 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.034 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.034 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.035 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.035 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.035 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.035 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.035 
12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.035 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.035 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.035 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.035 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.035 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.035 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.035 12:21:07 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # IFS== 00:18:44.035 12:21:07 accel -- accel/accel.sh@72 -- # read -r opc module 00:18:44.035 12:21:07 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:18:44.035 12:21:07 accel -- accel/accel.sh@75 -- # killprocess 193661 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@949 -- # '[' -z 193661 ']' 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@953 -- # kill -0 193661 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@954 -- # uname 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 193661 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:44.035 killing process with pid 193661 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 193661' 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@968 -- # kill 193661 00:18:44.035 12:21:07 accel -- common/autotest_common.sh@973 -- # wait 193661 00:18:44.602 12:21:08 accel -- accel/accel.sh@76 -- # trap - ERR 00:18:44.602 12:21:08 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:18:44.602 12:21:08 accel -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:18:44.602 12:21:08 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:44.602 12:21:08 accel -- common/autotest_common.sh@10 -- # set +x 00:18:44.602 12:21:08 accel.accel_help -- common/autotest_common.sh@1124 -- # accel_perf -h 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:18:44.602 12:21:08 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
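The long run of traced assignments above (accel/accel.sh@71-73) is a single loop mapping each opcode reported by accel_get_opc_assignments to its module — all 'software' in this run, presumably because no hardware engines are configured on this VM. A sketch reconstructed from the traced commands ($rpc_py points at scripts/rpc.py):

exp_opcs=($("$rpc_py" accel_get_opc_assignments | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))
declare -A expected_opcs
for opc_opt in "${exp_opcs[@]}"; do
    IFS== read -r opc module <<< "$opc_opt"   # split "opcode=module"
    expected_opcs["$opc"]=$module
done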
00:18:44.602 12:21:08 accel.accel_help -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:44.602 12:21:08 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:18:44.602 12:21:08 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:18:44.602 12:21:08 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:18:44.602 12:21:08 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:44.602 12:21:08 accel -- common/autotest_common.sh@10 -- # set +x 00:18:44.602 ************************************ 00:18:44.602 START TEST accel_missing_filename 00:18:44.602 ************************************ 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@649 -- # local es=0 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # type -t accel_perf 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:44.602 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:18:44.602 12:21:08 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:18:44.602 [2024-06-07 12:21:08.181741] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:44.602 [2024-06-07 12:21:08.182154] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193738 ] 00:18:44.860 [2024-06-07 12:21:08.331637] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:44.860 [2024-06-07 12:21:08.438490] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:45.117 [2024-06-07 12:21:08.523265] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:18:45.117 [2024-06-07 12:21:08.656950] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:18:45.375 A filename is required. 
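The es= juggling that follows "A filename is required." is the NOT wrapper normalizing accel_perf's exit status; the steps below are reconstructed to match the traced values (234 -> 106 -> 1) and are a sketch, not the verbatim helper:

es=234                                  # raw exit status from accel_perf
(( es > 128 )) && es=$(( es - 128 ))    # -> 106: statuses above 128 are folded down by 128
case "$es" in *) es=1 ;; esac           # any remaining failure becomes a plain 1
(( !es == 0 ))                          # NOT inverts: the expected failure counts as a pass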
00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # es=234 00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # es=106 00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # case "$es" in 00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@669 -- # es=1 00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:45.375 00:18:45.375 real 0m0.674s 00:18:45.375 user 0m0.404s 00:18:45.375 sys 0m0.213s 00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:45.375 12:21:08 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:18:45.375 ************************************ 00:18:45.375 END TEST accel_missing_filename 00:18:45.375 ************************************ 00:18:45.375 12:21:08 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:18:45.375 12:21:08 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:18:45.375 12:21:08 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:45.375 12:21:08 accel -- common/autotest_common.sh@10 -- # set +x 00:18:45.375 ************************************ 00:18:45.375 START TEST accel_compress_verify 00:18:45.375 ************************************ 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@649 -- # local es=0 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # type -t accel_perf 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:45.375 12:21:08 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:45.375 12:21:08 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:18:45.375 12:21:08 accel.accel_compress_verify -- 
accel/accel.sh@41 -- # jq -r . 00:18:45.375 [2024-06-07 12:21:08.900900] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:45.375 [2024-06-07 12:21:08.902050] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193762 ] 00:18:45.631 [2024-06-07 12:21:09.059932] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:45.631 [2024-06-07 12:21:09.170049] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:45.631 [2024-06-07 12:21:09.256670] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:18:45.889 [2024-06-07 12:21:09.389864] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:18:46.147 00:18:46.147 Compression does not support the verify option, aborting. 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # es=161 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # es=33 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # case "$es" in 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@669 -- # es=1 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:46.147 00:18:46.147 real 0m0.696s 00:18:46.147 user 0m0.430s 00:18:46.147 sys 0m0.201s 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:46.147 12:21:09 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:18:46.147 ************************************ 00:18:46.147 END TEST accel_compress_verify 00:18:46.147 ************************************ 00:18:46.147 12:21:09 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:18:46.147 12:21:09 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:18:46.147 12:21:09 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:46.147 12:21:09 accel -- common/autotest_common.sh@10 -- # set +x 00:18:46.147 ************************************ 00:18:46.147 START TEST accel_wrong_workload 00:18:46.147 ************************************ 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w foobar 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@649 -- # local es=0 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # type -t accel_perf 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w foobar 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:18:46.147 12:21:09 
accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:18:46.147 12:21:09 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:18:46.147 Unsupported workload type: foobar 00:18:46.147 [2024-06-07 12:21:09.635403] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:18:46.147 accel_perf options: 00:18:46.147 [-h help message] 00:18:46.147 [-q queue depth per core] 00:18:46.147 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:18:46.147 [-T number of threads per core 00:18:46.147 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:18:46.147 [-t time in seconds] 00:18:46.147 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:18:46.147 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:18:46.147 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:18:46.147 [-l for compress/decompress workloads, name of uncompressed input file 00:18:46.147 [-S for crc32c workload, use this seed value (default 0) 00:18:46.147 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:18:46.147 [-f for fill workload, use this BYTE value (default 255) 00:18:46.147 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:18:46.147 [-y verify result if this switch is on] 00:18:46.147 [-a tasks to allocate per core (default: same value as -q)] 00:18:46.147 Can be used to spread operations across a wider range of memory. 
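accel_wrong_workload follows the same NOT pattern: -w foobar must fail option parsing (the "Parsing app-specific command line parameter 'w' failed" error above), producing the usage dump just printed. A standalone equivalent:

    # Expect an unknown workload type to be rejected at argument parsing.
    perf=/home/vagrant/spdk_repo/spdk/build/examples/accel_perf
    if "$perf" -t 1 -w foobar; then
        echo "FAIL: unsupported workload type was accepted" >&2
        exit 1
    fi
    echo "OK: -w foobar rejected with usage text"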
00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # es=1 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:46.147 00:18:46.147 real 0m0.048s 00:18:46.147 user 0m0.024s 00:18:46.147 sys 0m0.023s 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:46.147 ************************************ 00:18:46.147 12:21:09 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:18:46.147 END TEST accel_wrong_workload 00:18:46.147 ************************************ 00:18:46.147 12:21:09 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:18:46.147 12:21:09 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:18:46.147 12:21:09 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:46.147 12:21:09 accel -- common/autotest_common.sh@10 -- # set +x 00:18:46.147 ************************************ 00:18:46.147 START TEST accel_negative_buffers 00:18:46.147 ************************************ 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@649 -- # local es=0 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # type -t accel_perf 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:46.147 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w xor -y -x -1 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:18:46.147 12:21:09 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:18:46.147 -x option must be non-negative. 
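accel_negative_buffers checks the xor buffer-count bound: per the usage text above, -x takes the number of xor source buffers (minimum 2), so -1 must be rejected, as the message just printed confirms. Standalone sketch:

    # Expect a negative xor source-buffer count to fail validation.
    perf=/home/vagrant/spdk_repo/spdk/build/examples/accel_perf
    if "$perf" -t 1 -w xor -y -x -1; then
        echo "FAIL: negative buffer count was accepted" >&2
        exit 1
    fi
    echo "OK: -x -1 rejected"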
00:18:46.148 [2024-06-07 12:21:09.726859] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:18:46.148 accel_perf options: 00:18:46.148 [-h help message] 00:18:46.148 [-q queue depth per core] 00:18:46.148 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:18:46.148 [-T number of threads per core 00:18:46.148 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:18:46.148 [-t time in seconds] 00:18:46.148 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:18:46.148 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:18:46.148 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:18:46.148 [-l for compress/decompress workloads, name of uncompressed input file 00:18:46.148 [-S for crc32c workload, use this seed value (default 0) 00:18:46.148 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:18:46.148 [-f for fill workload, use this BYTE value (default 255) 00:18:46.148 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:18:46.148 [-y verify result if this switch is on] 00:18:46.148 [-a tasks to allocate per core (default: same value as -q)] 00:18:46.148 Can be used to spread operations across a wider range of memory. 00:18:46.148 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # es=1 00:18:46.148 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:46.148 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:46.148 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:46.148 00:18:46.148 real 0m0.047s 00:18:46.148 user 0m0.055s 00:18:46.148 sys 0m0.031s 00:18:46.148 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:46.148 12:21:09 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:18:46.148 ************************************ 00:18:46.148 END TEST accel_negative_buffers 00:18:46.148 ************************************ 00:18:46.148 12:21:09 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:18:46.148 12:21:09 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:18:46.148 12:21:09 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:46.148 12:21:09 accel -- common/autotest_common.sh@10 -- # set +x 00:18:46.148 ************************************ 00:18:46.148 START TEST accel_crc32c 00:18:46.148 ************************************ 00:18:46.148 12:21:09 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w crc32c -S 32 -y 00:18:46.148 12:21:09 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:18:46.148 12:21:09 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:18:46.148 12:21:09 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.148 12:21:09 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.148 12:21:09 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:18:46.406 12:21:09 accel.accel_crc32c -- 
accel/accel.sh@12 -- # build_accel_config 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:18:46.406 12:21:09 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:18:46.406 [2024-06-07 12:21:09.821551] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:46.406 [2024-06-07 12:21:09.822274] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193845 ] 00:18:46.406 [2024-06-07 12:21:09.974922] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:46.664 [2024-06-07 12:21:10.092130] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:46.664 12:21:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 
00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:18:48.154 12:21:11 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:18:48.154 00:18:48.155 real 0m1.689s 00:18:48.155 user 0m1.400s 00:18:48.155 sys 0m0.211s 00:18:48.155 12:21:11 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:48.155 12:21:11 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:18:48.155 ************************************ 00:18:48.155 END TEST accel_crc32c 00:18:48.155 ************************************ 00:18:48.155 12:21:11 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:18:48.155 12:21:11 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:18:48.155 12:21:11 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:48.155 12:21:11 accel -- common/autotest_common.sh@10 -- # set +x 00:18:48.155 ************************************ 00:18:48.155 START TEST accel_crc32c_C2 00:18:48.155 ************************************ 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w crc32c -y -C 2 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 
00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:18:48.155 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:18:48.155 [2024-06-07 12:21:11.570777] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:48.155 [2024-06-07 12:21:11.571202] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193896 ] 00:18:48.155 [2024-06-07 12:21:11.708161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:48.414 [2024-06-07 12:21:11.804312] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 
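The long runs of val=/case lines through here are accel_test reading back, one var/val pair at a time over IFS=:, the settings of the run it just performed (opcode, 4096-byte buffers, queue depth 32, one thread, one second, software module) so it can assert on them afterwards; the [[ -n software ]] and [[ -n crc32c ]] checks at the end of each test are asserting exactly those recovered values. A compact sketch of that read-back loop, with a hypothetical inline stream standing in for the real accel_perf output:

    # Sketch of the IFS=: var/val read-back visible in the trace above.
    # The printf stream is a stand-in for real settings output.
    printf '%s\n' 'opc:crc32c' 'module:software' |
    while IFS=: read -r var val; do
        case "$var" in
            opc)    echo "opcode = $val" ;;   # the accel operation that ran
            module) echo "module = $val" ;;   # software vs. a hardware module
        esac
    done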
00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:48.414 12:21:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- 
accel/accel.sh@19 -- # read -r var val 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:49.789 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:18:49.790 00:18:49.790 real 0m1.638s 00:18:49.790 user 0m1.358s 00:18:49.790 sys 0m0.214s 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:49.790 12:21:13 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:18:49.790 ************************************ 00:18:49.790 END TEST accel_crc32c_C2 00:18:49.790 ************************************ 00:18:49.790 12:21:13 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:18:49.790 12:21:13 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:18:49.790 12:21:13 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:49.790 12:21:13 accel -- common/autotest_common.sh@10 -- # set +x 00:18:49.790 ************************************ 00:18:49.790 START TEST accel_copy 00:18:49.790 ************************************ 00:18:49.790 12:21:13 accel.accel_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy -y 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@12 -- # 
build_accel_config 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:18:49.790 12:21:13 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:18:49.790 [2024-06-07 12:21:13.273858] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:49.790 [2024-06-07 12:21:13.274216] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193937 ] 00:18:49.790 [2024-06-07 12:21:13.424418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:50.049 [2024-06-07 12:21:13.524416] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.049 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 
00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:50.050 12:21:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:51.430 12:21:14 accel.accel_copy -- 
accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:18:51.430 12:21:14 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:18:51.430 00:18:51.430 real 0m1.661s 00:18:51.430 user 0m1.378s 00:18:51.430 sys 0m0.213s 00:18:51.430 12:21:14 accel.accel_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:51.430 12:21:14 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:18:51.430 ************************************ 00:18:51.430 END TEST accel_copy 00:18:51.430 ************************************ 00:18:51.430 12:21:14 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:18:51.430 12:21:14 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:18:51.430 12:21:14 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:51.430 12:21:14 accel -- common/autotest_common.sh@10 -- # set +x 00:18:51.430 ************************************ 00:18:51.430 START TEST accel_fill 00:18:51.430 ************************************ 00:18:51.430 12:21:14 accel.accel_fill -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:18:51.430 12:21:14 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 
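accel_fill, which starts here, exercises the fill opcode with the full flag set from its run_test line: per the usage text earlier in this log, -f 128 is the fill byte, -q 64 the queue depth, -a 64 the number of preallocated tasks, and -y enables result verification. The bare invocation being timed is:

    # Fill workload exactly as this test runs it (config fd omitted):
    # 1 second, fill byte 128, queue depth 64, 64 tasks, verify results.
    /home/vagrant/spdk_repo/spdk/build/examples/accel_perf \
        -t 1 -w fill -f 128 -q 64 -a 64 -y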
00:18:51.430 [2024-06-07 12:21:14.999356] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:51.430 [2024-06-07 12:21:14.999643] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193987 ] 00:18:51.689 [2024-06-07 12:21:15.151369] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:51.689 [2024-06-07 12:21:15.246848] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:18:51.948 12:21:15 
accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:51.948 12:21:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # 
IFS=: 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:18:53.326 12:21:16 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:18:53.326 00:18:53.326 real 0m1.649s 00:18:53.326 user 0m1.389s 00:18:53.326 sys 0m0.203s 00:18:53.326 12:21:16 accel.accel_fill -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:53.326 12:21:16 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:18:53.326 ************************************ 00:18:53.326 END TEST accel_fill 00:18:53.326 ************************************ 00:18:53.326 12:21:16 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:18:53.326 12:21:16 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:18:53.326 12:21:16 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:53.326 12:21:16 accel -- common/autotest_common.sh@10 -- # set +x 00:18:53.326 ************************************ 00:18:53.326 START TEST accel_copy_crc32c 00:18:53.326 ************************************ 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:18:53.326 12:21:16 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:18:53.326 [2024-06-07 12:21:16.715027] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
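The last test visible in this section, accel_copy_crc32c, covers the fused operation that copies a buffer and computes its CRC-32C in the same pass; with -y, the result is verified afterwards. The invocation, as recorded in its run_test line:

    # Fused copy + CRC-32C workload, one second, verified:
    /home/vagrant/spdk_repo/spdk/build/examples/accel_perf \
        -t 1 -w copy_crc32c -y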
00:18:53.326 [2024-06-07 12:21:16.715855] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194033 ] 00:18:53.326 [2024-06-07 12:21:16.859816] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:53.326 [2024-06-07 12:21:16.958082] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:53.584 12:21:17 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@20 
-- # val= 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:18:54.963 00:18:54.963 real 0m1.649s 00:18:54.963 user 0m1.385s 00:18:54.963 sys 0m0.201s 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:54.963 12:21:18 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:18:54.963 ************************************ 00:18:54.963 END TEST accel_copy_crc32c 00:18:54.963 ************************************ 00:18:54.963 12:21:18 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:18:54.963 12:21:18 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:18:54.963 12:21:18 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:54.963 12:21:18 accel -- common/autotest_common.sh@10 -- # set +x 00:18:54.963 ************************************ 00:18:54.963 START TEST accel_copy_crc32c_C2 00:18:54.963 ************************************ 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
copy_crc32c -y -C 2 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:18:54.964 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:18:54.964 [2024-06-07 12:21:18.435854] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:18:54.964 [2024-06-07 12:21:18.436141] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194079 ] 00:18:54.964 [2024-06-07 12:21:18.585079] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:55.222 [2024-06-07 12:21:18.720694] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 
00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:55.222 12:21:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:56.598 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:18:56.599 00:18:56.599 real 0m1.707s 00:18:56.599 user 0m1.405s 00:18:56.599 sys 0m0.236s 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:56.599 ************************************ 00:18:56.599 END TEST accel_copy_crc32c_C2 00:18:56.599 12:21:20 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:18:56.599 ************************************ 00:18:56.599 12:21:20 accel -- accel/accel.sh@107 -- # run_test accel_dualcast 
accel_test -t 1 -w dualcast -y 00:18:56.599 12:21:20 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:18:56.599 12:21:20 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:56.599 12:21:20 accel -- common/autotest_common.sh@10 -- # set +x 00:18:56.599 ************************************ 00:18:56.599 START TEST accel_dualcast 00:18:56.599 ************************************ 00:18:56.599 12:21:20 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dualcast -y 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:18:56.599 12:21:20 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:18:56.599 [2024-06-07 12:21:20.203636] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
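Every run_test case in this stretch reduces to the same accel_perf command with only the workload flags changing (-w dualcast here, plus -x 3 for the three-source xor case and -C 2 for the second copy_crc32c buffer). A stand-alone equivalent of the traced dualcast invocation; the path and flags are verbatim from the log, while the comments are interpretation of the trace rather than accel_perf's documented help text:

    # Sketch of the dualcast run traced above.
    /home/vagrant/spdk_repo/spdk/build/examples/accel_perf \
        -c /dev/fd/62 \    # JSON accel config handed over on fd 62
        -t 1 \             # matches the "val='1 seconds'" run time in the trace
        -w dualcast \      # workload under test
        -y                 # present in every traced case (result verification)

Outside run_test the caller has to open fd 62 itself; inside the harness that descriptor carries the JSON assembled by build_accel_config.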
00:18:56.599 [2024-06-07 12:21:20.204087] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194129 ] 00:18:56.858 [2024-06-07 12:21:20.353297] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:56.858 [2024-06-07 12:21:20.464348] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- 
# read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:57.117 12:21:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 
12:21:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:18:58.539 12:21:21 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:18:58.539 00:18:58.539 real 0m1.674s 00:18:58.539 user 0m1.391s 00:18:58.539 sys 0m0.211s 00:18:58.539 12:21:21 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:58.539 12:21:21 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:18:58.539 ************************************ 00:18:58.539 END TEST accel_dualcast 00:18:58.539 ************************************ 00:18:58.539 12:21:21 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:18:58.539 12:21:21 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:18:58.539 12:21:21 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:58.539 12:21:21 accel -- common/autotest_common.sh@10 -- # set +x 00:18:58.539 ************************************ 00:18:58.539 START TEST accel_compare 00:18:58.539 ************************************ 00:18:58.539 12:21:21 accel.accel_compare -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compare -y 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:18:58.539 12:21:21 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:18:58.539 [2024-06-07 12:21:21.940800] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:18:58.539 [2024-06-07 12:21:21.941111] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194174 ] 00:18:58.539 [2024-06-07 12:21:22.091069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:58.798 [2024-06-07 12:21:22.188237] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 
accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:18:58.798 12:21:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:19:00.174 12:21:23 
accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:19:00.174 12:21:23 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:00.174 00:19:00.174 real 0m1.659s 00:19:00.174 user 0m1.380s 00:19:00.174 sys 0m0.218s 00:19:00.174 12:21:23 accel.accel_compare -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:00.174 12:21:23 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:19:00.174 ************************************ 00:19:00.174 END TEST accel_compare 00:19:00.174 ************************************ 00:19:00.174 12:21:23 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:19:00.174 12:21:23 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:19:00.174 12:21:23 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:00.174 12:21:23 accel -- common/autotest_common.sh@10 -- # set +x 00:19:00.174 ************************************ 00:19:00.174 START TEST accel_xor 00:19:00.174 ************************************ 00:19:00.174 12:21:23 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:19:00.174 12:21:23 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:19:00.174 [2024-06-07 12:21:23.660018] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:19:00.174 [2024-06-07 12:21:23.660422] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194225 ] 00:19:00.432 [2024-06-07 12:21:23.820283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:00.432 [2024-06-07 12:21:23.914262] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:19:00.432 12:21:24 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:00.432 12:21:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:01.806 00:19:01.806 real 0m1.664s 00:19:01.806 user 0m1.390s 00:19:01.806 sys 0m0.215s 00:19:01.806 12:21:25 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:01.806 12:21:25 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:19:01.806 ************************************ 00:19:01.806 END TEST accel_xor 00:19:01.806 ************************************ 00:19:01.806 12:21:25 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:19:01.806 12:21:25 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:19:01.806 12:21:25 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:01.806 12:21:25 accel -- common/autotest_common.sh@10 -- # set +x 00:19:01.806 ************************************ 00:19:01.806 START TEST accel_xor 00:19:01.806 ************************************ 00:19:01.806 12:21:25 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y -x 3 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:19:01.806 12:21:25 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:19:01.806 [2024-06-07 12:21:25.388915] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:19:01.806 [2024-06-07 12:21:25.389174] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194264 ] 00:19:02.064 [2024-06-07 12:21:25.539140] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:02.064 [2024-06-07 12:21:25.632081] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:19:02.322 12:21:25 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.322 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:02.323 12:21:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:19:03.696 12:21:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:03.696 00:19:03.696 real 0m1.662s 00:19:03.696 user 0m1.365s 00:19:03.696 sys 0m0.217s 00:19:03.696 12:21:27 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:03.696 12:21:27 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:19:03.696 ************************************ 00:19:03.696 END TEST accel_xor 00:19:03.696 ************************************ 00:19:03.696 12:21:27 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:19:03.696 12:21:27 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:19:03.696 12:21:27 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:03.696 12:21:27 accel -- common/autotest_common.sh@10 -- # set +x 00:19:03.696 ************************************ 00:19:03.696 START TEST accel_dif_verify 00:19:03.696 ************************************ 00:19:03.696 12:21:27 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_verify 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:19:03.696 12:21:27 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:19:03.696 [2024-06-07 12:21:27.119186] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
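The suite now moves from xor to DIF handling. The val lines that follow sketch the dif_verify workload's shape: 4096-byte buffers, a 512-byte block size and 8 bytes of per-block protection information (reading the '4096 bytes' / '512 bytes' / '8 bytes' values that way is an interpretation; the log itself only records the raw values). A direct invocation, under the same assumptions as the xor sketch above:

  # Verify DIF protection information using the software accel module
  ./build/examples/accel_perf -t 1 -w dif_verify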
00:19:03.696 [2024-06-07 12:21:27.119508] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194313 ] 00:19:03.696 [2024-06-07 12:21:27.268220] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:03.962 [2024-06-07 12:21:27.371375] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 
-- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:03.962 12:21:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@21 -- 
# case "$var" in 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:19:05.338 12:21:28 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:05.338 00:19:05.338 real 0m1.671s 00:19:05.338 user 0m1.381s 00:19:05.338 sys 0m0.225s 00:19:05.338 12:21:28 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:05.338 12:21:28 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:19:05.338 ************************************ 00:19:05.338 END TEST accel_dif_verify 00:19:05.338 ************************************ 00:19:05.338 12:21:28 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:19:05.338 12:21:28 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:19:05.338 12:21:28 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:05.338 12:21:28 accel -- common/autotest_common.sh@10 -- # set +x 00:19:05.338 ************************************ 00:19:05.338 START TEST accel_dif_generate 00:19:05.338 ************************************ 00:19:05.338 12:21:28 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w dif_generate 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:19:05.338 12:21:28 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:19:05.338 [2024-06-07 12:21:28.857629] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:05.338 [2024-06-07 12:21:28.858202] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194359 ] 00:19:05.599 [2024-06-07 12:21:29.008546] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:05.599 [2024-06-07 12:21:29.115862] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 
12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:05.599 12:21:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:19:06.978 12:21:30 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:06.978 00:19:06.978 real 0m1.672s 00:19:06.978 user 0m1.410s 00:19:06.978 sys 0m0.199s 00:19:06.978 12:21:30 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:06.978 
12:21:30 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:19:06.978 ************************************ 00:19:06.978 END TEST accel_dif_generate 00:19:06.978 ************************************ 00:19:06.978 12:21:30 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:19:06.978 12:21:30 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:19:06.978 12:21:30 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:06.978 12:21:30 accel -- common/autotest_common.sh@10 -- # set +x 00:19:06.978 ************************************ 00:19:06.978 START TEST accel_dif_generate_copy 00:19:06.978 ************************************ 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate_copy 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:19:06.978 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:19:06.978 [2024-06-07 12:21:30.592268] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
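accel_dif_generate has just passed (real 0m1.672s) and the copy variant starts. Going by the workload names the harness passes to -w: dif_generate computes protection information over the buffer in place, while dif_generate_copy generates it while also copying the data into a separate destination buffer. A direct invocation sketch, same assumptions as above:

  # Generate DIF protection information during a copy operation
  ./build/examples/accel_perf -t 1 -w dif_generate_copy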
00:19:06.979 [2024-06-07 12:21:30.592728] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194403 ] 00:19:07.237 [2024-06-07 12:21:30.743963] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:07.237 [2024-06-07 12:21:30.842663] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:07.496 12:21:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:08.873 12:21:32 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:08.873 00:19:08.873 real 0m1.671s 00:19:08.873 user 0m1.395s 00:19:08.873 sys 0m0.210s 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:08.873 12:21:32 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:19:08.873 ************************************ 00:19:08.873 END TEST accel_dif_generate_copy 00:19:08.873 ************************************ 00:19:08.873 12:21:32 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:19:08.873 12:21:32 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:08.873 12:21:32 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:19:08.873 12:21:32 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:08.873 12:21:32 accel -- common/autotest_common.sh@10 -- # set +x 00:19:08.873 ************************************ 00:19:08.873 START TEST accel_comp 00:19:08.873 ************************************ 00:19:08.873 12:21:32 accel.accel_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:19:08.873 12:21:32 accel.accel_comp -- 
accel/accel.sh@19 -- # IFS=: 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:19:08.873 12:21:32 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:19:08.873 [2024-06-07 12:21:32.336950] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:08.873 [2024-06-07 12:21:32.337531] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194454 ] 00:19:08.873 [2024-06-07 12:21:32.487963] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:09.133 [2024-06-07 12:21:32.586192] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp 
-- accel/accel.sh@20 -- # val=compress 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val=/home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 
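accel_comp is the first case that feeds file data through the engine: the val=/home/vagrant/spdk_repo/spdk/test/accel/bib entry a few lines below shows the -l argument (the input corpus for compress/decompress workloads) being parsed into the config. A direct invocation sketch, same assumptions as above:

  # Compress the bundled test corpus for one second and report throughput
  ./build/examples/accel_perf -t 1 -w compress -l /home/vagrant/spdk_repo/spdk/test/accel/bib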
00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:09.133 12:21:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:19:10.507 12:21:33 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:10.507 00:19:10.507 real 0m1.677s 00:19:10.507 user 0m1.384s 00:19:10.507 sys 0m0.228s 00:19:10.507 12:21:33 accel.accel_comp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:10.507 12:21:33 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:19:10.507 ************************************ 00:19:10.507 END TEST accel_comp 00:19:10.507 ************************************ 00:19:10.507 12:21:34 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:19:10.507 12:21:34 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:19:10.507 12:21:34 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:10.507 12:21:34 accel -- common/autotest_common.sh@10 -- # set +x 00:19:10.507 ************************************ 00:19:10.507 START TEST accel_decomp 00:19:10.507 ************************************ 00:19:10.507 12:21:34 accel.accel_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:19:10.507 12:21:34 accel.accel_decomp -- accel/accel.sh@16 -- # local 
accel_opc 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:19:10.508 12:21:34 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:19:10.508 [2024-06-07 12:21:34.070458] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:10.508 [2024-06-07 12:21:34.071381] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194500 ] 00:19:10.765 [2024-06-07 12:21:34.210429] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:10.765 [2024-06-07 12:21:34.317407] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.024 12:21:34 accel.accel_decomp -- 
accel/accel.sh@21 -- # case "$var" in 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.024 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=/home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r 
var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:11.025 12:21:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:12.400 12:21:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:12.400 12:21:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:12.400 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:12.400 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:19:12.401 12:21:35 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:12.401 00:19:12.401 real 0m1.658s 00:19:12.401 user 0m1.393s 00:19:12.401 sys 0m0.201s 00:19:12.401 12:21:35 accel.accel_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:12.401 12:21:35 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:19:12.401 ************************************ 00:19:12.401 END TEST accel_decomp 00:19:12.401 ************************************ 00:19:12.401 12:21:35 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 00:19:12.401 12:21:35 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:19:12.401 12:21:35 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:12.401 12:21:35 accel -- 
common/autotest_common.sh@10 -- # set +x 00:19:12.401 ************************************ 00:19:12.401 START TEST accel_decomp_full 00:19:12.401 ************************************ 00:19:12.401 12:21:35 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:19:12.401 12:21:35 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:19:12.401 [2024-06-07 12:21:35.794584] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
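
The repeated "IFS=: / read -r var val / case \"$var\"" records that fill this trace come from accel.sh's result-parsing loop (lines 19-23 per the @NN markers), which scans accel_perf's "key: value" output and captures the operation and module for the @27 checks at each test's end. A minimal sketch of that loop, reconstructed from the trace records alone — the case patterns and the while/done scaffolding are assumptions:

    # Confirmed by the trace: IFS=:, read -r var val, case "$var",
    # accel_opc=decompress (@23) and accel_module=software (@22).
    while IFS=: read -r var val; do
        case "$var" in
            *operation*) accel_opc=$val ;;     # hypothetical pattern
            *module*)    accel_module=$val ;;  # hypothetical pattern
        esac
    done < <(/home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 "$@")
    [[ -n $accel_module && -n $accel_opc ]]    # expands to the [[ -n software ]] / [[ -n decompress ]] records
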
00:19:12.401 [2024-06-07 12:21:35.795040] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194544 ] 00:19:12.401 [2024-06-07 12:21:35.935145] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:12.401 [2024-06-07 12:21:36.029665] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:12.660 12:21:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:14.037 12:21:37 accel.accel_decomp_full -- 
accel/accel.sh@20 -- # val= 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:14.037 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:19:14.038 12:21:37 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:14.038 00:19:14.038 real 0m1.664s 00:19:14.038 user 0m1.384s 00:19:14.038 sys 0m0.206s 00:19:14.038 12:21:37 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:14.038 12:21:37 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:19:14.038 ************************************ 00:19:14.038 END TEST accel_decomp_full 00:19:14.038 ************************************ 00:19:14.038 12:21:37 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -m 0xf 00:19:14.038 12:21:37 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:19:14.038 12:21:37 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:14.038 12:21:37 accel -- common/autotest_common.sh@10 -- # set +x 00:19:14.038 ************************************ 00:19:14.038 START TEST accel_decomp_mcore 00:19:14.038 ************************************ 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -m 0xf 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -m 0xf 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 
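
Each of these sections is driven by the run_test helper from autotest_common.sh: it prints the starred START/END banners and times the wrapped command, producing the real/user/sys triples such as the 0m1.664s just recorded for accel_decomp_full. A rough sketch of its shape — the banner text and timing output are in the log, the function body itself is an assumption:

    run_test() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"    # -> the "real/user/sys" lines in this trace
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }

Here it wraps calls such as "run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -m 0xf", exactly as recorded at the top of this section.
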
00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -m 0xf 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:19:14.038 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:19:14.038 [2024-06-07 12:21:37.529948] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:14.038 [2024-06-07 12:21:37.530513] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194590 ] 00:19:14.297 [2024-06-07 12:21:37.699973] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:19:14.297 [2024-06-07 12:21:37.808696] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:19:14.297 [2024-06-07 12:21:37.808805] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:19:14.297 [2024-06-07 12:21:37.808958] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:14.297 [2024-06-07 12:21:37.808958] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:14.297 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore 
-- accel/accel.sh@19 -- # read -r var val 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:14.298 12:21:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.671 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
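
The -m 0xf passed to accel_perf in this section is a hexadecimal CPU mask: 0xf is binary 1111, selecting cores 0-3, which is why four "Reactor started on core N" notices appear above instead of the single core 0 of the earlier runs. A throwaway one-liner for expanding such a mask (illustration only, not part of the test suite):

    mask=0xf
    for i in {0..31}; do (( (mask >> i) & 1 )) && printf '%d ' "$i"; done; echo
    # prints: 0 1 2 3
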
00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:15.672 00:19:15.672 real 0m1.643s 00:19:15.672 user 0m4.851s 00:19:15.672 sys 0m0.260s 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:15.672 12:21:39 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:19:15.672 ************************************ 00:19:15.672 END TEST accel_decomp_mcore 00:19:15.672 ************************************ 00:19:15.672 12:21:39 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -m 0xf 00:19:15.672 12:21:39 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:19:15.672 12:21:39 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:15.672 12:21:39 accel -- common/autotest_common.sh@10 -- # set +x 00:19:15.672 ************************************ 00:19:15.672 START TEST accel_decomp_full_mcore 00:19:15.672 ************************************ 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -m 0xf 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -m 0xf 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -m 0xf 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:19:15.672 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 
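
accel_decomp_full_mcore, starting here, combines the two variations exercised so far: -o 0 and -m 0xf. Per the traces, the -o flag tracks the recorded transfer size — without it the config shows val='4096 bytes', and with -o 0 it shows val='111250 bytes', the whole bib test file per operation. The two invocations as they appear in this log:

    # 4 KiB transfers (accel_decomp_mcore, above):
    /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress \
        -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -m 0xf
    # whole-file transfers (accel_decomp_full_mcore, below):
    /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress \
        -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -m 0xf
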
00:19:15.672 [2024-06-07 12:21:39.242165] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:15.672 [2024-06-07 12:21:39.243115] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194644 ] 00:19:15.931 [2024-06-07 12:21:39.405733] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:19:15.931 [2024-06-07 12:21:39.483491] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:19:15.931 [2024-06-07 12:21:39.483682] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:19:15.931 [2024-06-07 12:21:39.483826] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:19:15.931 [2024-06-07 12:21:39.483844] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 
accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:15.931 12:21:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:19:17.309 12:21:40 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:17.309 00:19:17.309 real 0m1.553s 00:19:17.309 user 0m4.748s 00:19:17.309 sys 0m0.184s 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:17.309 12:21:40 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:19:17.309 ************************************ 00:19:17.309 END TEST accel_decomp_full_mcore 00:19:17.309 ************************************ 00:19:17.309 12:21:40 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -T 2 00:19:17.309 12:21:40 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:19:17.309 12:21:40 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:17.309 12:21:40 accel -- common/autotest_common.sh@10 -- # set +x 00:19:17.309 ************************************ 00:19:17.309 START TEST accel_decomp_mthread 00:19:17.309 ************************************ 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -T 2 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -T 2 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -T 2 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:19:17.309 12:21:40 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:19:17.309 [2024-06-07 12:21:40.862183] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
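
accel_decomp_mthread, starting above, trades the core mask for -T 2, which the config trace below records as val=2: the run stays on one core (a single "Reactor started on core 0" notice, total cores available: 1), and the 2 plausibly reads as a worker-thread count, though this log does not spell that out. The invocation as recorded:

    /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress \
        -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -T 2
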
00:19:17.309 [2024-06-07 12:21:40.863175] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194686 ] 00:19:17.569 [2024-06-07 12:21:41.011040] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:17.569 [2024-06-07 12:21:41.068365] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:17.569 12:21:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case 
"$var" in 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:18.999 00:19:18.999 real 0m1.471s 00:19:18.999 user 0m1.246s 00:19:18.999 sys 0m0.149s 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:18.999 12:21:42 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:19:18.999 ************************************ 00:19:18.999 END TEST accel_decomp_mthread 00:19:18.999 ************************************ 00:19:18.999 12:21:42 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -T 2 00:19:18.999 12:21:42 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:19:18.999 12:21:42 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:18.999 12:21:42 accel -- common/autotest_common.sh@10 -- # set +x 00:19:18.999 ************************************ 00:19:18.999 START TEST accel_decomp_full_mthread 00:19:18.999 ************************************ 00:19:18.999 12:21:42 
accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -T 2 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -T 2 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /home/vagrant/spdk_repo/spdk/test/accel/bib -y -o 0 -T 2 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:19:18.999 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:19:18.999 [2024-06-07 12:21:42.399676] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
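
Every variant opens with the same build_accel_config preamble (the accel.sh@31-@41 records above): an accel_json_cfg array is initialized, three numeric probes (@32-@34) and one string probe (@36) decide whether anything is appended, and the result is joined with IFS=, and piped through jq -r . on its way to accel_perf's -c /dev/fd/62. A minimal sketch under those trace records — the variable names and JSON payload here are hypothetical placeholders, not taken from this log:

    build_accel_config() {
        accel_json_cfg=()                                                        # @31
        # the "[[ 0 -gt 0 ]]" probes gate optional modules; ACCEL_EXTRA_CFG and
        # ACCEL_MODULE_JSON are hypothetical stand-ins for the real variables:
        [[ ${ACCEL_EXTRA_CFG_COUNT:-0} -gt 0 ]] && accel_json_cfg+=("$ACCEL_EXTRA_CFG")
        [[ -n ${ACCEL_MODULE_JSON:-} ]] && accel_json_cfg+=("$ACCEL_MODULE_JSON")  # @36
        local IFS=,                                                              # @40
        printf '[%s]' "${accel_json_cfg[*]}" | jq -r .                           # @41
    }
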
00:19:18.999 [2024-06-07 12:21:42.400242] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194737 ] 00:19:18.999 [2024-06-07 12:21:42.548127] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:18.999 [2024-06-07 12:21:42.601380] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/home/vagrant/spdk_repo/spdk/test/accel/bib 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:19.259 12:21:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:19:20.637 00:19:20.637 real 0m1.487s 00:19:20.637 user 0m1.263s 00:19:20.637 sys 0m0.158s 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:20.637 12:21:43 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:19:20.637 ************************************ 00:19:20.637 END TEST accel_decomp_full_mthread 00:19:20.637 ************************************ 00:19:20.637 12:21:43 accel -- 
accel/accel.sh@124 -- # [[ n == y ]] 00:19:20.637 12:21:43 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /home/vagrant/spdk_repo/spdk/test/accel/dif/dif -c /dev/fd/62 00:19:20.637 12:21:43 accel -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:19:20.637 12:21:43 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:20.637 12:21:43 accel -- common/autotest_common.sh@10 -- # set +x 00:19:20.637 12:21:43 accel -- accel/accel.sh@137 -- # build_accel_config 00:19:20.637 12:21:43 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:19:20.637 12:21:43 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:19:20.637 12:21:43 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:19:20.637 12:21:43 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:19:20.637 12:21:43 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:19:20.637 12:21:43 accel -- accel/accel.sh@40 -- # local IFS=, 00:19:20.637 12:21:43 accel -- accel/accel.sh@41 -- # jq -r . 00:19:20.637 ************************************ 00:19:20.637 START TEST accel_dif_functional_tests 00:19:20.637 ************************************ 00:19:20.637 12:21:43 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/accel/dif/dif -c /dev/fd/62 00:19:20.637 [2024-06-07 12:21:43.961030] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:20.637 [2024-06-07 12:21:43.961535] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194776 ] 00:19:20.637 [2024-06-07 12:21:44.128693] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:19:20.637 [2024-06-07 12:21:44.174985] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:19:20.637 [2024-06-07 12:21:44.175157] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:19:20.637 [2024-06-07 12:21:44.175158] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:20.637 00:19:20.637 00:19:20.637 CUnit - A unit testing framework for C - Version 2.1-3 00:19:20.637 http://cunit.sourceforge.net/ 00:19:20.637 00:19:20.637 00:19:20.637 Suite: accel_dif 00:19:20.637 Test: verify: DIF generated, GUARD check ...passed 00:19:20.637 Test: verify: DIF generated, APPTAG check ...passed 00:19:20.637 Test: verify: DIF generated, REFTAG check ...passed 00:19:20.638 Test: verify: DIF not generated, GUARD check ...[2024-06-07 12:21:44.244188] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:19:20.638 passed 00:19:20.638 Test: verify: DIF not generated, APPTAG check ...[2024-06-07 12:21:44.244590] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:19:20.638 passed 00:19:20.638 Test: verify: DIF not generated, REFTAG check ...[2024-06-07 12:21:44.245247] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:19:20.638 passed 00:19:20.638 Test: verify: APPTAG correct, APPTAG check ...passed 00:19:20.638 Test: verify: APPTAG incorrect, APPTAG check ...[2024-06-07 12:21:44.245898] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:19:20.638 passed 00:19:20.638 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:19:20.638 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:19:20.638 Test: verify: 
REFTAG_INIT correct, REFTAG check ...passed 00:19:20.638 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-06-07 12:21:44.246800] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:19:20.638 passed 00:19:20.638 Test: verify copy: DIF generated, GUARD check ...passed 00:19:20.638 Test: verify copy: DIF generated, APPTAG check ...passed 00:19:20.638 Test: verify copy: DIF generated, REFTAG check ...passed 00:19:20.638 Test: verify copy: DIF not generated, GUARD check ...[2024-06-07 12:21:44.247806] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:19:20.638 passed 00:19:20.638 Test: verify copy: DIF not generated, APPTAG check ...[2024-06-07 12:21:44.248305] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:19:20.638 passed 00:19:20.638 Test: verify copy: DIF not generated, REFTAG check ...[2024-06-07 12:21:44.248783] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:19:20.638 passed 00:19:20.638 Test: generate copy: DIF generated, GUARD check ...passed 00:19:20.638 Test: generate copy: DIF generated, APPTAG check ...passed 00:19:20.638 Test: generate copy: DIF generated, REFTAG check ...passed 00:19:20.638 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:19:20.638 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:19:20.638 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:19:20.638 Test: generate copy: iovecs-len validate ...[2024-06-07 12:21:44.250380] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:19:20.638 passed 00:19:20.638 Test: generate copy: buffer alignment validate ...passed 00:19:20.638 00:19:20.638 Run Summary: Type Total Ran Passed Failed Inactive 00:19:20.638 suites 1 1 n/a 0 0 00:19:20.638 tests 26 26 26 0 0 00:19:20.638 asserts 115 115 115 0 n/a 00:19:20.638 00:19:20.638 Elapsed time = 0.013 seconds 00:19:20.897 00:19:20.897 real 0m0.553s 00:19:20.897 user 0m0.659s 00:19:20.897 sys 0m0.180s 00:19:20.897 12:21:44 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:20.897 12:21:44 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:19:20.897 ************************************ 00:19:20.897 END TEST accel_dif_functional_tests 00:19:20.897 ************************************ 00:19:20.897 00:19:20.897 real 0m38.338s 00:19:20.897 user 0m38.250s 00:19:20.897 sys 0m6.469s 00:19:20.897 12:21:44 accel -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:20.897 12:21:44 accel -- common/autotest_common.sh@10 -- # set +x 00:19:20.897 ************************************ 00:19:20.897 END TEST accel 00:19:20.897 ************************************ 00:19:21.156 12:21:44 -- spdk/autotest.sh@184 -- # run_test accel_rpc /home/vagrant/spdk_repo/spdk/test/accel/accel_rpc.sh 00:19:21.156 12:21:44 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:19:21.156 12:21:44 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:21.156 12:21:44 -- common/autotest_common.sh@10 -- # set +x 00:19:21.156 ************************************ 00:19:21.156 START TEST accel_rpc 00:19:21.156 ************************************ 00:19:21.156 12:21:44 accel_rpc -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/accel/accel_rpc.sh 00:19:21.156 * Looking for test 
storage... 00:19:21.156 * Found test storage at /home/vagrant/spdk_repo/spdk/test/accel 00:19:21.156 12:21:44 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:19:21.156 12:21:44 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=194856 00:19:21.156 12:21:44 accel_rpc -- accel/accel_rpc.sh@13 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --wait-for-rpc 00:19:21.156 12:21:44 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 194856 00:19:21.156 12:21:44 accel_rpc -- common/autotest_common.sh@830 -- # '[' -z 194856 ']' 00:19:21.156 12:21:44 accel_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:21.156 12:21:44 accel_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:21.156 12:21:44 accel_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:19:21.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:21.156 12:21:44 accel_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:21.156 12:21:44 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:19:21.156 [2024-06-07 12:21:44.728690] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:21.156 [2024-06-07 12:21:44.729099] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194856 ] 00:19:21.415 [2024-06-07 12:21:44.864443] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:21.415 [2024-06-07 12:21:44.911215] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:21.415 12:21:44 accel_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:21.415 12:21:44 accel_rpc -- common/autotest_common.sh@863 -- # return 0 00:19:21.415 12:21:44 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:19:21.415 12:21:44 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:19:21.415 12:21:44 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:19:21.415 12:21:44 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:19:21.415 12:21:44 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:19:21.415 12:21:44 accel_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:19:21.415 12:21:44 accel_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:21.415 12:21:44 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:19:21.415 ************************************ 00:19:21.415 START TEST accel_assign_opcode 00:19:21.415 ************************************ 00:19:21.415 12:21:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # accel_assign_opcode_test_suite 00:19:21.415 12:21:44 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:19:21.415 12:21:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:21.415 12:21:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:19:21.415 [2024-06-07 12:21:45.004118] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- 
accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:19:21.415 [2024-06-07 12:21:45.016126] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:21.415 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:21.673 software 00:19:21.673 00:19:21.673 real 0m0.276s 00:19:21.673 user 0m0.078s 00:19:21.673 sys 0m0.012s 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:21.673 12:21:45 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:19:21.673 ************************************ 00:19:21.673 END TEST accel_assign_opcode 00:19:21.673 ************************************ 00:19:21.931 12:21:45 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 194856 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@949 -- # '[' -z 194856 ']' 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@953 -- # kill -0 194856 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@954 -- # uname 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 194856 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 194856' 00:19:21.931 killing process with pid 194856 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@968 -- # kill 194856 00:19:21.931 12:21:45 accel_rpc -- common/autotest_common.sh@973 -- # wait 194856 00:19:22.189 00:19:22.189 real 0m1.143s 00:19:22.189 user 0m1.062s 00:19:22.189 sys 0m0.437s 00:19:22.189 12:21:45 accel_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:22.189 12:21:45 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:19:22.189 ************************************ 00:19:22.189 END TEST accel_rpc 00:19:22.189 ************************************ 00:19:22.189 12:21:45 -- spdk/autotest.sh@185 -- # run_test app_cmdline /home/vagrant/spdk_repo/spdk/test/app/cmdline.sh 00:19:22.189 
12:21:45 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:19:22.189 12:21:45 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:22.189 12:21:45 -- common/autotest_common.sh@10 -- # set +x 00:19:22.189 ************************************ 00:19:22.189 START TEST app_cmdline 00:19:22.189 ************************************ 00:19:22.189 12:21:45 app_cmdline -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/app/cmdline.sh 00:19:22.448 * Looking for test storage... 00:19:22.448 * Found test storage at /home/vagrant/spdk_repo/spdk/test/app 00:19:22.448 12:21:45 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:19:22.448 12:21:45 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=194948 00:19:22.448 12:21:45 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 194948 00:19:22.448 12:21:45 app_cmdline -- common/autotest_common.sh@830 -- # '[' -z 194948 ']' 00:19:22.448 12:21:45 app_cmdline -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:22.448 12:21:45 app_cmdline -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:22.448 12:21:45 app_cmdline -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:19:22.448 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:22.448 12:21:45 app_cmdline -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:22.448 12:21:45 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:19:22.448 12:21:45 app_cmdline -- app/cmdline.sh@16 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:19:22.448 [2024-06-07 12:21:45.958509] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
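The app_cmdline run here starts spdk_tgt with an RPC allow-list and then talks to it over the default /var/tmp/spdk.sock socket. A condensed sketch of the same interaction, using only binaries that appear in this log; the sleep is a crude stand-in for the waitforlisten helper:

  #!/bin/bash
  SPDK=/home/vagrant/spdk_repo/spdk
  # Serve only these two RPCs; everything else is rejected.
  "$SPDK/build/bin/spdk_tgt" --rpcs-allowed spdk_get_version,rpc_get_methods &
  TGT_PID=$!
  sleep 2  # stand-in for waitforlisten /var/tmp/spdk.sock

  "$SPDK/scripts/rpc.py" spdk_get_version  # allowed: prints the version JSON seen further down
  "$SPDK/scripts/rpc.py" rpc_get_methods   # allowed: lists exactly these two methods

  kill "$TGT_PID"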
00:19:22.448 [2024-06-07 12:21:45.959179] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194948 ] 00:19:22.778 [2024-06-07 12:21:46.105631] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:22.778 [2024-06-07 12:21:46.154670] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:23.345 12:21:46 app_cmdline -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:23.345 12:21:46 app_cmdline -- common/autotest_common.sh@863 -- # return 0 00:19:23.345 12:21:46 app_cmdline -- app/cmdline.sh@20 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py spdk_get_version 00:19:23.604 { 00:19:23.604 "version": "SPDK v24.09-pre git sha1 e55c9a812", 00:19:23.604 "fields": { 00:19:23.604 "major": 24, 00:19:23.604 "minor": 9, 00:19:23.604 "patch": 0, 00:19:23.604 "suffix": "-pre", 00:19:23.604 "commit": "e55c9a812" 00:19:23.604 } 00:19:23.604 } 00:19:23.604 12:21:47 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:19:23.604 12:21:47 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:19:23.604 12:21:47 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:19:23.604 12:21:47 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:19:23.604 12:21:47 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:19:23.604 12:21:47 app_cmdline -- app/cmdline.sh@26 -- # sort 00:19:23.604 12:21:47 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:19:23.604 12:21:47 app_cmdline -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:23.604 12:21:47 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:23.605 12:21:47 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:19:23.605 12:21:47 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:19:23.605 12:21:47 app_cmdline -- app/cmdline.sh@30 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@649 -- # local es=0 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:23.605 12:21:47 app_cmdline -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@652 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:19:23.864 request: 00:19:23.864 { 00:19:23.864 "method": "env_dpdk_get_mem_stats", 00:19:23.864 "req_id": 1 00:19:23.864 } 00:19:23.864 Got JSON-RPC error response 00:19:23.864 response: 00:19:23.864 { 00:19:23.864 "code": -32601, 00:19:23.864 "message": "Method not found" 00:19:23.864 } 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@652 -- # es=1 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:19:23.864 12:21:47 app_cmdline -- app/cmdline.sh@1 -- # killprocess 194948 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@949 -- # '[' -z 194948 ']' 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@953 -- # kill -0 194948 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@954 -- # uname 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 194948 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@967 -- # echo 'killing process with pid 194948' 00:19:23.864 killing process with pid 194948 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@968 -- # kill 194948 00:19:23.864 12:21:47 app_cmdline -- common/autotest_common.sh@973 -- # wait 194948 00:19:24.430 00:19:24.430 real 0m2.023s 00:19:24.430 user 0m2.438s 00:19:24.430 sys 0m0.502s 00:19:24.430 12:21:47 app_cmdline -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:24.430 12:21:47 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:19:24.430 ************************************ 00:19:24.430 END TEST app_cmdline 00:19:24.430 ************************************ 00:19:24.430 12:21:47 -- spdk/autotest.sh@186 -- # run_test version /home/vagrant/spdk_repo/spdk/test/app/version.sh 00:19:24.430 12:21:47 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:19:24.430 12:21:47 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:24.430 12:21:47 -- common/autotest_common.sh@10 -- # set +x 00:19:24.430 ************************************ 00:19:24.430 START TEST version 00:19:24.430 ************************************ 00:19:24.430 12:21:47 version -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/app/version.sh 00:19:24.430 * Looking for test storage... 
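The request above is the negative half of the allow-list check: env_dpdk_get_mem_stats is not in --rpcs-allowed, so the target answers with JSON-RPC error -32601 and rpc.py exits non-zero, which is what the NOT wrapper in the trace asserts. Continuing the sketch from the previous block against the same running target:

  # Any method outside the allow-list is rejected at the JSON-RPC layer:
  "$SPDK/scripts/rpc.py" env_dpdk_get_mem_stats
  # -> {"code": -32601, "message": "Method not found"}, with a non-zero exit status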
00:19:24.430 * Found test storage at /home/vagrant/spdk_repo/spdk/test/app 00:19:24.430 12:21:48 version -- app/version.sh@17 -- # get_header_version major 00:19:24.430 12:21:48 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # cut -f2 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # tr -d '"' 00:19:24.430 12:21:48 version -- app/version.sh@17 -- # major=24 00:19:24.430 12:21:48 version -- app/version.sh@18 -- # get_header_version minor 00:19:24.430 12:21:48 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # cut -f2 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # tr -d '"' 00:19:24.430 12:21:48 version -- app/version.sh@18 -- # minor=9 00:19:24.430 12:21:48 version -- app/version.sh@19 -- # get_header_version patch 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # cut -f2 00:19:24.430 12:21:48 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # tr -d '"' 00:19:24.430 12:21:48 version -- app/version.sh@19 -- # patch=0 00:19:24.430 12:21:48 version -- app/version.sh@20 -- # get_header_version suffix 00:19:24.430 12:21:48 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # cut -f2 00:19:24.430 12:21:48 version -- app/version.sh@14 -- # tr -d '"' 00:19:24.430 12:21:48 version -- app/version.sh@20 -- # suffix=-pre 00:19:24.430 12:21:48 version -- app/version.sh@22 -- # version=24.9 00:19:24.430 12:21:48 version -- app/version.sh@25 -- # (( patch != 0 )) 00:19:24.430 12:21:48 version -- app/version.sh@28 -- # version=24.9rc0 00:19:24.430 12:21:48 version -- app/version.sh@30 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:19:24.430 12:21:48 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:19:24.689 12:21:48 version -- app/version.sh@30 -- # py_version=24.9rc0 00:19:24.689 12:21:48 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:19:24.689 00:19:24.689 real 0m0.188s 00:19:24.689 user 0m0.097s 00:19:24.689 sys 0m0.134s 00:19:24.689 12:21:48 version -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:24.689 12:21:48 version -- common/autotest_common.sh@10 -- # set +x 00:19:24.689 ************************************ 00:19:24.689 END TEST version 00:19:24.689 ************************************ 00:19:24.689 12:21:48 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:19:24.689 12:21:48 -- spdk/autotest.sh@189 -- # run_test blockdev_general /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh 00:19:24.689 12:21:48 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:19:24.689 12:21:48 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:24.689 12:21:48 -- common/autotest_common.sh@10 -- # set +x 00:19:24.689 ************************************ 00:19:24.689 START TEST blockdev_general 00:19:24.689 ************************************ 00:19:24.689 12:21:48 blockdev_general -- 
common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh 00:19:24.689 * Looking for test storage... 00:19:24.689 * Found test storage at /home/vagrant/spdk_repo/spdk/test/bdev 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/nbd_common.sh 00:19:24.689 12:21:48 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=195114 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 195114 00:19:24.689 12:21:48 blockdev_general -- bdev/blockdev.sh@46 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:19:24.689 12:21:48 blockdev_general -- common/autotest_common.sh@830 -- # '[' -z 195114 ']' 00:19:24.689 12:21:48 blockdev_general -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:24.689 12:21:48 blockdev_general -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:24.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:24.689 12:21:48 blockdev_general -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:19:24.689 12:21:48 blockdev_general -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:24.689 12:21:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:24.689 [2024-06-07 12:21:48.303815] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:19:24.689 [2024-06-07 12:21:48.304301] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195114 ] 00:19:24.948 [2024-06-07 12:21:48.447861] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:24.948 [2024-06-07 12:21:48.496053] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:24.948 12:21:48 blockdev_general -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:24.948 12:21:48 blockdev_general -- common/autotest_common.sh@863 -- # return 0 00:19:24.948 12:21:48 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:19:24.948 12:21:48 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:19:24.948 12:21:48 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:19:24.948 12:21:48 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:24.948 12:21:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:25.206 [2024-06-07 12:21:48.762281] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:25.206 [2024-06-07 12:21:48.762584] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:25.206 00:19:25.206 [2024-06-07 12:21:48.770266] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:25.206 [2024-06-07 12:21:48.770421] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:25.206 00:19:25.206 Malloc0 00:19:25.206 Malloc1 00:19:25.206 Malloc2 00:19:25.206 Malloc3 00:19:25.206 Malloc4 00:19:25.465 Malloc5 00:19:25.465 Malloc6 00:19:25.465 Malloc7 00:19:25.465 Malloc8 00:19:25.465 Malloc9 00:19:25.465 [2024-06-07 12:21:48.910698] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:19:25.465 [2024-06-07 12:21:48.910898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:25.465 [2024-06-07 12:21:48.910976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000b480 00:19:25.466 [2024-06-07 12:21:48.911121] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:25.466 [2024-06-07 12:21:48.912890] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:25.466 [2024-06-07 12:21:48.913045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:19:25.466 TestPT 00:19:25.466 12:21:48 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:25.466 12:21:48 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/home/vagrant/spdk_repo/spdk/test/bdev/aiofile bs=2048 count=5000 00:19:25.466 5000+0 records in 00:19:25.466 5000+0 records out 00:19:25.466 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0278808 s, 367 MB/s 00:19:25.466 12:21:48 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /home/vagrant/spdk_repo/spdk/test/bdev/aiofile AIO0 2048 00:19:25.466 12:21:48 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:25.466 12:21:48 blockdev_general -- 
common/autotest_common.sh@10 -- # set +x 00:19:25.466 AIO0 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:19:25.466 12:21:49 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:19:25.466 12:21:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:25.725 12:21:49 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:19:25.725 12:21:49 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:19:25.725 12:21:49 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:19:25.727 12:21:49 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "8870d154-b511-49a7-8ce4-1a070c92cb78"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8870d154-b511-49a7-8ce4-1a070c92cb78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "fb45ed9c-ed63-568e-9de8-6155a71d1613"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": 
"fb45ed9c-ed63-568e-9de8-6155a71d1613",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "41c36e4b-0ded-591b-ad83-3d6df881af80"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "41c36e4b-0ded-591b-ad83-3d6df881af80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "92452a2d-7bd5-5a80-8e19-c28aaab2dec1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "92452a2d-7bd5-5a80-8e19-c28aaab2dec1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "79f914be-d78e-5cb9-ac07-fe59e8380c19"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "79f914be-d78e-5cb9-ac07-fe59e8380c19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "dc6cb53c-8bd1-5ef3-9778-df6fb0de0eaf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "dc6cb53c-8bd1-5ef3-9778-df6fb0de0eaf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' 
"split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "30fe63f1-5a9c-53a0-a03d-24059c68bd46"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "30fe63f1-5a9c-53a0-a03d-24059c68bd46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "cd0c2bc9-f7e0-538c-ac59-140213b1df3d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd0c2bc9-f7e0-538c-ac59-140213b1df3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "fbde6699-3387-57a1-8cf1-f2c76483ee8f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fbde6699-3387-57a1-8cf1-f2c76483ee8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0001e5af-702f-50ef-913f-44fc6800564d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0001e5af-702f-50ef-913f-44fc6800564d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f5f985fd-2918-531b-b7a7-a4c2c34d68d7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f5f985fd-2918-531b-b7a7-a4c2c34d68d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "9619a91c-96da-588b-ba65-a9a0b2d34237"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9619a91c-96da-588b-ba65-a9a0b2d34237",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a39b31d4-4690-404f-9e94-a3a65ba36f8b"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a39b31d4-4690-404f-9e94-a3a65ba36f8b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a39b31d4-4690-404f-9e94-a3a65ba36f8b",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bba47e96-dd4f-482f-8506-859bcdca1024",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "29274068-0448-4bb7-a2a8-4a7a8334fb1a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": 
false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "d48d0e70-52b6-440a-8bfe-30722891cd57",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "4f94c834-5baa-4986-a84a-bbf79a0bc81b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "38cf9e76-d891-45e4-b24c-9d166a6e78cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "38cf9e76-d891-45e4-b24c-9d166a6e78cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "38cf9e76-d891-45e4-b24c-9d166a6e78cc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "0d283b8b-42b0-4695-bf9d-0bca7dbe502d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "cb8c62a6-46a0-4499-a34a-16788603971d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "0f44ecf5-f302-4835-9d83-9da4e0a11df0"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "0f44ecf5-f302-4835-9d83-9da4e0a11df0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/home/vagrant/spdk_repo/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:19:25.727 12:21:49 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:19:25.727 12:21:49 blockdev_general -- 
bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:19:25.727 12:21:49 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:19:25.727 12:21:49 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 195114 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@949 -- # '[' -z 195114 ']' 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@953 -- # kill -0 195114 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@954 -- # uname 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 195114 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@967 -- # echo 'killing process with pid 195114' 00:19:25.727 killing process with pid 195114 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@968 -- # kill 195114 00:19:25.727 12:21:49 blockdev_general -- common/autotest_common.sh@973 -- # wait 195114 00:19:26.295 12:21:49 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:19:26.295 12:21:49 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b Malloc0 '' 00:19:26.295 12:21:49 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:19:26.295 12:21:49 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:26.295 12:21:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:26.295 ************************************ 00:19:26.295 START TEST bdev_hello_world 00:19:26.295 ************************************ 00:19:26.295 12:21:49 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b Malloc0 '' 00:19:26.295 [2024-06-07 12:21:49.848088] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
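The binary under test here is SPDK's bundled hello_bdev example, pointed at the bdev.json config that defines Malloc0. Stripped of the harness plumbing, the equivalent manual invocation is the sketch below (paths copied from the trace; hugepage setup is assumed to be in place):

    # Run the hello_bdev example against the malloc bdev from bdev.json,
    # mirroring the run_test invocation traced above.
    SPDK_REPO=/home/vagrant/spdk_repo/spdk
    sudo "$SPDK_REPO/build/examples/hello_bdev" \
        --json "$SPDK_REPO/test/bdev/bdev.json" \
        -b Malloc0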
00:19:26.295 [2024-06-07 12:21:49.848668] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195158 ] 00:19:26.552 [2024-06-07 12:21:49.996828] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:26.552 [2024-06-07 12:21:50.049252] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:26.552 [2024-06-07 12:21:50.171782] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:26.552 [2024-06-07 12:21:50.172143] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:26.552 [2024-06-07 12:21:50.179723] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:26.552 [2024-06-07 12:21:50.179869] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:26.552 [2024-06-07 12:21:50.187764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:19:26.552 [2024-06-07 12:21:50.187938] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:19:26.552 [2024-06-07 12:21:50.188037] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:19:26.810 [2024-06-07 12:21:50.266656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:19:26.810 [2024-06-07 12:21:50.266975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:26.811 [2024-06-07 12:21:50.267051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:19:26.811 [2024-06-07 12:21:50.267171] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:26.811 [2024-06-07 12:21:50.269047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:26.811 [2024-06-07 12:21:50.269236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:19:26.811 [2024-06-07 12:21:50.411716] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:19:26.811 [2024-06-07 12:21:50.411996] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:19:26.811 [2024-06-07 12:21:50.412100] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:19:26.811 [2024-06-07 12:21:50.412284] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:19:26.811 [2024-06-07 12:21:50.412432] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:19:26.811 [2024-06-07 12:21:50.412550] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:19:26.811 [2024-06-07 12:21:50.412661] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
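The long JSON dump earlier in this trace is the record set returned by the bdev_get_bdevs RPC, captured word by word by bash xtrace (hence the stray quoting). To inspect a single record without that noise, the same data can be queried directly while an SPDK app is running; a sketch, assuming the default /var/tmp/spdk.sock socket:

    # Fetch one bdev descriptor; the output carries the same supported_io_types,
    # assigned_rate_limits and driver_specific blocks seen in the dump above.
    /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.sock \
        bdev_get_bdevs -b Malloc0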
00:19:26.811 00:19:26.811 [2024-06-07 12:21:50.412780] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:19:27.379 00:19:27.379 real 0m0.956s 00:19:27.379 user 0m0.548s 00:19:27.379 sys 0m0.272s 00:19:27.379 12:21:50 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:27.379 12:21:50 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:19:27.379 ************************************ 00:19:27.379 END TEST bdev_hello_world 00:19:27.379 ************************************ 00:19:27.379 12:21:50 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:19:27.379 12:21:50 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:19:27.379 12:21:50 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:27.379 12:21:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:27.379 ************************************ 00:19:27.379 START TEST bdev_bounds 00:19:27.379 ************************************ 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=195191 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 195191' 00:19:27.379 Process bdevio pid: 195191 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 195191 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 195191 ']' 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:19:27.379 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:27.379 12:21:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:19:27.379 [2024-06-07 12:21:50.869373] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
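The bdev_bounds test launched above follows SPDK's two-step bdevio pattern: the bdevio app is started with -w so it initializes and then parks on the RPC socket, and the actual test battery is driven from outside via tests.py, as the perform_tests call further down shows. Reduced to its essentials (flags copied from the trace):

    # Step 1: start bdevio waiting on the RPC socket (-w) with the same config.
    SPDK_REPO=/home/vagrant/spdk_repo/spdk
    sudo "$SPDK_REPO/test/bdev/bdevio/bdevio" -w -s 0 \
        --json "$SPDK_REPO/test/bdev/bdev.json" &
    # Step 2: once /var/tmp/spdk.sock is listening, run every registered suite.
    sudo "$SPDK_REPO/test/bdev/bdevio/tests.py" perform_tests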
00:19:27.379 [2024-06-07 12:21:50.870187] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195191 ] 00:19:27.379 [2024-06-07 12:21:51.023195] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:19:27.638 [2024-06-07 12:21:51.072610] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:19:27.638 [2024-06-07 12:21:51.072810] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:27.638 [2024-06-07 12:21:51.072818] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:19:27.638 [2024-06-07 12:21:51.194987] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:27.638 [2024-06-07 12:21:51.196026] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:27.638 [2024-06-07 12:21:51.202941] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:27.638 [2024-06-07 12:21:51.203151] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:27.638 [2024-06-07 12:21:51.211001] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:19:27.638 [2024-06-07 12:21:51.211218] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:19:27.638 [2024-06-07 12:21:51.211401] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:19:27.895 [2024-06-07 12:21:51.292089] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:19:27.895 [2024-06-07 12:21:51.292606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:27.895 [2024-06-07 12:21:51.292835] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:19:27.895 [2024-06-07 12:21:51.293020] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:27.895 [2024-06-07 12:21:51.295486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:27.895 [2024-06-07 12:21:51.295709] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:19:28.461 12:21:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:28.461 12:21:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:19:28.461 12:21:51 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/tests.py perform_tests 00:19:28.461 I/O targets: 00:19:28.461 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:19:28.461 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:19:28.461 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:19:28.461 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:19:28.461 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:19:28.461 raid0: 131072 blocks of 512 bytes (64 MiB) 00:19:28.461 concat0: 131072 blocks of 512 bytes (64 MiB) 00:19:28.461 raid1: 65536 
blocks of 512 bytes (32 MiB) 00:19:28.461 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:19:28.461 00:19:28.461 00:19:28.461 CUnit - A unit testing framework for C - Version 2.1-3 00:19:28.461 http://cunit.sourceforge.net/ 00:19:28.461 00:19:28.461 00:19:28.461 Suite: bdevio tests on: AIO0 00:19:28.461 Test: blockdev write read block ...passed 00:19:28.461 Test: blockdev write zeroes read block ...passed 00:19:28.461 Test: blockdev write zeroes read no split ...passed 00:19:28.461 Test: blockdev write zeroes read split ...passed 00:19:28.461 Test: blockdev write zeroes read split partial ...passed 00:19:28.461 Test: blockdev reset ...passed 00:19:28.461 Test: blockdev write read 8 blocks ...passed 00:19:28.461 Test: blockdev write read size > 128k ...passed 00:19:28.461 Test: blockdev write read invalid size ...passed 00:19:28.461 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.461 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.461 Test: blockdev write read max offset ...passed 00:19:28.461 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.461 Test: blockdev writev readv 8 blocks ...passed 00:19:28.461 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.461 Test: blockdev writev readv block ...passed 00:19:28.461 Test: blockdev writev readv size > 128k ...passed 00:19:28.461 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.461 Test: blockdev comparev and writev ...passed 00:19:28.461 Test: blockdev nvme passthru rw ...passed 00:19:28.461 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.461 Test: blockdev nvme admin passthru ...passed 00:19:28.461 Test: blockdev copy ...passed 00:19:28.461 Suite: bdevio tests on: raid1 00:19:28.461 Test: blockdev write read block ...passed 00:19:28.461 Test: blockdev write zeroes read block ...passed 00:19:28.461 Test: blockdev write zeroes read no split ...passed 00:19:28.461 Test: blockdev write zeroes read split ...passed 00:19:28.461 Test: blockdev write zeroes read split partial ...passed 00:19:28.461 Test: blockdev reset ...passed 00:19:28.461 Test: blockdev write read 8 blocks ...passed 00:19:28.461 Test: blockdev write read size > 128k ...passed 00:19:28.461 Test: blockdev write read invalid size ...passed 00:19:28.461 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.461 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.461 Test: blockdev write read max offset ...passed 00:19:28.462 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.462 Test: blockdev writev readv 8 blocks ...passed 00:19:28.462 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.462 Test: blockdev writev readv block ...passed 00:19:28.462 Test: blockdev writev readv size > 128k ...passed 00:19:28.462 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.462 Test: blockdev comparev and writev ...passed 00:19:28.462 Test: blockdev nvme passthru rw ...passed 00:19:28.462 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.462 Test: blockdev nvme admin passthru ...passed 00:19:28.462 Test: blockdev copy ...passed 00:19:28.462 Suite: bdevio tests on: concat0 00:19:28.462 Test: blockdev write read block ...passed 00:19:28.462 Test: blockdev write zeroes read block ...passed 00:19:28.462 Test: blockdev write zeroes read no split ...passed 00:19:28.462 Test: blockdev write zeroes read split ...passed 00:19:28.462 Test: 
blockdev write zeroes read split partial ...passed 00:19:28.462 Test: blockdev reset ...passed 00:19:28.462 Test: blockdev write read 8 blocks ...passed 00:19:28.462 Test: blockdev write read size > 128k ...passed 00:19:28.462 Test: blockdev write read invalid size ...passed 00:19:28.462 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.462 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.462 Test: blockdev write read max offset ...passed 00:19:28.462 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.462 Test: blockdev writev readv 8 blocks ...passed 00:19:28.462 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.462 Test: blockdev writev readv block ...passed 00:19:28.462 Test: blockdev writev readv size > 128k ...passed 00:19:28.462 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.462 Test: blockdev comparev and writev ...passed 00:19:28.462 Test: blockdev nvme passthru rw ...passed 00:19:28.462 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.462 Test: blockdev nvme admin passthru ...passed 00:19:28.462 Test: blockdev copy ...passed 00:19:28.462 Suite: bdevio tests on: raid0 00:19:28.462 Test: blockdev write read block ...passed 00:19:28.462 Test: blockdev write zeroes read block ...passed 00:19:28.462 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.721 Test: blockdev write read 8 blocks ...passed 00:19:28.721 Test: blockdev write read size > 128k ...passed 00:19:28.721 Test: blockdev write read invalid size ...passed 00:19:28.721 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.721 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.721 Test: blockdev write read max offset ...passed 00:19:28.721 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.721 Test: blockdev writev readv 8 blocks ...passed 00:19:28.721 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.721 Test: blockdev writev readv block ...passed 00:19:28.721 Test: blockdev writev readv size > 128k ...passed 00:19:28.721 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.721 Test: blockdev comparev and writev ...passed 00:19:28.721 Test: blockdev nvme passthru rw ...passed 00:19:28.721 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.721 Test: blockdev nvme admin passthru ...passed 00:19:28.721 Test: blockdev copy ...passed 00:19:28.721 Suite: bdevio tests on: TestPT 00:19:28.721 Test: blockdev write read block ...passed 00:19:28.721 Test: blockdev write zeroes read block ...passed 00:19:28.721 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.721 Test: blockdev write read 8 blocks ...passed 00:19:28.721 Test: blockdev write read size > 128k ...passed 00:19:28.721 Test: blockdev write read invalid size ...passed 00:19:28.721 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.721 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.721 Test: blockdev write read max offset ...passed 00:19:28.721 Test: blockdev write read 2 blocks on 
overlapped address offset ...passed 00:19:28.721 Test: blockdev writev readv 8 blocks ...passed 00:19:28.721 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.721 Test: blockdev writev readv block ...passed 00:19:28.721 Test: blockdev writev readv size > 128k ...passed 00:19:28.721 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.721 Test: blockdev comparev and writev ...passed 00:19:28.721 Test: blockdev nvme passthru rw ...passed 00:19:28.721 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.721 Test: blockdev nvme admin passthru ...passed 00:19:28.721 Test: blockdev copy ...passed 00:19:28.721 Suite: bdevio tests on: Malloc2p7 00:19:28.721 Test: blockdev write read block ...passed 00:19:28.721 Test: blockdev write zeroes read block ...passed 00:19:28.721 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.721 Test: blockdev write read 8 blocks ...passed 00:19:28.721 Test: blockdev write read size > 128k ...passed 00:19:28.721 Test: blockdev write read invalid size ...passed 00:19:28.721 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.721 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.721 Test: blockdev write read max offset ...passed 00:19:28.721 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.721 Test: blockdev writev readv 8 blocks ...passed 00:19:28.721 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.721 Test: blockdev writev readv block ...passed 00:19:28.721 Test: blockdev writev readv size > 128k ...passed 00:19:28.721 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.721 Test: blockdev comparev and writev ...passed 00:19:28.721 Test: blockdev nvme passthru rw ...passed 00:19:28.721 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.721 Test: blockdev nvme admin passthru ...passed 00:19:28.721 Test: blockdev copy ...passed 00:19:28.721 Suite: bdevio tests on: Malloc2p6 00:19:28.721 Test: blockdev write read block ...passed 00:19:28.721 Test: blockdev write zeroes read block ...passed 00:19:28.721 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.721 Test: blockdev write read 8 blocks ...passed 00:19:28.721 Test: blockdev write read size > 128k ...passed 00:19:28.721 Test: blockdev write read invalid size ...passed 00:19:28.721 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.721 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.721 Test: blockdev write read max offset ...passed 00:19:28.721 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.721 Test: blockdev writev readv 8 blocks ...passed 00:19:28.721 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.721 Test: blockdev writev readv block ...passed 00:19:28.721 Test: blockdev writev readv size > 128k ...passed 00:19:28.721 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.721 Test: blockdev comparev and writev ...passed 00:19:28.721 Test: blockdev nvme passthru rw ...passed 00:19:28.721 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.721 
Test: blockdev nvme admin passthru ...passed 00:19:28.721 Test: blockdev copy ...passed 00:19:28.721 Suite: bdevio tests on: Malloc2p5 00:19:28.721 Test: blockdev write read block ...passed 00:19:28.721 Test: blockdev write zeroes read block ...passed 00:19:28.721 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.721 Test: blockdev write read 8 blocks ...passed 00:19:28.721 Test: blockdev write read size > 128k ...passed 00:19:28.721 Test: blockdev write read invalid size ...passed 00:19:28.721 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.721 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.721 Test: blockdev write read max offset ...passed 00:19:28.721 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.721 Test: blockdev writev readv 8 blocks ...passed 00:19:28.721 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.721 Test: blockdev writev readv block ...passed 00:19:28.721 Test: blockdev writev readv size > 128k ...passed 00:19:28.721 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.721 Test: blockdev comparev and writev ...passed 00:19:28.721 Test: blockdev nvme passthru rw ...passed 00:19:28.721 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.721 Test: blockdev nvme admin passthru ...passed 00:19:28.721 Test: blockdev copy ...passed 00:19:28.721 Suite: bdevio tests on: Malloc2p4 00:19:28.721 Test: blockdev write read block ...passed 00:19:28.721 Test: blockdev write zeroes read block ...passed 00:19:28.721 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.721 Test: blockdev write read 8 blocks ...passed 00:19:28.721 Test: blockdev write read size > 128k ...passed 00:19:28.721 Test: blockdev write read invalid size ...passed 00:19:28.721 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.721 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.721 Test: blockdev write read max offset ...passed 00:19:28.721 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.721 Test: blockdev writev readv 8 blocks ...passed 00:19:28.721 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.721 Test: blockdev writev readv block ...passed 00:19:28.721 Test: blockdev writev readv size > 128k ...passed 00:19:28.721 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.721 Test: blockdev comparev and writev ...passed 00:19:28.721 Test: blockdev nvme passthru rw ...passed 00:19:28.721 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.721 Test: blockdev nvme admin passthru ...passed 00:19:28.721 Test: blockdev copy ...passed 00:19:28.721 Suite: bdevio tests on: Malloc2p3 00:19:28.721 Test: blockdev write read block ...passed 00:19:28.721 Test: blockdev write zeroes read block ...passed 00:19:28.721 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.721 Test: blockdev write read 8 blocks ...passed 
00:19:28.721 Test: blockdev write read size > 128k ...passed 00:19:28.721 Test: blockdev write read invalid size ...passed 00:19:28.721 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.721 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.721 Test: blockdev write read max offset ...passed 00:19:28.721 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.721 Test: blockdev writev readv 8 blocks ...passed 00:19:28.721 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.721 Test: blockdev writev readv block ...passed 00:19:28.721 Test: blockdev writev readv size > 128k ...passed 00:19:28.721 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.721 Test: blockdev comparev and writev ...passed 00:19:28.721 Test: blockdev nvme passthru rw ...passed 00:19:28.721 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.721 Test: blockdev nvme admin passthru ...passed 00:19:28.721 Test: blockdev copy ...passed 00:19:28.721 Suite: bdevio tests on: Malloc2p2 00:19:28.721 Test: blockdev write read block ...passed 00:19:28.721 Test: blockdev write zeroes read block ...passed 00:19:28.721 Test: blockdev write zeroes read no split ...passed 00:19:28.721 Test: blockdev write zeroes read split ...passed 00:19:28.721 Test: blockdev write zeroes read split partial ...passed 00:19:28.721 Test: blockdev reset ...passed 00:19:28.722 Test: blockdev write read 8 blocks ...passed 00:19:28.722 Test: blockdev write read size > 128k ...passed 00:19:28.722 Test: blockdev write read invalid size ...passed 00:19:28.722 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.722 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.722 Test: blockdev write read max offset ...passed 00:19:28.722 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.722 Test: blockdev writev readv 8 blocks ...passed 00:19:28.722 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.722 Test: blockdev writev readv block ...passed 00:19:28.722 Test: blockdev writev readv size > 128k ...passed 00:19:28.722 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.722 Test: blockdev comparev and writev ...passed 00:19:28.722 Test: blockdev nvme passthru rw ...passed 00:19:28.722 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.722 Test: blockdev nvme admin passthru ...passed 00:19:28.722 Test: blockdev copy ...passed 00:19:28.722 Suite: bdevio tests on: Malloc2p1 00:19:28.722 Test: blockdev write read block ...passed 00:19:28.722 Test: blockdev write zeroes read block ...passed 00:19:28.722 Test: blockdev write zeroes read no split ...passed 00:19:28.722 Test: blockdev write zeroes read split ...passed 00:19:28.722 Test: blockdev write zeroes read split partial ...passed 00:19:28.722 Test: blockdev reset ...passed 00:19:28.722 Test: blockdev write read 8 blocks ...passed 00:19:28.722 Test: blockdev write read size > 128k ...passed 00:19:28.722 Test: blockdev write read invalid size ...passed 00:19:28.722 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.722 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.722 Test: blockdev write read max offset ...passed 00:19:28.722 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.722 Test: blockdev writev readv 8 blocks ...passed 00:19:28.722 Test: blockdev writev readv 30 x 
1block ...passed 00:19:28.722 Test: blockdev writev readv block ...passed 00:19:28.722 Test: blockdev writev readv size > 128k ...passed 00:19:28.722 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.722 Test: blockdev comparev and writev ...passed 00:19:28.722 Test: blockdev nvme passthru rw ...passed 00:19:28.722 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.722 Test: blockdev nvme admin passthru ...passed 00:19:28.722 Test: blockdev copy ...passed 00:19:28.722 Suite: bdevio tests on: Malloc2p0 00:19:28.722 Test: blockdev write read block ...passed 00:19:28.722 Test: blockdev write zeroes read block ...passed 00:19:28.722 Test: blockdev write zeroes read no split ...passed 00:19:28.722 Test: blockdev write zeroes read split ...passed 00:19:28.722 Test: blockdev write zeroes read split partial ...passed 00:19:28.722 Test: blockdev reset ...passed 00:19:28.722 Test: blockdev write read 8 blocks ...passed 00:19:28.722 Test: blockdev write read size > 128k ...passed 00:19:28.722 Test: blockdev write read invalid size ...passed 00:19:28.722 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.722 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.722 Test: blockdev write read max offset ...passed 00:19:28.722 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.722 Test: blockdev writev readv 8 blocks ...passed 00:19:28.722 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.722 Test: blockdev writev readv block ...passed 00:19:28.722 Test: blockdev writev readv size > 128k ...passed 00:19:28.722 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.722 Test: blockdev comparev and writev ...passed 00:19:28.722 Test: blockdev nvme passthru rw ...passed 00:19:28.722 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.722 Test: blockdev nvme admin passthru ...passed 00:19:28.722 Test: blockdev copy ...passed 00:19:28.722 Suite: bdevio tests on: Malloc1p1 00:19:28.722 Test: blockdev write read block ...passed 00:19:28.722 Test: blockdev write zeroes read block ...passed 00:19:28.722 Test: blockdev write zeroes read no split ...passed 00:19:28.722 Test: blockdev write zeroes read split ...passed 00:19:28.722 Test: blockdev write zeroes read split partial ...passed 00:19:28.722 Test: blockdev reset ...passed 00:19:28.722 Test: blockdev write read 8 blocks ...passed 00:19:28.722 Test: blockdev write read size > 128k ...passed 00:19:28.722 Test: blockdev write read invalid size ...passed 00:19:28.722 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.722 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.722 Test: blockdev write read max offset ...passed 00:19:28.722 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.722 Test: blockdev writev readv 8 blocks ...passed 00:19:28.722 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.722 Test: blockdev writev readv block ...passed 00:19:28.722 Test: blockdev writev readv size > 128k ...passed 00:19:28.722 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.722 Test: blockdev comparev and writev ...passed 00:19:28.722 Test: blockdev nvme passthru rw ...passed 00:19:28.722 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.722 Test: blockdev nvme admin passthru ...passed 00:19:28.722 Test: blockdev copy ...passed 00:19:28.722 Suite: bdevio tests on: Malloc1p0 
00:19:28.722 Test: blockdev write read block ...passed 00:19:28.722 Test: blockdev write zeroes read block ...passed 00:19:28.722 Test: blockdev write zeroes read no split ...passed 00:19:28.722 Test: blockdev write zeroes read split ...passed 00:19:28.722 Test: blockdev write zeroes read split partial ...passed 00:19:28.722 Test: blockdev reset ...passed 00:19:28.722 Test: blockdev write read 8 blocks ...passed 00:19:28.722 Test: blockdev write read size > 128k ...passed 00:19:28.722 Test: blockdev write read invalid size ...passed 00:19:28.722 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.722 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.722 Test: blockdev write read max offset ...passed 00:19:28.722 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.722 Test: blockdev writev readv 8 blocks ...passed 00:19:28.722 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.722 Test: blockdev writev readv block ...passed 00:19:28.722 Test: blockdev writev readv size > 128k ...passed 00:19:28.722 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.981 Test: blockdev comparev and writev ...passed 00:19:28.981 Test: blockdev nvme passthru rw ...passed 00:19:28.981 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.981 Test: blockdev nvme admin passthru ...passed 00:19:28.981 Test: blockdev copy ...passed 00:19:28.981 Suite: bdevio tests on: Malloc0 00:19:28.981 Test: blockdev write read block ...passed 00:19:28.981 Test: blockdev write zeroes read block ...passed 00:19:28.981 Test: blockdev write zeroes read no split ...passed 00:19:28.981 Test: blockdev write zeroes read split ...passed 00:19:28.981 Test: blockdev write zeroes read split partial ...passed 00:19:28.981 Test: blockdev reset ...passed 00:19:28.981 Test: blockdev write read 8 blocks ...passed 00:19:28.981 Test: blockdev write read size > 128k ...passed 00:19:28.981 Test: blockdev write read invalid size ...passed 00:19:28.981 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:19:28.981 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:19:28.981 Test: blockdev write read max offset ...passed 00:19:28.981 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:19:28.981 Test: blockdev writev readv 8 blocks ...passed 00:19:28.981 Test: blockdev writev readv 30 x 1block ...passed 00:19:28.981 Test: blockdev writev readv block ...passed 00:19:28.981 Test: blockdev writev readv size > 128k ...passed 00:19:28.981 Test: blockdev writev readv size > 128k in two iovs ...passed 00:19:28.981 Test: blockdev comparev and writev ...passed 00:19:28.981 Test: blockdev nvme passthru rw ...passed 00:19:28.981 Test: blockdev nvme passthru vendor specific ...passed 00:19:28.981 Test: blockdev nvme admin passthru ...passed 00:19:28.981 Test: blockdev copy ...passed 00:19:28.981 00:19:28.981 Run Summary: Type Total Ran Passed Failed Inactive 00:19:28.981 suites 16 16 n/a 0 0 00:19:28.981 tests 368 368 368 0 0 00:19:28.981 asserts 2224 2224 2224 0 n/a 00:19:28.981 00:19:28.981 Elapsed time = 0.816 seconds 00:19:28.981 0 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 195191 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 195191 ']' 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 195191 00:19:28.981 12:21:52 
blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 195191 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 195191' 00:19:28.981 killing process with pid 195191 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # kill 195191 00:19:28.981 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@973 -- # wait 195191 00:19:29.240 12:21:52 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:19:29.240 00:19:29.240 real 0m1.934s 00:19:29.240 user 0m4.921s 00:19:29.240 sys 0m0.451s 00:19:29.240 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:29.240 12:21:52 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:19:29.240 ************************************ 00:19:29.240 END TEST bdev_bounds 00:19:29.240 ************************************ 00:19:29.240 12:21:52 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:19:29.240 12:21:52 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:29.240 12:21:52 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:29.240 12:21:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:29.240 ************************************ 00:19:29.240 START TEST bdev_nbd 00:19:29.240 ************************************ 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' 
'/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=195253 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 195253 /var/tmp/spdk-nbd.sock 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 195253 ']' 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:19:29.240 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:29.240 12:21:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:19:29.240 [2024-06-07 12:21:52.869965] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
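From here the nbd test exports each bdev through the kernel's NBD driver: bdev_svc has just been started on its own RPC socket, /var/tmp/spdk-nbd.sock, and each nbd_start_disk call traced below binds one bdev to a /dev/nbdX node. The core of that loop, reduced to a single device (commands as traced; the scratch-file path is shortened here):

    # Export Malloc0 over NBD, wait for the kernel to publish the node,
    # then read one block back the way waitfornbd and dd do below.
    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    dev=$("$rpc" -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0)  # e.g. /dev/nbd0
    grep -q -w "${dev#/dev/}" /proc/partitions   # is the node visible yet?
    dd if="$dev" of=/tmp/nbdtest bs=4096 count=1 iflag=direct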
00:19:29.240 [2024-06-07 12:21:52.870497] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:29.498 [2024-06-07 12:21:53.016892] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:29.498 [2024-06-07 12:21:53.064588] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:29.755 [2024-06-07 12:21:53.184077] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:29.755 [2024-06-07 12:21:53.184535] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:19:29.755 [2024-06-07 12:21:53.192028] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:29.755 [2024-06-07 12:21:53.192257] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:19:29.755 [2024-06-07 12:21:53.200061] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:19:29.755 [2024-06-07 12:21:53.200300] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:19:29.755 [2024-06-07 12:21:53.200509] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:19:29.755 [2024-06-07 12:21:53.277044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:19:29.755 [2024-06-07 12:21:53.277383] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:29.755 [2024-06-07 12:21:53.277452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:19:29.755 [2024-06-07 12:21:53.277551] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:29.755 [2024-06-07 12:21:53.279426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:29.755 [2024-06-07 12:21:53.279592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:30.320 12:21:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:30.578 1+0 records in 00:19:30.578 1+0 records out 00:19:30.578 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00068974 s, 5.9 MB/s 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:30.578 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:19:30.836 12:21:54 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:30.836 1+0 records in 00:19:30.836 1+0 records out 00:19:30.836 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000445412 s, 9.2 MB/s 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:30.836 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:31.095 1+0 records in 00:19:31.095 1+0 records out 00:19:31.095 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000435279 s, 9.4 MB/s 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # size=4096 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:31.095 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:31.353 1+0 records in 00:19:31.353 1+0 records out 00:19:31.353 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000351388 s, 11.7 MB/s 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:31.353 12:21:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@868 -- # local i 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:31.610 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:31.611 1+0 records in 00:19:31.611 1+0 records out 00:19:31.611 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000588585 s, 7.0 MB/s 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:31.611 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd5 /proc/partitions 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:31.868 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:31.869 1+0 records in 00:19:31.869 1+0 records out 00:19:31.869 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000647859 s, 6.3 MB/s 00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 
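Every export in this stretch is gated by the same waitfornbd helper being traced here: poll /proc/partitions up to 20 times for the new node, then prove the device serves reads with a single direct-I/O dd, which is where the recurring "1+0 records in/out" lines come from. A reconstruction of that helper; the back-off between retries is an assumption, since xtrace shows only the counter and the grep:

    # Poll for an nbd node, as the (( i <= 20 )) / grep -q -w loops above do.
    waitfornbd() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1   # assumed delay; the trace does not show one
        done
        # Verify the node actually serves reads before declaring success.
        dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    }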
00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:31.869 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:32.126 1+0 records in 00:19:32.126 1+0 records out 00:19:32.126 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00053477 s, 7.7 MB/s 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:32.126 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:32.127 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:32.127 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:32.127 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:32.384 12:21:55 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:32.384 1+0 records in 00:19:32.384 1+0 records out 00:19:32.384 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000573223 s, 7.1 MB/s 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:32.384 12:21:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:19:32.641 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:19:32.641 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:19:32.641 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:32.642 1+0 records in 00:19:32.642 1+0 records out 00:19:32.642 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000708983 s, 5.8 MB/s 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:32.642 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:32.899 1+0 records in 00:19:32.899 1+0 records out 00:19:32.899 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000609043 s, 6.7 MB/s 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:32.899 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:33.157 1+0 records in 00:19:33.157 1+0 records out 00:19:33.157 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00048345 s, 8.5 MB/s 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:33.157 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:33.416 12:21:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:33.416 1+0 records in 00:19:33.416 1+0 records out 00:19:33.416 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000863364 s, 4.7 MB/s 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f 
/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:33.416 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:33.982 1+0 records in 00:19:33.982 1+0 records out 00:19:33.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000908344 s, 4.5 MB/s 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:33.982 12:21:57 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:33.982 1+0 records in 00:19:33.982 1+0 records out 00:19:33.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000768729 s, 5.3 MB/s 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:33.982 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:34.240 1+0 records in 00:19:34.240 1+0 records out 00:19:34.240 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000711522 s, 5.8 MB/s 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:34.240 12:21:57 
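Each attach in this first pass lets the target choose the node: nbd_start_disk is invoked with only the bdev name, and whatever path the RPC prints (/dev/nbd14, /dev/nbd15 above) is captured and handed to the readiness probe. A minimal sketch of that dispatch loop, reusing the rpc.py path and socket from the trace and the waitfornbd sketch above (the trimmed bdev list is illustrative; the test drives all 16):

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    bdevs=(Malloc0 Malloc1p0 raid0 concat0 raid1 AIO0)   # trimmed from the 16 traced

    for bdev in "${bdevs[@]}"; do
        # No explicit /dev/nbdN argument: the target picks one and prints it.
        nbd_device=$("$rpc" -s "$sock" nbd_start_disk "$bdev")
        waitfornbd "$(basename "$nbd_device")"
    done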
blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:34.240 12:21:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:34.806 1+0 records in 00:19:34.806 1+0 records out 00:19:34.806 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00132146 s, 3.1 MB/s 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:19:34.806 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:19:35.064 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:19:35.064 { 00:19:35.064 "nbd_device": "/dev/nbd0", 00:19:35.064 "bdev_name": "Malloc0" 00:19:35.064 }, 00:19:35.064 { 00:19:35.064 "nbd_device": "/dev/nbd1", 00:19:35.064 "bdev_name": "Malloc1p0" 00:19:35.064 }, 00:19:35.064 { 00:19:35.064 "nbd_device": "/dev/nbd2", 00:19:35.064 "bdev_name": "Malloc1p1" 00:19:35.064 }, 00:19:35.064 { 00:19:35.064 "nbd_device": "/dev/nbd3", 00:19:35.064 "bdev_name": "Malloc2p0" 00:19:35.064 }, 00:19:35.064 { 00:19:35.064 "nbd_device": "/dev/nbd4", 00:19:35.064 "bdev_name": "Malloc2p1" 00:19:35.064 }, 00:19:35.064 { 00:19:35.064 "nbd_device": 
"/dev/nbd5", 00:19:35.064 "bdev_name": "Malloc2p2" 00:19:35.064 }, 00:19:35.064 { 00:19:35.064 "nbd_device": "/dev/nbd6", 00:19:35.064 "bdev_name": "Malloc2p3" 00:19:35.064 }, 00:19:35.064 { 00:19:35.064 "nbd_device": "/dev/nbd7", 00:19:35.065 "bdev_name": "Malloc2p4" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd8", 00:19:35.065 "bdev_name": "Malloc2p5" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd9", 00:19:35.065 "bdev_name": "Malloc2p6" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd10", 00:19:35.065 "bdev_name": "Malloc2p7" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd11", 00:19:35.065 "bdev_name": "TestPT" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd12", 00:19:35.065 "bdev_name": "raid0" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd13", 00:19:35.065 "bdev_name": "concat0" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd14", 00:19:35.065 "bdev_name": "raid1" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd15", 00:19:35.065 "bdev_name": "AIO0" 00:19:35.065 } 00:19:35.065 ]' 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd0", 00:19:35.065 "bdev_name": "Malloc0" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd1", 00:19:35.065 "bdev_name": "Malloc1p0" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd2", 00:19:35.065 "bdev_name": "Malloc1p1" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd3", 00:19:35.065 "bdev_name": "Malloc2p0" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd4", 00:19:35.065 "bdev_name": "Malloc2p1" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd5", 00:19:35.065 "bdev_name": "Malloc2p2" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd6", 00:19:35.065 "bdev_name": "Malloc2p3" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd7", 00:19:35.065 "bdev_name": "Malloc2p4" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd8", 00:19:35.065 "bdev_name": "Malloc2p5" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd9", 00:19:35.065 "bdev_name": "Malloc2p6" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd10", 00:19:35.065 "bdev_name": "Malloc2p7" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd11", 00:19:35.065 "bdev_name": "TestPT" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd12", 00:19:35.065 "bdev_name": "raid0" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd13", 00:19:35.065 "bdev_name": "concat0" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd14", 00:19:35.065 "bdev_name": "raid1" 00:19:35.065 }, 00:19:35.065 { 00:19:35.065 "nbd_device": "/dev/nbd15", 00:19:35.065 "bdev_name": "AIO0" 00:19:35.065 } 00:19:35.065 ]' 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:19:35.065 
12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:35.065 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:35.323 12:21:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:19:35.581 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:35.581 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:35.581 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:35.581 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:35.581 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:35.581 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:35.581 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:35.582 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:35.582 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:35.582 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@41 -- # break 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:35.839 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:36.096 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:36.354 12:21:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:36.922 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:37.193 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:19:37.453 12:22:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:37.453 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:37.711 12:22:01 
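Teardown is verified with the mirror image of the attach probe: after each nbd_stop_disk, the test polls /proc/partitions until the name disappears. A condensed sketch of the traced waitfornbd_exit loop (success is the grep failing; the sleep is again an assumption):

    waitfornbd_exit() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            # The device is gone once its name leaves the partition table.
            grep -q -w "$nbd_name" /proc/partitions || return 0
            sleep 0.1   # assumed pacing between polls
        done
        return 1
    }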
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:37.711 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:19:37.969 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:38.232 12:22:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 
-- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:19:38.489 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:19:38.489 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:19:38.489 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:19:38.489 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:38.490 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:38.490 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:19:38.490 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:38.490 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:38.490 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:38.490 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:38.746 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:39.003 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:19:39.260 12:22:02 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:39.260 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:19:39.520 12:22:02 
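Once every device is detached, nbd_get_disks comes back as an empty array and the count check above reduces it to zero. Note the bare true in the trace: grep -c exits non-zero when it matches nothing, so the pipeline has to tolerate that. A sketch of the assertion (same rpc/sock shorthand):

    count=$("$rpc" -s "$sock" nbd_get_disks | jq -r '.[] | .nbd_device' \
            | grep -c /dev/nbd || true)   # grep -c exits 1 on zero matches
    if [ "$count" -ne 0 ]; then
        echo "expected no nbd devices left, found $count" >&2
        exit 1
    fi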
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:39.520 12:22:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:19:39.778 /dev/nbd0 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:39.778 1+0 records in 00:19:39.778 1+0 records out 00:19:39.778 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000441112 s, 9.3 MB/s 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:39.778 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:19:40.036 /dev/nbd1 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:40.036 
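The second data-verify pass above differs from the first in two visible ways: the caller now pins each bdev to an explicit node (nbd_start_disk takes both arguments, e.g. Malloc0 /dev/nbd0), and the nbd list is ordered lexicographically, so Malloc1p1 lands on /dev/nbd10 rather than /dev/nbd2. A sketch of that paired dispatch (lists trimmed; the trace pairs all 16):

    bdev_list=(Malloc0 Malloc1p0 Malloc1p1 Malloc2p0)
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)   # lexicographic order

    for ((i = 0; i < ${#bdev_list[@]}; i++)); do
        # Explicit pairing: the i-th bdev is bound to the i-th node.
        "$rpc" -s "$sock" nbd_start_disk "${bdev_list[i]}" "${nbd_list[i]}"
        waitfornbd "$(basename "${nbd_list[i]}")"
    done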
12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:40.036 1+0 records in 00:19:40.036 1+0 records out 00:19:40.036 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000352684 s, 11.6 MB/s 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:40.036 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:19:40.036 /dev/nbd10 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:40.294 1+0 records in 00:19:40.294 1+0 records out 00:19:40.294 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000418563 s, 9.8 MB/s 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:19:40.294 /dev/nbd11 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:40.294 1+0 records in 00:19:40.294 1+0 records out 00:19:40.294 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00075402 s, 5.4 MB/s 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:40.294 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.552 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:40.552 12:22:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:40.552 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:40.552 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:40.552 12:22:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:19:40.552 /dev/nbd12 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 
of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:40.552 1+0 records in 00:19:40.552 1+0 records out 00:19:40.552 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000729308 s, 5.6 MB/s 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:40.552 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:19:40.809 /dev/nbd13 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:40.809 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:41.068 1+0 records in 00:19:41.068 1+0 records out 00:19:41.068 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000506529 s, 8.1 MB/s 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:19:41.068 /dev/nbd14 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename 
/dev/nbd14 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:41.068 1+0 records in 00:19:41.068 1+0 records out 00:19:41.068 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000861895 s, 4.8 MB/s 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:41.068 12:22:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:19:41.634 /dev/nbd15 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:41.634 1+0 records in 00:19:41.634 1+0 records out 00:19:41.634 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000568534 s, 7.2 MB/s 00:19:41.634 12:22:05 
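
The same readiness check repeats in the trace for every device that gets attached. Below is a minimal standalone sketch of what the traced `waitfornbd` helper (autotest_common.sh@867-@888 in the log) appears to do, reconstructed purely from the commands logged above — the poll delay and the temp-file path are assumptions, the trace retries the dd in a second loop of its own which the sketch collapses for brevity, and the real upstream source may differ:

#!/usr/bin/env bash
# Reconstructed sketch of waitfornbd: poll /proc/partitions until the
# named nbd device appears, then prove it is readable by copying one
# 4 KiB block with O_DIRECT and checking the copy is 4096 bytes long.
waitfornbd() {
    local nbd_name=$1
    local i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1   # assumption: the trace does not show the poll delay
    done
    local tmp=/tmp/nbdtest   # assumption: the run above uses a repo-local path
    dd "if=/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct || return 1
    local size
    size=$(stat -c %s "$tmp")
    rm -f "$tmp"
    [ "$size" != 0 ]
}

Called as e.g. `waitfornbd nbd10` right after the export is started, mirroring the `@17 waitfornbd nbd10` call in the trace.
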
blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:41.634 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:19:41.892 /dev/nbd2 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:41.892 1+0 records in 00:19:41.892 1+0 records out 00:19:41.892 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000520463 s, 7.9 MB/s 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:41.892 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:19:42.151 /dev/nbd3 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:19:42.151 12:22:05 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:42.151 1+0 records in 00:19:42.151 1+0 records out 00:19:42.151 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000678507 s, 6.0 MB/s 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:42.151 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:19:42.410 /dev/nbd4 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:42.410 1+0 records in 00:19:42.410 1+0 records out 00:19:42.410 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00081837 s, 5.0 MB/s 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:42.410 12:22:05 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:42.410 12:22:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:19:42.699 /dev/nbd5 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd5 /proc/partitions 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:42.699 1+0 records in 00:19:42.699 1+0 records out 00:19:42.699 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000801844 s, 5.1 MB/s 00:19:42.699 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:19:42.967 /dev/nbd6 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:42.967 
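
Each attachment above is one `nbd_start_disk` RPC against the dedicated `/var/tmp/spdk-nbd.sock` socket, followed by the readiness check. A hedged sketch of the surrounding loop implied by the `nbd_common.sh@14/@15` counters, using the exact bdev-to-device pairing of this run (confirmed later by the `nbd_get_disks` listing); `waitfornbd` is the helper sketched above:

#!/usr/bin/env bash
# Sketch of the attach loop: walk two parallel lists and export each
# bdev as a kernel NBD block device over the SPDK app's RPC socket.
rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
sock=/var/tmp/spdk-nbd.sock
bdev_list=(Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3
           Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0)
nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13
          /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5
          /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9)
for ((i = 0; i < 16; i++)); do
    "$rpc_py" -s "$sock" nbd_start_disk "${bdev_list[$i]}" "${nbd_list[$i]}"
    waitfornbd "$(basename "${nbd_list[$i]}")"
done
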
12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:42.967 1+0 records in 00:19:42.967 1+0 records out 00:19:42.967 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000786856 s, 5.2 MB/s 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:42.967 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:19:43.225 /dev/nbd7 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:43.225 1+0 records in 00:19:43.225 1+0 records out 00:19:43.225 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000592892 s, 6.9 MB/s 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # 
return 0 00:19:43.225 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:43.226 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:43.226 12:22:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:19:43.484 /dev/nbd8 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:43.484 1+0 records in 00:19:43.484 1+0 records out 00:19:43.484 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000930815 s, 4.4 MB/s 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:43.484 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:19:43.742 /dev/nbd9 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 
)) 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:43.743 1+0 records in 00:19:43.743 1+0 records out 00:19:43.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00108639 s, 3.8 MB/s 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:43.743 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:19:44.002 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd0", 00:19:44.002 "bdev_name": "Malloc0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd1", 00:19:44.002 "bdev_name": "Malloc1p0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd10", 00:19:44.002 "bdev_name": "Malloc1p1" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd11", 00:19:44.002 "bdev_name": "Malloc2p0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd12", 00:19:44.002 "bdev_name": "Malloc2p1" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd13", 00:19:44.002 "bdev_name": "Malloc2p2" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd14", 00:19:44.002 "bdev_name": "Malloc2p3" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd15", 00:19:44.002 "bdev_name": "Malloc2p4" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd2", 00:19:44.002 "bdev_name": "Malloc2p5" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd3", 00:19:44.002 "bdev_name": "Malloc2p6" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd4", 00:19:44.002 "bdev_name": "Malloc2p7" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd5", 00:19:44.002 "bdev_name": "TestPT" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd6", 00:19:44.002 "bdev_name": "raid0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd7", 00:19:44.002 "bdev_name": "concat0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd8", 00:19:44.002 "bdev_name": "raid1" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd9", 00:19:44.002 "bdev_name": "AIO0" 00:19:44.002 } 00:19:44.002 ]' 00:19:44.002 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd0", 
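
Once all sixteen exports are up, the harness asks the target what it is exporting and counts the result; the JSON echoed around this point in the trace is that listing. A condensed sketch of the count check from `nbd_common.sh@61-@66` and `@95-@96`, self-contained with the socket and script paths from this run:

rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
sock=/var/tmp/spdk-nbd.sock
# List the exported devices as JSON, pull out the device nodes with jq,
# and fail unless exactly the 16 devices we started are present.
nbd_disks_json=$("$rpc_py" -s "$sock" nbd_get_disks)
nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
count=$(echo "$nbd_disks_name" | grep -c /dev/nbd)
[ "$count" -eq 16 ] || exit 1
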
00:19:44.002 "bdev_name": "Malloc0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd1", 00:19:44.002 "bdev_name": "Malloc1p0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd10", 00:19:44.002 "bdev_name": "Malloc1p1" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd11", 00:19:44.002 "bdev_name": "Malloc2p0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd12", 00:19:44.002 "bdev_name": "Malloc2p1" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd13", 00:19:44.002 "bdev_name": "Malloc2p2" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd14", 00:19:44.002 "bdev_name": "Malloc2p3" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd15", 00:19:44.002 "bdev_name": "Malloc2p4" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd2", 00:19:44.002 "bdev_name": "Malloc2p5" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd3", 00:19:44.002 "bdev_name": "Malloc2p6" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd4", 00:19:44.002 "bdev_name": "Malloc2p7" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd5", 00:19:44.002 "bdev_name": "TestPT" 00:19:44.002 }, 00:19:44.002 { 00:19:44.002 "nbd_device": "/dev/nbd6", 00:19:44.002 "bdev_name": "raid0" 00:19:44.002 }, 00:19:44.002 { 00:19:44.003 "nbd_device": "/dev/nbd7", 00:19:44.003 "bdev_name": "concat0" 00:19:44.003 }, 00:19:44.003 { 00:19:44.003 "nbd_device": "/dev/nbd8", 00:19:44.003 "bdev_name": "raid1" 00:19:44.003 }, 00:19:44.003 { 00:19:44.003 "nbd_device": "/dev/nbd9", 00:19:44.003 "bdev_name": "AIO0" 00:19:44.003 } 00:19:44.003 ]' 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:19:44.003 /dev/nbd1 00:19:44.003 /dev/nbd10 00:19:44.003 /dev/nbd11 00:19:44.003 /dev/nbd12 00:19:44.003 /dev/nbd13 00:19:44.003 /dev/nbd14 00:19:44.003 /dev/nbd15 00:19:44.003 /dev/nbd2 00:19:44.003 /dev/nbd3 00:19:44.003 /dev/nbd4 00:19:44.003 /dev/nbd5 00:19:44.003 /dev/nbd6 00:19:44.003 /dev/nbd7 00:19:44.003 /dev/nbd8 00:19:44.003 /dev/nbd9' 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:19:44.003 /dev/nbd1 00:19:44.003 /dev/nbd10 00:19:44.003 /dev/nbd11 00:19:44.003 /dev/nbd12 00:19:44.003 /dev/nbd13 00:19:44.003 /dev/nbd14 00:19:44.003 /dev/nbd15 00:19:44.003 /dev/nbd2 00:19:44.003 /dev/nbd3 00:19:44.003 /dev/nbd4 00:19:44.003 /dev/nbd5 00:19:44.003 /dev/nbd6 00:19:44.003 /dev/nbd7 00:19:44.003 /dev/nbd8 00:19:44.003 /dev/nbd9' 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' 
'/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:19:44.003 256+0 records in 00:19:44.003 256+0 records out 00:19:44.003 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116288 s, 90.2 MB/s 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:44.003 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:19:44.262 256+0 records in 00:19:44.262 256+0 records out 00:19:44.262 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.160712 s, 6.5 MB/s 00:19:44.262 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:44.262 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:19:44.520 256+0 records in 00:19:44.520 256+0 records out 00:19:44.520 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.138784 s, 7.6 MB/s 00:19:44.520 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:44.520 12:22:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:19:44.520 256+0 records in 00:19:44.520 256+0 records out 00:19:44.520 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.142199 s, 7.4 MB/s 00:19:44.520 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:44.520 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:19:44.778 256+0 records in 00:19:44.778 256+0 records out 00:19:44.778 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.145624 s, 7.2 MB/s 00:19:44.778 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:44.778 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:19:44.778 256+0 records in 00:19:44.778 256+0 records out 00:19:44.778 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.150831 s, 7.0 MB/s 00:19:44.778 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:44.778 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:19:45.037 256+0 records in 00:19:45.037 256+0 records out 00:19:45.037 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.151699 s, 6.9 MB/s 00:19:45.037 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:45.037 12:22:08 
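
The dd traffic around this point is the write half of the data-verify pass: one shared 1 MiB random pattern, streamed to every device. A sketch of the `nbd_common.sh@70-@78` write branch — the brace expansion reproduces this run's device order, and the pattern file path is the one in the trace:

tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest
nbd_list=(/dev/nbd{0,1,10,11,12,13,14,15,2,3,4,5,6,7,8,9})
# One reference pattern for all devices: 256 x 4 KiB = 1 MiB of random data.
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
for dev in "${nbd_list[@]}"; do
    # oflag=direct bypasses the page cache, so the pattern really crosses
    # the NBD connection into the SPDK bdev before dd returns.
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
done
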
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:19:45.296 256+0 records in 00:19:45.296 256+0 records out 00:19:45.296 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.151255 s, 6.9 MB/s 00:19:45.296 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:45.296 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:19:45.296 256+0 records in 00:19:45.296 256+0 records out 00:19:45.296 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.144529 s, 7.3 MB/s 00:19:45.296 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:45.296 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:19:45.554 256+0 records in 00:19:45.554 256+0 records out 00:19:45.554 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.136136 s, 7.7 MB/s 00:19:45.554 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:45.554 12:22:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:19:45.554 256+0 records in 00:19:45.555 256+0 records out 00:19:45.555 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.137358 s, 7.6 MB/s 00:19:45.555 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:45.555 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:19:45.814 256+0 records in 00:19:45.814 256+0 records out 00:19:45.814 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.145665 s, 7.2 MB/s 00:19:45.814 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:45.814 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:19:45.814 256+0 records in 00:19:45.814 256+0 records out 00:19:45.814 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.148434 s, 7.1 MB/s 00:19:45.814 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:45.814 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:19:46.072 256+0 records in 00:19:46.072 256+0 records out 00:19:46.072 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.147512 s, 7.1 MB/s 00:19:46.072 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:46.072 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:19:46.331 256+0 records in 00:19:46.331 256+0 records out 00:19:46.331 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.150857 s, 7.0 MB/s 00:19:46.331 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:46.331 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:19:46.331 256+0 records in 00:19:46.331 256+0 
records out 00:19:46.332 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.144507 s, 7.3 MB/s 00:19:46.332 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:19:46.332 12:22:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:19:46.591 256+0 records in 00:19:46.591 256+0 records out 00:19:46.591 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.222776 s, 4.7 MB/s 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd0 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd1 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd10 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd11 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd12 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd13 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd14 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd 
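
The cmp calls on either side of this point are the matching verify pass: every device is compared byte-for-byte against the same reference file it was written from. Sketched from `nbd_common.sh@80-@85`, reusing `$tmp_file` and `nbd_list` from the write sketch above:

# Verify branch: cmp exits non-zero on the first mismatch, which aborts
# the test; afterwards the reference file is removed.
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"
done
rm "$tmp_file"
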
-- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd15 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd2 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd3 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd4 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd5 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.591 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd6 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd7 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd8 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd9 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:46.850 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:47.167 12:22:10 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:47.167 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:47.426 12:22:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:47.685 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q 
-w nbd11 /proc/partitions 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:47.944 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:48.203 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:48.462 12:22:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:48.462 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:19:48.720 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:48.721 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:49.034 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:49.294 12:22:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:49.553 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:49.811 
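
Teardown is the mirror image of attach: one `nbd_stop_disk` RPC per device, then a poll until the kernel drops the node from /proc/partitions. A sketch of `waitfornbd_exit` and the stop loop (`nbd_common.sh@35-@45` and `@49-@55` in the trace), reusing `$rpc_py`, `$sock`, and `nbd_list` from the sketches above; as before, the poll delay is an assumption:

# Wait until the device is gone from the partition table; unlike the
# attach-side helper, absence of the name is the success condition.
waitfornbd_exit() {
    local nbd_name=$1
    local i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions || break
        sleep 0.1   # assumption: not visible in the trace
    done
    return 0
}
for dev in "${nbd_list[@]}"; do
    "$rpc_py" -s "$sock" nbd_stop_disk "$dev"
    waitfornbd_exit "$(basename "$dev")"
done
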
12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:49.811 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:49.812 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:50.070 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:50.329 12:22:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in 
"${nbd_list[@]}" 00:19:50.588 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:50.846 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:51.104 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # 
nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:19:51.363 12:22:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:19:51.621 malloc_lvol_verify 00:19:51.621 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:19:51.880 28a79137-ffa6-4108-bb2e-d421c9d960b4 00:19:51.880 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:19:52.140 a8ceadc8-a2d8-44ce-88ff-5d41bb546614 00:19:52.140 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:19:52.444 /dev/nbd0 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:19:52.444 mke2fs 1.46.5 (30-Dec-2021) 00:19:52.444 Discarding device blocks: 0/4096 done 00:19:52.444 Creating filesystem with 4096 1k blocks and 1024 inodes 00:19:52.444 00:19:52.444 Allocating group tables: 0/1 done 00:19:52.444 Writing inode tables: 0/1 done 00:19:52.444 Creating journal (1024 blocks): done 00:19:52.444 Writing superblocks and filesystem accounting information: 0/1 done 00:19:52.444 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:52.444 12:22:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 195253 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 195253 ']' 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 195253 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 195253 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 195253' 00:19:52.703 killing process with pid 195253 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # kill 195253 00:19:52.703 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@973 -- # wait 195253 00:19:53.270 12:22:16 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:19:53.270 00:19:53.270 real 0m24.022s 00:19:53.270 user 0m29.713s 00:19:53.270 sys 0m13.234s 00:19:53.270 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:53.270 12:22:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:19:53.270 ************************************ 00:19:53.270 END TEST bdev_nbd 00:19:53.270 ************************************ 00:19:53.270 12:22:16 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:19:53.270 12:22:16 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:19:53.270 12:22:16 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:19:53.270 12:22:16 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:19:53.270 12:22:16 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:19:53.270 12:22:16 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:53.270 12:22:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:19:53.529 ************************************ 00:19:53.529 START TEST bdev_fio 00:19:53.529 ************************************ 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /home/vagrant/spdk_repo/spdk/test/bdev 00:19:53.529 /home/vagrant/spdk_repo/spdk/test/bdev /home/vagrant/spdk_repo/spdk 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:19:53.529 
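[Editor's note] The nbd teardown above stops each device over RPC and then polls /proc/partitions until the kernel node disappears. A minimal sketch of that polling helper, reconstructed from the xtrace lines (the name waitfornbd_exit, the grep -q -w check, and the 20-iteration bound are visible in the trace; the sleep interval between polls is an assumption, since the trace never shows it):

    waitfornbd_exit() {
        local nbd_name=$1
        # Poll up to 20 times until the device vanishes from /proc/partitions
        for ((i = 1; i <= 20; i++)); do
            if grep -q -w "$nbd_name" /proc/partitions; then
                sleep 0.1   # assumed back-off; not visible in the trace
            else
                break       # device gone, as in the "@41 break" trace lines
            fi
        done
        return 0
    }

In the run above the break fires on the first iteration for every device, which is why each stop shows only a single grep before "return 0".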
12:22:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio verify AIO '' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio ']' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:19:53.529 12:22:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in 
"${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:19:53.529 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:19:53.530 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:19:53.530 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:19:53.530 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 
00:19:53.530 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json' 00:19:53.530 12:22:17 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:19:53.530 12:22:17 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:19:53.530 12:22:17 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:53.530 12:22:17 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:19:53.530 ************************************ 00:19:53.530 START TEST bdev_fio_rw_verify 00:19:53.530 ************************************ 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib=/usr/lib64/libasan.so.6 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n /usr/lib64/libasan.so.6 ]] 00:19:53.530 12:22:17 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # break 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD='/usr/lib64/libasan.so.6 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:19:53.530 12:22:17 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:19:53.789 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:19:53.789 fio-3.35 00:19:53.789 Starting 16 threads 00:20:05.987 00:20:05.987 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=196162: Fri Jun 7 12:22:28 2024 00:20:05.987 read: IOPS=117k, BW=458MiB/s (480MB/s)(4576MiB/10001msec) 00:20:05.987 slat (nsec): min=702, max=44027k, avg=16324.41, stdev=258220.40 00:20:05.987 clat (usec): min=2, max=44104, avg=164.59, stdev=886.00 00:20:05.987 lat (usec): min=5, max=44113, avg=180.92, stdev=923.13 00:20:05.987 clat percentiles (usec): 00:20:05.987 | 50.000th=[ 87], 99.000th=[ 725], 99.900th=[13698], 99.990th=[26084], 00:20:05.987 | 99.999th=[41157] 00:20:05.987 write: IOPS=190k, BW=741MiB/s (777MB/s)(7331MiB/9899msec); 0 zone resets 00:20:05.987 slat (usec): min=2, max=56805, avg=45.67, stdev=509.24 00:20:05.987 clat (usec): min=3, 
max=58728, avg=242.33, stdev=1119.83 00:20:05.987 lat (usec): min=19, max=58742, avg=288.00, stdev=1232.08 00:20:05.987 clat percentiles (usec): 00:20:05.987 | 50.000th=[ 131], 99.000th=[ 1467], 99.900th=[17433], 99.990th=[29230], 00:20:05.987 | 99.999th=[47973] 00:20:05.987 bw ( KiB/s): min=407904, max=1179264, per=98.86%, avg=749705.05, stdev=15043.53, samples=304 00:20:05.987 iops : min=101975, max=294816, avg=187426.37, stdev=3760.89, samples=304 00:20:05.987 lat (usec) : 4=0.01%, 10=0.01%, 20=0.54%, 50=11.27%, 100=32.84% 00:20:05.987 lat (usec) : 250=47.42%, 500=4.15%, 750=2.38%, 1000=0.36% 00:20:05.987 lat (msec) : 2=0.30%, 4=0.13%, 10=0.20%, 20=0.33%, 50=0.05% 00:20:05.987 lat (msec) : 100=0.01% 00:20:05.987 cpu : usr=54.15%, sys=1.95%, ctx=253364, majf=1, minf=140245 00:20:05.987 IO depths : 1=11.7%, 2=23.9%, 4=51.2%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:20:05.987 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:20:05.987 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:20:05.987 issued rwts: total=1171389,1876769,0,0 short=0,0,0,0 dropped=0,0,0,0 00:20:05.987 latency : target=0, window=0, percentile=100.00%, depth=8 00:20:05.987 00:20:05.987 Run status group 0 (all jobs): 00:20:05.987 READ: bw=458MiB/s (480MB/s), 458MiB/s-458MiB/s (480MB/s-480MB/s), io=4576MiB (4798MB), run=10001-10001msec 00:20:05.987 WRITE: bw=741MiB/s (777MB/s), 741MiB/s-741MiB/s (777MB/s-777MB/s), io=7331MiB (7687MB), run=9899-9899msec 00:20:05.987 ----------------------------------------------------- 00:20:05.987 Suppressions used: 00:20:05.987 count bytes template 00:20:05.987 16 140 /usr/src/fio/parse.c 00:20:05.987 10496 1007616 /usr/src/fio/iolog.c 00:20:05.987 1 8 libtcmalloc_minimal.so 00:20:05.987 1 904 libcrypto.so 00:20:05.987 ----------------------------------------------------- 00:20:05.987 00:20:05.987 00:20:05.987 real 0m12.146s 00:20:05.987 user 1m29.191s 00:20:05.987 sys 0m3.881s 00:20:05.987 12:22:29 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:05.987 12:22:29 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:20:05.987 ************************************ 00:20:05.987 END TEST bdev_fio_rw_verify 00:20:05.987 ************************************ 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio trim '' '' 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio ']' 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:20:05.987 
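[Editor's note] Before launching fio, the suite resolves the ASan runtime that the fio plugin links against and preloads it ahead of the plugin, as traced above (ldd | grep libasan | awk '{print $3}', then LD_PRELOAD='/usr/lib64/libasan.so.6 .../spdk_bdev'). A simplified sketch of that invocation, with paths and parameters taken from the trace (some flags such as --spdk_mem and --aux-path are omitted here for brevity):

    plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev
    # Empty when the plugin was not built with ASan; then nothing extra is preloaded
    asan_lib=$(ldd "$plugin" | grep libasan | awk '{print $3}')
    LD_PRELOAD="$asan_lib $plugin" /usr/src/fio/fio \
        --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
        --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json \
        /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio

Preloading the sanitizer runtime first lets the externally built fio binary load the instrumented SPDK plugin without unresolved ASan symbols.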
12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:20:05.987 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:20:05.989 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "8870d154-b511-49a7-8ce4-1a070c92cb78"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8870d154-b511-49a7-8ce4-1a070c92cb78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "fb45ed9c-ed63-568e-9de8-6155a71d1613"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fb45ed9c-ed63-568e-9de8-6155a71d1613",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "41c36e4b-0ded-591b-ad83-3d6df881af80"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "41c36e4b-0ded-591b-ad83-3d6df881af80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "92452a2d-7bd5-5a80-8e19-c28aaab2dec1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "92452a2d-7bd5-5a80-8e19-c28aaab2dec1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "79f914be-d78e-5cb9-ac07-fe59e8380c19"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "79f914be-d78e-5cb9-ac07-fe59e8380c19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "dc6cb53c-8bd1-5ef3-9778-df6fb0de0eaf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "dc6cb53c-8bd1-5ef3-9778-df6fb0de0eaf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "30fe63f1-5a9c-53a0-a03d-24059c68bd46"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "30fe63f1-5a9c-53a0-a03d-24059c68bd46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "cd0c2bc9-f7e0-538c-ac59-140213b1df3d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd0c2bc9-f7e0-538c-ac59-140213b1df3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": 
"Malloc2p5",' ' "aliases": [' ' "fbde6699-3387-57a1-8cf1-f2c76483ee8f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fbde6699-3387-57a1-8cf1-f2c76483ee8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0001e5af-702f-50ef-913f-44fc6800564d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0001e5af-702f-50ef-913f-44fc6800564d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f5f985fd-2918-531b-b7a7-a4c2c34d68d7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f5f985fd-2918-531b-b7a7-a4c2c34d68d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "9619a91c-96da-588b-ba65-a9a0b2d34237"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9619a91c-96da-588b-ba65-a9a0b2d34237",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a39b31d4-4690-404f-9e94-a3a65ba36f8b"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a39b31d4-4690-404f-9e94-a3a65ba36f8b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a39b31d4-4690-404f-9e94-a3a65ba36f8b",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bba47e96-dd4f-482f-8506-859bcdca1024",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "29274068-0448-4bb7-a2a8-4a7a8334fb1a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "d48d0e70-52b6-440a-8bfe-30722891cd57",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "4f94c834-5baa-4986-a84a-bbf79a0bc81b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "38cf9e76-d891-45e4-b24c-9d166a6e78cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "38cf9e76-d891-45e4-b24c-9d166a6e78cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "38cf9e76-d891-45e4-b24c-9d166a6e78cc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "0d283b8b-42b0-4695-bf9d-0bca7dbe502d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "cb8c62a6-46a0-4499-a34a-16788603971d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "0f44ecf5-f302-4835-9d83-9da4e0a11df0"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "0f44ecf5-f302-4835-9d83-9da4e0a11df0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/home/vagrant/spdk_repo/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:20:05.989 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:20:05.989 Malloc1p0 00:20:05.989 Malloc1p1 00:20:05.989 Malloc2p0 00:20:05.989 Malloc2p1 00:20:05.989 Malloc2p2 00:20:05.989 Malloc2p3 00:20:05.989 Malloc2p4 00:20:05.989 Malloc2p5 00:20:05.989 Malloc2p6 00:20:05.989 Malloc2p7 00:20:05.989 TestPT 00:20:05.989 raid0 00:20:05.989 concat0 ]] 00:20:05.989 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "8870d154-b511-49a7-8ce4-1a070c92cb78"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8870d154-b511-49a7-8ce4-1a070c92cb78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "fb45ed9c-ed63-568e-9de8-6155a71d1613"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fb45ed9c-ed63-568e-9de8-6155a71d1613",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "41c36e4b-0ded-591b-ad83-3d6df881af80"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "41c36e4b-0ded-591b-ad83-3d6df881af80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "92452a2d-7bd5-5a80-8e19-c28aaab2dec1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "92452a2d-7bd5-5a80-8e19-c28aaab2dec1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "79f914be-d78e-5cb9-ac07-fe59e8380c19"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "79f914be-d78e-5cb9-ac07-fe59e8380c19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "dc6cb53c-8bd1-5ef3-9778-df6fb0de0eaf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "dc6cb53c-8bd1-5ef3-9778-df6fb0de0eaf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": 
"Malloc2p3",' ' "aliases": [' ' "30fe63f1-5a9c-53a0-a03d-24059c68bd46"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "30fe63f1-5a9c-53a0-a03d-24059c68bd46",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "cd0c2bc9-f7e0-538c-ac59-140213b1df3d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd0c2bc9-f7e0-538c-ac59-140213b1df3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "fbde6699-3387-57a1-8cf1-f2c76483ee8f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fbde6699-3387-57a1-8cf1-f2c76483ee8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "0001e5af-702f-50ef-913f-44fc6800564d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0001e5af-702f-50ef-913f-44fc6800564d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f5f985fd-2918-531b-b7a7-a4c2c34d68d7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f5f985fd-2918-531b-b7a7-a4c2c34d68d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' 
"flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "9619a91c-96da-588b-ba65-a9a0b2d34237"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9619a91c-96da-588b-ba65-a9a0b2d34237",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "a39b31d4-4690-404f-9e94-a3a65ba36f8b"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a39b31d4-4690-404f-9e94-a3a65ba36f8b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a39b31d4-4690-404f-9e94-a3a65ba36f8b",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bba47e96-dd4f-482f-8506-859bcdca1024",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "29274068-0448-4bb7-a2a8-4a7a8334fb1a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2fc8c564-02cf-4e07-8ea2-c451ea9e3d17",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "d48d0e70-52b6-440a-8bfe-30722891cd57",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "4f94c834-5baa-4986-a84a-bbf79a0bc81b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "38cf9e76-d891-45e4-b24c-9d166a6e78cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "38cf9e76-d891-45e4-b24c-9d166a6e78cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "38cf9e76-d891-45e4-b24c-9d166a6e78cc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "0d283b8b-42b0-4695-bf9d-0bca7dbe502d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "cb8c62a6-46a0-4499-a34a-16788603971d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "0f44ecf5-f302-4835-9d83-9da4e0a11df0"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "0f44ecf5-f302-4835-9d83-9da4e0a11df0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/home/vagrant/spdk_repo/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # 
echo '[job_Malloc0]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:20:05.990 12:22:29 
blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:05.990 12:22:29 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:20:05.990 ************************************ 00:20:05.990 START TEST bdev_fio_trim 00:20:05.990 ************************************ 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1339 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib=/usr/lib64/libasan.so.6 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n /usr/lib64/libasan.so.6 ]] 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # break 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD='/usr/lib64/libasan.so.6 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:20:05.990 12:22:29 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:20:05.990 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.990 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.990 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.990 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.990 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.990 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.990 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.990 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.991 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.991 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.991 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.991 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.991 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.991 job_concat0: (g=0): rw=trimwrite, 
bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:20:05.991 fio-3.35 00:20:05.991 Starting 14 threads 00:20:18.229 00:20:18.229 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=196364: Fri Jun 7 12:22:40 2024 00:20:18.229 write: IOPS=270k, BW=1056MiB/s (1107MB/s)(10.3GiB/10001msec); 0 zone resets 00:20:18.229 slat (nsec): min=1258, max=24019k, avg=16332.74, stdev=244404.69 00:20:18.229 clat (usec): min=10, max=31066, avg=144.96, stdev=741.85 00:20:18.229 lat (usec): min=13, max=31074, avg=161.29, stdev=780.72 00:20:18.229 clat percentiles (usec): 00:20:18.229 | 50.000th=[ 86], 99.000th=[ 701], 99.900th=[13173], 99.990th=[14091], 00:20:18.229 | 99.999th=[25560] 00:20:18.229 bw ( MiB/s): min= 747, max= 1415, per=100.00%, avg=1056.45, stdev=15.55, samples=266 00:20:18.229 iops : min=191441, max=362446, avg=270451.11, stdev=3980.69, samples=266 00:20:18.229 trim: IOPS=270k, BW=1056MiB/s (1107MB/s)(10.3GiB/10001msec); 0 zone resets 00:20:18.229 slat (usec): min=2, max=28499, avg=13.04, stdev=218.87 00:20:18.229 clat (nsec): min=1994, max=28619k, avg=133965.84, stdev=704185.92 00:20:18.229 lat (usec): min=8, max=28625, avg=147.01, stdev=737.36 00:20:18.229 clat percentiles (usec): 00:20:18.229 | 50.000th=[ 94], 99.000th=[ 192], 99.900th=[13173], 99.990th=[13173], 00:20:18.229 | 99.999th=[21103] 00:20:18.229 bw ( MiB/s): min= 747, max= 1415, per=100.00%, avg=1056.45, stdev=15.55, samples=266 00:20:18.229 iops : min=191461, max=362446, avg=270452.05, stdev=3980.39, samples=266 00:20:18.229 lat (usec) : 2=0.01%, 4=0.01%, 10=0.26%, 20=0.49%, 50=6.20% 00:20:18.229 lat (usec) : 100=53.65%, 250=37.00%, 500=0.96%, 750=1.09%, 1000=0.02% 00:20:18.229 lat (msec) : 2=0.01%, 4=0.01%, 10=0.02%, 20=0.29%, 50=0.01% 00:20:18.229 cpu : usr=69.05%, sys=0.20%, ctx=172906, majf=0, minf=9277 00:20:18.229 IO depths : 1=12.2%, 2=24.5%, 4=50.1%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:20:18.229 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:20:18.229 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:20:18.229 issued rwts: total=0,2703799,2703804,0 short=0,0,0,0 dropped=0,0,0,0 00:20:18.229 latency : target=0, window=0, percentile=100.00%, depth=8 00:20:18.229 00:20:18.229 Run status group 0 (all jobs): 00:20:18.229 WRITE: bw=1056MiB/s (1107MB/s), 1056MiB/s-1056MiB/s (1107MB/s-1107MB/s), io=10.3GiB (11.1GB), run=10001-10001msec 00:20:18.229 TRIM: bw=1056MiB/s (1107MB/s), 1056MiB/s-1056MiB/s (1107MB/s-1107MB/s), io=10.3GiB (11.1GB), run=10001-10001msec 00:20:18.229 ----------------------------------------------------- 00:20:18.229 Suppressions used: 00:20:18.229 count bytes template 00:20:18.229 14 129 /usr/src/fio/parse.c 00:20:18.229 1 8 libtcmalloc_minimal.so 00:20:18.229 1 904 libcrypto.so 00:20:18.229 ----------------------------------------------------- 00:20:18.229 00:20:18.229 00:20:18.229 real 0m11.744s 00:20:18.229 user 1m38.922s 00:20:18.229 sys 0m0.945s 00:20:18.229 12:22:41 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:18.229 12:22:41 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:20:18.229 ************************************ 00:20:18.229 END TEST bdev_fio_trim 00:20:18.229 ************************************ 00:20:18.229 12:22:41 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:20:18.229 12:22:41 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 
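For readers reconstructing the trim run above: the [job_*] sections fed to fio are generated by iterating over every bdev whose JSON description reports unmap support, exactly as the blockdev.sh@356-358 trace shows. A minimal standalone sketch of that loop, assuming a bdevs array holding one JSON document per bdev like the dump printed earlier (the output path here is illustrative; the real run appended to test/bdev/bdev.fio):

    # emit one fio job section per unmap-capable bdev
    for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name'); do
        echo "[job_${b}]"     # section header, e.g. [job_Malloc0]
        echo "filename=${b}"  # bind the job to the bdev by name
    done >> /tmp/bdev.fio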
00:20:18.229 12:22:41 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:20:18.229 /home/vagrant/spdk_repo/spdk 00:20:18.229 12:22:41 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:20:18.229 00:20:18.229 real 0m24.316s 00:20:18.229 user 3m8.280s 00:20:18.229 sys 0m5.010s 00:20:18.229 12:22:41 blockdev_general.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:18.229 12:22:41 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:20:18.229 ************************************ 00:20:18.229 END TEST bdev_fio 00:20:18.229 ************************************ 00:20:18.229 12:22:41 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:20:18.229 12:22:41 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:20:18.229 12:22:41 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:20:18.229 12:22:41 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:18.229 12:22:41 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:20:18.229 ************************************ 00:20:18.229 START TEST bdev_verify 00:20:18.229 ************************************ 00:20:18.229 12:22:41 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:20:18.229 [2024-06-07 12:22:41.346172] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:20:18.229 [2024-06-07 12:22:41.346608] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196531 ] 00:20:18.229 [2024-06-07 12:22:41.491285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:20:18.229 [2024-06-07 12:22:41.586067] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:20:18.229 [2024-06-07 12:22:41.586069] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:18.229 [2024-06-07 12:22:41.785642] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:20:18.229 [2024-06-07 12:22:41.786042] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:20:18.229 [2024-06-07 12:22:41.793604] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:20:18.229 [2024-06-07 12:22:41.793828] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:20:18.229 [2024-06-07 12:22:41.801640] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:20:18.229 [2024-06-07 12:22:41.801838] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:20:18.229 [2024-06-07 12:22:41.801970] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:20:18.487 [2024-06-07 12:22:41.906097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:20:18.487 [2024-06-07 12:22:41.906469] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:18.487 [2024-06-07 12:22:41.906572] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:20:18.487 [2024-06-07 12:22:41.906787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:18.487 [2024-06-07 12:22:41.909386] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:18.487 [2024-06-07 12:22:41.909552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:20:18.745 Running I/O for 5 seconds... 00:20:24.045 00:20:24.045 Latency(us) 00:20:24.045 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:24.045 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x1000 00:20:24.045 Malloc0 : 5.07 2872.40 11.22 0.00 0.00 44512.60 315.98 152792.50 00:20:24.045 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x1000 length 0x1000 00:20:24.045 Malloc0 : 5.05 2697.91 10.54 0.00 0.00 47400.20 165.79 357514.48 00:20:24.045 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x800 00:20:24.045 Malloc1p0 : 5.07 1488.80 5.82 0.00 0.00 85791.54 1232.70 84884.72 00:20:24.045 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x800 length 0x800 00:20:24.045 Malloc1p0 : 5.05 1520.58 5.94 0.00 0.00 84012.92 1256.11 84884.72 00:20:24.045 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x800 00:20:24.045 Malloc1p1 : 5.07 1488.54 5.81 0.00 0.00 85719.85 1232.70 83886.08 00:20:24.045 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x800 length 0x800 00:20:24.045 Malloc1p1 : 5.05 1520.31 5.94 0.00 0.00 83938.42 1256.11 83386.76 00:20:24.045 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x200 00:20:24.045 Malloc2p0 : 5.07 1488.30 5.81 0.00 0.00 85646.78 1404.34 82388.11 00:20:24.045 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x200 length 0x200 00:20:24.045 Malloc2p0 : 5.05 1520.05 5.94 0.00 0.00 83864.07 1412.14 81888.79 00:20:24.045 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x200 00:20:24.045 Malloc2p1 : 5.08 1488.07 5.81 0.00 0.00 85572.30 1341.93 81389.47 00:20:24.045 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x200 length 0x200 00:20:24.045 Malloc2p1 : 5.05 1519.79 5.94 0.00 0.00 83795.62 1373.14 80390.83 00:20:24.045 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x200 00:20:24.045 Malloc2p2 : 5.08 1487.85 5.81 0.00 0.00 85498.33 1334.13 79891.50 00:20:24.045 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x200 length 0x200 00:20:24.045 Malloc2p2 : 5.05 1519.51 5.94 0.00 0.00 83723.23 1341.93 78393.54 00:20:24.045 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x200 00:20:24.045 Malloc2p3 : 5.08 1487.61 5.81 
0.00 0.00 85423.07 1326.32 78892.86 00:20:24.045 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x200 length 0x200 00:20:24.045 Malloc2p3 : 5.06 1519.25 5.93 0.00 0.00 83648.73 1318.52 77894.22 00:20:24.045 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x200 00:20:24.045 Malloc2p4 : 5.08 1487.38 5.81 0.00 0.00 85352.14 1396.54 77894.22 00:20:24.045 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x200 length 0x200 00:20:24.045 Malloc2p4 : 5.06 1518.97 5.93 0.00 0.00 83581.17 1388.74 76396.25 00:20:24.045 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.045 Verification LBA range: start 0x0 length 0x200 00:20:24.045 Malloc2p5 : 5.08 1487.15 5.81 0.00 0.00 85275.16 1295.12 76895.57 00:20:24.046 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x200 length 0x200 00:20:24.046 Malloc2p5 : 5.09 1533.85 5.99 0.00 0.00 82686.81 1318.52 74898.29 00:20:24.046 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x0 length 0x200 00:20:24.046 Malloc2p6 : 5.08 1486.93 5.81 0.00 0.00 85190.82 1295.12 75896.93 00:20:24.046 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x200 length 0x200 00:20:24.046 Malloc2p6 : 5.09 1533.61 5.99 0.00 0.00 82607.67 1318.52 73899.64 00:20:24.046 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x0 length 0x200 00:20:24.046 Malloc2p7 : 5.08 1486.71 5.81 0.00 0.00 85110.81 1263.91 74398.96 00:20:24.046 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x200 length 0x200 00:20:24.046 Malloc2p7 : 5.09 1533.38 5.99 0.00 0.00 82527.19 1279.51 72901.00 00:20:24.046 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x0 length 0x1000 00:20:24.046 TestPT : 5.09 1482.46 5.79 0.00 0.00 85220.17 7614.66 74398.96 00:20:24.046 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x1000 length 0x1000 00:20:24.046 TestPT : 5.09 1508.08 5.89 0.00 0.00 83781.17 6491.18 102860.31 00:20:24.046 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x0 length 0x2000 00:20:24.046 raid0 : 5.10 1505.90 5.88 0.00 0.00 83783.66 1349.73 64911.85 00:20:24.046 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x2000 length 0x2000 00:20:24.046 raid0 : 5.09 1532.98 5.99 0.00 0.00 82308.87 1404.34 62415.24 00:20:24.046 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x0 length 0x2000 00:20:24.046 concat0 : 5.10 1505.72 5.88 0.00 0.00 83698.53 1240.50 64412.53 00:20:24.046 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x2000 length 0x2000 00:20:24.046 concat0 : 5.09 1532.69 5.99 0.00 0.00 82232.76 1271.71 64911.85 00:20:24.046 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 
Verification LBA range: start 0x0 length 0x1000 00:20:24.046 raid1 : 5.10 1505.54 5.88 0.00 0.00 83611.41 1607.19 66909.14 00:20:24.046 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x1000 length 0x1000 00:20:24.046 raid1 : 5.10 1532.39 5.99 0.00 0.00 82145.20 1685.21 67408.46 00:20:24.046 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x0 length 0x4e2 00:20:24.046 AIO0 : 5.10 1505.22 5.88 0.00 0.00 83519.74 1224.90 71403.03 00:20:24.046 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:20:24.046 Verification LBA range: start 0x4e2 length 0x4e2 00:20:24.046 AIO0 : 5.10 1532.13 5.98 0.00 0.00 82042.26 448.61 71403.03 00:20:24.046 =================================================================================================================== 00:20:24.046 Total : 50830.05 198.55 0.00 0.00 79864.40 165.79 357514.48 00:20:24.306 00:20:24.306 real 0m6.580s 00:20:24.306 user 0m11.676s 00:20:24.306 sys 0m0.729s 00:20:24.306 12:22:47 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:24.306 12:22:47 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:20:24.306 ************************************ 00:20:24.306 END TEST bdev_verify 00:20:24.306 ************************************ 00:20:24.306 12:22:47 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:20:24.306 12:22:47 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:20:24.306 12:22:47 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:24.306 12:22:47 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:20:24.565 ************************************ 00:20:24.565 START TEST bdev_verify_big_io 00:20:24.565 ************************************ 00:20:24.565 12:22:47 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:20:24.565 [2024-06-07 12:22:48.000452] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
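The bdev_verify pass that just ended is a single bdevperf invocation against the generated bdev config. A hedged re-creation of the launch follows; the flag readings are inferred from the traced command line, and the wrapper also passed -C plus an empty positional argument, omitted here:

    # reproduce the 5-second verify workload by hand
    args=(
        --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json
        -q 128     # outstanding I/Os per job
        -o 4096    # 4 KiB I/O size
        -w verify  # write, then read back and compare
        -t 5       # run time in seconds
        -m 0x3     # two reactor cores, matching "Total cores available: 2"
    )
    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf "${args[@]}"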
00:20:24.565 [2024-06-07 12:22:48.000981] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196637 ] 00:20:24.565 [2024-06-07 12:22:48.157980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:20:24.824 [2024-06-07 12:22:48.277503] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:20:24.824 [2024-06-07 12:22:48.277509] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:24.824 [2024-06-07 12:22:48.464150] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:20:24.824 [2024-06-07 12:22:48.464535] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:20:25.082 [2024-06-07 12:22:48.472101] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:20:25.082 [2024-06-07 12:22:48.472280] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:20:25.082 [2024-06-07 12:22:48.480160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:20:25.082 [2024-06-07 12:22:48.480370] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:20:25.082 [2024-06-07 12:22:48.480554] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:20:25.082 [2024-06-07 12:22:48.588045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:20:25.082 [2024-06-07 12:22:48.588422] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:25.082 [2024-06-07 12:22:48.588516] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:20:25.082 [2024-06-07 12:22:48.588640] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:25.082 [2024-06-07 12:22:48.591283] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:25.082 [2024-06-07 12:22:48.591440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:20:25.341 [2024-06-07 12:22:48.790830] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.792195] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.794251] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.796327] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.797497] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.799511] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.800731] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.802768] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.803999] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.806109] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.807329] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.809377] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.810609] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.812651] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.814770] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.815980] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:20:25.342 [2024-06-07 12:22:48.851502] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:20:25.342 [2024-06-07 12:22:48.854657] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:20:25.342 Running I/O for 5 seconds... 00:20:31.906 00:20:31.906 Latency(us) 00:20:31.906 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:31.906 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x100 00:20:31.906 Malloc0 : 5.26 462.05 28.88 0.00 0.00 274089.43 319.88 926741.46 00:20:31.906 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x100 length 0x100 00:20:31.906 Malloc0 : 5.26 486.51 30.41 0.00 0.00 260190.26 267.22 1054567.86 00:20:31.906 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x80 00:20:31.906 Malloc1p0 : 5.48 231.41 14.46 0.00 0.00 525368.46 1771.03 1086524.46 00:20:31.906 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x80 length 0x80 00:20:31.906 Malloc1p0 : 5.66 82.03 5.13 0.00 0.00 1483913.63 787.99 2109135.73 00:20:31.906 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x80 00:20:31.906 Malloc1p1 : 5.60 80.00 5.00 0.00 0.00 1516413.71 784.09 2268918.74 00:20:31.906 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x80 length 0x80 00:20:31.906 Malloc1p1 : 5.69 84.32 5.27 0.00 0.00 1427582.22 748.98 2037233.37 00:20:31.906 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x20 00:20:31.906 Malloc2p0 : 5.48 64.21 4.01 0.00 0.00 470347.99 462.26 794920.47 00:20:31.906 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x20 length 0x20 00:20:31.906 Malloc2p0 : 5.48 70.07 4.38 0.00 0.00 430369.51 493.47 679077.79 00:20:31.906 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x20 00:20:31.906 Malloc2p1 : 5.48 64.21 4.01 0.00 0.00 468828.22 481.77 782936.75 00:20:31.906 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x20 length 0x20 00:20:31.906 Malloc2p1 : 5.48 70.06 4.38 0.00 0.00 428858.07 569.54 671088.64 00:20:31.906 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x20 00:20:31.906 Malloc2p2 : 5.48 64.20 4.01 0.00 0.00 466986.13 473.97 770953.02 00:20:31.906 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x20 length 0x20 00:20:31.906 Malloc2p2 : 5.53 72.38 4.52 0.00 0.00 415532.21 481.77 659104.91 00:20:31.906 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 
32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x20 00:20:31.906 Malloc2p3 : 5.53 66.57 4.16 0.00 0.00 450923.51 460.31 762963.87 00:20:31.906 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x20 length 0x20 00:20:31.906 Malloc2p3 : 5.53 72.37 4.52 0.00 0.00 414029.92 497.37 647121.19 00:20:31.906 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:20:31.906 Verification LBA range: start 0x0 length 0x20 00:20:31.907 Malloc2p4 : 5.53 66.56 4.16 0.00 0.00 449193.12 475.92 750980.14 00:20:31.907 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x20 length 0x20 00:20:31.907 Malloc2p4 : 5.53 72.36 4.52 0.00 0.00 412593.97 477.87 639132.04 00:20:31.907 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x20 00:20:31.907 Malloc2p5 : 5.53 66.56 4.16 0.00 0.00 447688.79 487.62 742990.99 00:20:31.907 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x20 length 0x20 00:20:31.907 Malloc2p5 : 5.53 72.35 4.52 0.00 0.00 411025.28 514.93 627148.31 00:20:31.907 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x20 00:20:31.907 Malloc2p6 : 5.53 66.55 4.16 0.00 0.00 445984.12 481.77 731007.27 00:20:31.907 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x20 length 0x20 00:20:31.907 Malloc2p6 : 5.53 72.34 4.52 0.00 0.00 409373.44 485.67 619159.16 00:20:31.907 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x20 00:20:31.907 Malloc2p7 : 5.53 66.54 4.16 0.00 0.00 444374.53 511.02 723018.12 00:20:31.907 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x20 length 0x20 00:20:31.907 Malloc2p7 : 5.53 72.34 4.52 0.00 0.00 407852.76 472.02 607175.44 00:20:31.907 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x100 00:20:31.907 TestPT : 5.73 78.91 4.93 0.00 0.00 1464279.71 45188.63 1949352.72 00:20:31.907 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x100 length 0x100 00:20:31.907 TestPT : 5.64 82.33 5.15 0.00 0.00 1408880.34 44189.99 1797558.86 00:20:31.907 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x200 00:20:31.907 raid0 : 5.66 84.77 5.30 0.00 0.00 1353291.57 838.70 2037233.37 00:20:31.907 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x200 length 0x200 00:20:31.907 raid0 : 5.66 93.29 5.83 0.00 0.00 1230598.14 912.82 1805548.01 00:20:31.907 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x200 00:20:31.907 concat0 : 5.66 90.39 5.65 0.00 0.00 1257582.02 869.91 1965331.02 00:20:31.907 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x200 length 0x200 00:20:31.907 concat0 : 5.76 97.28 6.08 0.00 0.00 1159129.46 850.41 1725656.50 
00:20:31.907 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x100 00:20:31.907 raid1 : 5.73 100.51 6.28 0.00 0.00 1117347.91 1022.05 1893428.66 00:20:31.907 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x100 length 0x100 00:20:31.907 raid1 : 5.76 113.93 7.12 0.00 0.00 983282.43 1068.86 1653754.15 00:20:31.907 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x0 length 0x4e 00:20:31.907 AIO0 : 5.75 116.47 7.28 0.00 0.00 581513.80 994.74 1134459.37 00:20:31.907 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:20:31.907 Verification LBA range: start 0x4e length 0x4e 00:20:31.907 AIO0 : 5.77 118.19 7.39 0.00 0.00 570596.89 979.14 938725.18 00:20:31.907 =================================================================================================================== 00:20:31.907 Total : 3502.06 218.88 0.00 0.00 656300.82 267.22 2268918.74 00:20:31.907 00:20:31.907 real 0m7.363s 00:20:31.907 user 0m13.366s 00:20:31.907 sys 0m0.603s 00:20:31.907 12:22:55 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:31.907 12:22:55 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:20:31.907 ************************************ 00:20:31.907 END TEST bdev_verify_big_io 00:20:31.907 ************************************ 00:20:31.907 12:22:55 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:20:31.907 12:22:55 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:20:31.907 12:22:55 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:31.907 12:22:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:20:31.907 ************************************ 00:20:31.907 START TEST bdev_write_zeroes 00:20:31.907 ************************************ 00:20:31.907 12:22:55 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:20:31.907 [2024-06-07 12:22:55.419188] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
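The big-I/O variant that just completed uses the same harness with a 64 KiB block size, which is why bdevperf printed the queue-depth warnings earlier in the run: with -q 128 and -o 65536, some bdevs cannot accept 128 outstanding requests at once, so the effective depth is clamped (to 32 for the Malloc2p* slices, 78 for AIO0). The invocation delta, sketched:

    # same verify harness, but 64 KiB I/Os; bdevperf lowers the per-bdev
    # queue depth where 128 outstanding 64 KiB requests would not fit
    # (see the WARNING lines earlier in this run)
    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
        --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json \
        -q 128 -o 65536 -w verify -t 5 -m 0x3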
00:20:31.907 [2024-06-07 12:22:55.419601] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196749 ] 00:20:32.166 [2024-06-07 12:22:55.559886] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:32.166 [2024-06-07 12:22:55.653948] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:32.425 [2024-06-07 12:22:55.836345] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:20:32.425 [2024-06-07 12:22:55.836711] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:20:32.425 [2024-06-07 12:22:55.844295] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:20:32.425 [2024-06-07 12:22:55.844438] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:20:32.425 [2024-06-07 12:22:55.852318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:20:32.425 [2024-06-07 12:22:55.852459] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:20:32.425 [2024-06-07 12:22:55.852568] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:20:32.425 [2024-06-07 12:22:55.957616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:20:32.425 [2024-06-07 12:22:55.957979] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:32.425 [2024-06-07 12:22:55.958062] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:20:32.425 [2024-06-07 12:22:55.958312] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:32.425 [2024-06-07 12:22:55.960793] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:32.425 [2024-06-07 12:22:55.960955] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:20:32.685 Running I/O for 1 seconds... 
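TestPT, which appears throughout these job lists, is a passthru vbdev stacked on Malloc3; that is what the "Match on Malloc3" and "created pt_bdev for: TestPT" notices above record. A sketch of building the same stack over RPC, assuming a running target and the standard rpc.py location (the malloc sizes are illustrative, not taken from this log):

    RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    $RPC bdev_malloc_create -b Malloc3 128 512       # base bdev: 128 MiB, 512 B blocks
    $RPC bdev_passthru_create -b Malloc3 -p TestPT   # passthru vbdev on top of Malloc3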
00:20:33.622 00:20:33.622 Latency(us) 00:20:33.622 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:33.622 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc0 : 1.01 11658.87 45.54 0.00 0.00 10975.03 399.85 19099.06 00:20:33.622 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc1p0 : 1.02 11670.39 45.59 0.00 0.00 10950.58 493.47 18724.57 00:20:33.622 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc1p1 : 1.02 11667.25 45.58 0.00 0.00 10944.88 485.67 18350.08 00:20:33.622 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc2p0 : 1.02 11664.57 45.56 0.00 0.00 10935.45 507.12 17850.76 00:20:33.622 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc2p1 : 1.02 11661.72 45.55 0.00 0.00 10926.07 493.47 17351.44 00:20:33.622 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc2p2 : 1.02 11658.92 45.54 0.00 0.00 10920.33 487.62 16976.94 00:20:33.622 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc2p3 : 1.02 11656.30 45.53 0.00 0.00 10911.62 489.57 16727.28 00:20:33.622 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc2p4 : 1.02 11653.71 45.52 0.00 0.00 10901.01 495.42 16227.96 00:20:33.622 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc2p5 : 1.02 11650.90 45.51 0.00 0.00 10895.22 507.12 15853.47 00:20:33.622 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.622 Malloc2p6 : 1.02 11648.25 45.50 0.00 0.00 10884.37 499.32 15478.98 00:20:33.623 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.623 Malloc2p7 : 1.02 11645.62 45.49 0.00 0.00 10875.13 538.33 14979.66 00:20:33.623 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.623 TestPT : 1.02 11642.93 45.48 0.00 0.00 10865.99 511.02 14667.58 00:20:33.623 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.623 raid0 : 1.02 11638.78 45.46 0.00 0.00 10857.10 600.75 14230.67 00:20:33.623 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.623 concat0 : 1.02 11635.09 45.45 0.00 0.00 10847.15 581.24 13668.94 00:20:33.623 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.623 raid1 : 1.02 11630.03 45.43 0.00 0.00 10835.61 979.14 12795.12 00:20:33.623 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:20:33.623 AIO0 : 1.02 11605.17 45.33 0.00 0.00 10834.55 975.24 12483.05 00:20:33.623 =================================================================================================================== 00:20:33.623 Total : 186388.50 728.08 0.00 0.00 10897.46 399.85 19099.06 00:20:34.190 00:20:34.190 real 0m2.431s 00:20:34.190 user 0m1.753s 00:20:34.190 sys 0m0.488s 00:20:34.190 12:22:57 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:34.190 12:22:57 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:20:34.190 ************************************ 00:20:34.190 END TEST bdev_write_zeroes 00:20:34.190 ************************************ 00:20:34.449 12:22:57 
blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:20:34.449 12:22:57 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:20:34.449 12:22:57 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:34.449 12:22:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:20:34.449 ************************************ 00:20:34.449 START TEST bdev_json_nonenclosed 00:20:34.449 ************************************ 00:20:34.449 12:22:57 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:20:34.449 [2024-06-07 12:22:57.928467] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:20:34.449 [2024-06-07 12:22:57.928930] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196798 ] 00:20:34.449 [2024-06-07 12:22:58.063064] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:34.708 [2024-06-07 12:22:58.150585] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:34.708 [2024-06-07 12:22:58.150950] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:20:34.708 [2024-06-07 12:22:58.151086] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:20:34.708 [2024-06-07 12:22:58.151221] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:20:34.708 00:20:34.708 real 0m0.411s 00:20:34.708 user 0m0.199s 00:20:34.708 sys 0m0.109s 00:20:34.708 12:22:58 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:34.708 12:22:58 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:20:34.708 ************************************ 00:20:34.708 END TEST bdev_json_nonenclosed 00:20:34.708 ************************************ 00:20:34.966 12:22:58 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:20:34.966 12:22:58 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:20:34.966 12:22:58 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:34.966 12:22:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:20:34.966 ************************************ 00:20:34.966 START TEST bdev_json_nonarray 00:20:34.966 ************************************ 00:20:34.966 12:22:58 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:20:34.967 [2024-06-07 12:22:58.414175] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
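bdev_json_nonenclosed, which just passed above, is a negative test: bdevperf is handed a config whose top level is not a JSON object and must fail with the "not enclosed in {}" error recorded in the log. The file's contents are not shown here, so this payload is an illustrative guess at the shape being exercised:

    # top-level array instead of an object: valid JSON, invalid SPDK config
    printf '[ { "subsystems": [] } ]\n' > /tmp/nonenclosed.json
    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
        --json /tmp/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 \
        || echo "failed as expected"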
00:20:34.967 [2024-06-07 12:22:58.414721] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196829 ] 00:20:34.967 [2024-06-07 12:22:58.564762] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:35.225 [2024-06-07 12:22:58.653959] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:35.225 [2024-06-07 12:22:58.654362] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:20:35.225 [2024-06-07 12:22:58.654511] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:20:35.225 [2024-06-07 12:22:58.654641] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:20:35.225 00:20:35.225 real 0m0.432s 00:20:35.225 user 0m0.205s 00:20:35.225 sys 0m0.124s 00:20:35.225 12:22:58 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:35.225 12:22:58 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:20:35.225 ************************************ 00:20:35.225 END TEST bdev_json_nonarray 00:20:35.225 ************************************ 00:20:35.225 12:22:58 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:20:35.225 12:22:58 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:20:35.225 12:22:58 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:20:35.225 12:22:58 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:35.225 12:22:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:20:35.483 ************************************ 00:20:35.483 START TEST bdev_qos 00:20:35.483 ************************************ 00:20:35.483 12:22:58 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # qos_test_suite '' 00:20:35.483 12:22:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=196858 00:20:35.483 12:22:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:20:35.483 12:22:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 196858' 00:20:35.483 Process qos testing pid: 196858 00:20:35.483 12:22:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:20:35.483 12:22:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 196858 00:20:35.484 12:22:58 blockdev_general.bdev_qos -- common/autotest_common.sh@830 -- # '[' -z 196858 ']' 00:20:35.484 12:22:58 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:20:35.484 12:22:58 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:35.484 12:22:58 blockdev_general.bdev_qos -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:20:35.484 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
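Unlike the earlier one-shot runs, the QoS test launches bdevperf with -z so it idles until told to start, then drives it over the RPC socket; the "Waiting for process to start up..." message above is the harness polling for that socket. The pattern, minimally (the socket path is the default printed above; the until-loop is a crude stand-in for the real waitforlisten helper):

    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
        -z -m 0x2 -q 256 -o 4096 -w randread -t 60 &
    QOS_PID=$!
    until [ -S /var/tmp/spdk.sock ]; do sleep 0.1; done  # wait for the RPC socket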
00:20:35.484 12:22:58 blockdev_general.bdev_qos -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:35.484 12:22:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:35.484 [2024-06-07 12:22:58.929776] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:20:35.484 [2024-06-07 12:22:58.930834] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196858 ] 00:20:35.484 [2024-06-07 12:22:59.086377] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:35.742 [2024-06-07 12:22:59.175392] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@863 -- # return 0 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:36.347 Malloc_0 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_0 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:36.347 [ 00:20:36.347 { 00:20:36.347 "name": "Malloc_0", 00:20:36.347 "aliases": [ 00:20:36.347 "a6fbe035-0e1d-423a-ab5c-9159e326cb56" 00:20:36.347 ], 00:20:36.347 "product_name": "Malloc disk", 00:20:36.347 "block_size": 512, 00:20:36.347 "num_blocks": 262144, 00:20:36.347 "uuid": "a6fbe035-0e1d-423a-ab5c-9159e326cb56", 00:20:36.347 "assigned_rate_limits": { 00:20:36.347 "rw_ios_per_sec": 0, 00:20:36.347 "rw_mbytes_per_sec": 0, 00:20:36.347 "r_mbytes_per_sec": 0, 00:20:36.347 "w_mbytes_per_sec": 0 00:20:36.347 }, 00:20:36.347 "claimed": false, 00:20:36.347 "zoned": false, 00:20:36.347 "supported_io_types": { 00:20:36.347 "read": true, 00:20:36.347 "write": true, 00:20:36.347 "unmap": true, 00:20:36.347 "write_zeroes": true, 00:20:36.347 "flush": true, 
00:20:36.347 "reset": true, 00:20:36.347 "compare": false, 00:20:36.347 "compare_and_write": false, 00:20:36.347 "abort": true, 00:20:36.347 "nvme_admin": false, 00:20:36.347 "nvme_io": false 00:20:36.347 }, 00:20:36.347 "memory_domains": [ 00:20:36.347 { 00:20:36.347 "dma_device_id": "system", 00:20:36.347 "dma_device_type": 1 00:20:36.347 }, 00:20:36.347 { 00:20:36.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.347 "dma_device_type": 2 00:20:36.347 } 00:20:36.347 ], 00:20:36.347 "driver_specific": {} 00:20:36.347 } 00:20:36.347 ] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:36.347 Null_1 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Null_1 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:36.347 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:36.347 [ 00:20:36.347 { 00:20:36.347 "name": "Null_1", 00:20:36.347 "aliases": [ 00:20:36.347 "e7f05f19-c579-4d8a-996d-1a83c13fafd5" 00:20:36.347 ], 00:20:36.347 "product_name": "Null disk", 00:20:36.347 "block_size": 512, 00:20:36.347 "num_blocks": 262144, 00:20:36.347 "uuid": "e7f05f19-c579-4d8a-996d-1a83c13fafd5", 00:20:36.347 "assigned_rate_limits": { 00:20:36.347 "rw_ios_per_sec": 0, 00:20:36.347 "rw_mbytes_per_sec": 0, 00:20:36.347 "r_mbytes_per_sec": 0, 00:20:36.347 "w_mbytes_per_sec": 0 00:20:36.347 }, 00:20:36.347 "claimed": false, 00:20:36.347 "zoned": false, 00:20:36.347 "supported_io_types": { 00:20:36.347 "read": true, 00:20:36.347 "write": true, 00:20:36.347 "unmap": false, 00:20:36.347 "write_zeroes": true, 00:20:36.347 "flush": false, 00:20:36.347 "reset": true, 00:20:36.347 "compare": false, 00:20:36.347 "compare_and_write": false, 00:20:36.347 "abort": true, 00:20:36.347 "nvme_admin": false, 00:20:36.348 "nvme_io": false 00:20:36.348 }, 00:20:36.348 "driver_specific": {} 00:20:36.348 } 00:20:36.348 ] 
00:20:36.348 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /home/vagrant/spdk_repo/spdk/scripts/iostat.py -d -i 1 -t 5 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:20:36.348 12:22:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:20:36.606 Running I/O for 60 seconds... 00:20:41.910 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 206772.50 827090.01 0.00 0.00 835584.00 0.00 0.00 ' 00:20:41.910 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:20:41.910 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:20:41.910 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=206772.50 00:20:41.910 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 206772 00:20:41.910 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=206772 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=51000 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 51000 -gt 1000 ']' 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 51000 Malloc_0 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 51000 IOPS Malloc_0 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:41.911 12:23:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:41.911 ************************************ 00:20:41.911 START TEST bdev_qos_iops 00:20:41.911 ************************************ 00:20:41.911 12:23:05 
blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # run_qos_test 51000 IOPS Malloc_0 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=51000 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /home/vagrant/spdk_repo/spdk/scripts/iostat.py -d -i 1 -t 5 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:20:41.911 12:23:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 51004.71 204018.83 0.00 0.00 206244.00 0.00 0.00 ' 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=51004.71 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 51004 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=51004 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=45900 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=56100 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 51004 -lt 45900 ']' 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 51004 -gt 56100 ']' 00:20:47.254 00:20:47.254 real 0m5.168s 00:20:47.254 user 0m0.101s 00:20:47.254 sys 0m0.033s 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:47.254 12:23:10 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:20:47.254 ************************************ 00:20:47.254 END TEST bdev_qos_iops 00:20:47.254 ************************************ 00:20:47.254 12:23:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:20:47.254 12:23:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:20:47.254 12:23:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:20:47.254 12:23:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:20:47.254 12:23:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /home/vagrant/spdk_repo/spdk/scripts/iostat.py -d -i 1 -t 5 00:20:47.254 12:23:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:20:47.254 12:23:10 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@378 -- # tail -1 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 56469.33 225877.33 0.00 0.00 228352.00 0.00 0.00 ' 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=228352.00 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 228352 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=228352 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=22 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 22 -lt 2 ']' 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 22 Null_1 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 22 BANDWIDTH Null_1 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:52.519 12:23:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:52.519 ************************************ 00:20:52.519 START TEST bdev_qos_bw 00:20:52.519 ************************************ 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # run_qos_test 22 BANDWIDTH Null_1 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=22 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /home/vagrant/spdk_repo/spdk/scripts/iostat.py -d -i 1 -t 5 00:20:52.519 12:23:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 5630.03 22520.12 0.00 0.00 22776.00 0.00 0.00 ' 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@382 -- # awk '{print $6}' 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=22776.00 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 22776 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=22776 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=22528 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=20275 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=24780 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 22776 -lt 20275 ']' 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 22776 -gt 24780 ']' 00:20:57.797 00:20:57.797 real 0m5.214s 00:20:57.797 user 0m0.132s 00:20:57.797 sys 0m0.042s 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:20:57.797 ************************************ 00:20:57.797 END TEST bdev_qos_bw 00:20:57.797 ************************************ 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:57.797 12:23:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:20:57.797 ************************************ 00:20:57.797 START TEST bdev_qos_ro_bw 00:20:57.797 ************************************ 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:20:57.797 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:20:57.798 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /home/vagrant/spdk_repo/spdk/scripts/iostat.py -d -i 1 -t 5 00:20:57.798 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 
00:20:57.798 12:23:20 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 512.40 2049.62 0.00 0.00 2068.00 0.00 0.00 ' 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2068.00 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2068 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2068 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2068 -lt 1843 ']' 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2068 -gt 2252 ']' 00:21:03.070 00:21:03.070 real 0m5.202s 00:21:03.070 user 0m0.129s 00:21:03.070 sys 0m0.040s 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:03.070 12:23:26 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:21:03.070 ************************************ 00:21:03.070 END TEST bdev_qos_ro_bw 00:21:03.070 ************************************ 00:21:03.070 12:23:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:21:03.070 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:03.070 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:21:03.330 00:21:03.330 Latency(us) 00:21:03.330 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:03.330 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:21:03.330 Malloc_0 : 26.65 70774.28 276.46 0.00 0.00 3582.87 1100.07 503316.48 00:21:03.330 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:21:03.330 Null_1 : 26.74 64157.53 250.62 0.00 0.00 3984.66 246.74 84884.72 00:21:03.330 =================================================================================================================== 00:21:03.330 Total : 134931.81 527.08 0.00 0.00 3774.24 246.74 503316.48 00:21:03.330 0 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
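All three QoS sub-tests above (IOPS, total bandwidth, read-only bandwidth) follow the same pattern, readable straight from the trace: measure unthrottled throughput with scripts/iostat.py, derive a cap from it (206772 IOPS became a 51000 IOPS cap; 228352 KiB/s became a 22 MiB/s cap), apply the cap with bdev_set_qos_limit, measure again, and pass only if the throttled result lands within plus or minus 10% of the cap (bandwidth caps are compared in KiB/s, hence qos_limit=22528 for the 22 MiB/s case). A bash sketch of that final check, reconstructed from the limits the log prints; the variable names mirror the xtrace output, but this is a reconstruction, not the SPDK source:

    qos_limit=22528                       # cap under test, in KiB/s
    qos_result=22776                      # throttled throughput from iostat.py
    lower_limit=$((qos_limit * 9 / 10))   # 20275, matches the trace
    upper_limit=$((qos_limit * 11 / 10))  # 24780, matches the trace
    if [ "$qos_result" -lt "$lower_limit" ] || [ "$qos_result" -gt "$upper_limit" ]; then
        echo "QoS cap $qos_limit not honored (measured $qos_result)" >&2
        exit 1
    fi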
00:21:03.330 12:23:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 196858 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@949 -- # '[' -z 196858 ']' 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # kill -0 196858 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # uname 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 196858 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # echo 'killing process with pid 196858' 00:21:03.330 killing process with pid 196858 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # kill 196858 00:21:03.330 Received shutdown signal, test time was about 26.782536 seconds 00:21:03.330 00:21:03.330 Latency(us) 00:21:03.330 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:03.330 =================================================================================================================== 00:21:03.330 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:03.330 12:23:26 blockdev_general.bdev_qos -- common/autotest_common.sh@973 -- # wait 196858 00:21:03.897 12:23:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:21:03.897 00:21:03.897 real 0m28.403s 00:21:03.897 user 0m28.922s 00:21:03.897 sys 0m0.833s 00:21:03.897 12:23:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:03.897 12:23:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:21:03.897 ************************************ 00:21:03.897 END TEST bdev_qos 00:21:03.897 ************************************ 00:21:03.897 12:23:27 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:21:03.897 12:23:27 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:21:03.897 12:23:27 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:03.897 12:23:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:21:03.897 ************************************ 00:21:03.897 START TEST bdev_qd_sampling 00:21:03.897 ************************************ 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # qd_sampling_test_suite '' 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=197317 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:21:03.897 Process bdev QD sampling period testing pid: 197317 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 197317' 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:21:03.897 12:23:27 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 197317 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@830 -- # '[' -z 197317 ']' 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:21:03.897 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:03.897 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:03.897 [2024-06-07 12:23:27.394317] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:21:03.897 [2024-06-07 12:23:27.394811] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197317 ] 00:21:04.155 [2024-06-07 12:23:27.551935] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:21:04.155 [2024-06-07 12:23:27.656681] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:21:04.155 [2024-06-07 12:23:27.656692] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@863 -- # return 0 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:04.414 Malloc_QD 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_QD 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local i 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # rpc_cmd 
bdev_get_bdevs -b Malloc_QD -t 2000 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:04.414 [ 00:21:04.414 { 00:21:04.414 "name": "Malloc_QD", 00:21:04.414 "aliases": [ 00:21:04.414 "e61c452d-8272-43f1-b5c7-5fb831f1497f" 00:21:04.414 ], 00:21:04.414 "product_name": "Malloc disk", 00:21:04.414 "block_size": 512, 00:21:04.414 "num_blocks": 262144, 00:21:04.414 "uuid": "e61c452d-8272-43f1-b5c7-5fb831f1497f", 00:21:04.414 "assigned_rate_limits": { 00:21:04.414 "rw_ios_per_sec": 0, 00:21:04.414 "rw_mbytes_per_sec": 0, 00:21:04.414 "r_mbytes_per_sec": 0, 00:21:04.414 "w_mbytes_per_sec": 0 00:21:04.414 }, 00:21:04.414 "claimed": false, 00:21:04.414 "zoned": false, 00:21:04.414 "supported_io_types": { 00:21:04.414 "read": true, 00:21:04.414 "write": true, 00:21:04.414 "unmap": true, 00:21:04.414 "write_zeroes": true, 00:21:04.414 "flush": true, 00:21:04.414 "reset": true, 00:21:04.414 "compare": false, 00:21:04.414 "compare_and_write": false, 00:21:04.414 "abort": true, 00:21:04.414 "nvme_admin": false, 00:21:04.414 "nvme_io": false 00:21:04.414 }, 00:21:04.414 "memory_domains": [ 00:21:04.414 { 00:21:04.414 "dma_device_id": "system", 00:21:04.414 "dma_device_type": 1 00:21:04.414 }, 00:21:04.414 { 00:21:04.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.414 "dma_device_type": 2 00:21:04.414 } 00:21:04.414 ], 00:21:04.414 "driver_specific": {} 00:21:04.414 } 00:21:04.414 ] 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # return 0 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:21:04.414 12:23:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:21:04.414 Running I/O for 5 seconds... 
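The queue-depth figures in the next block come from two RPCs already visible in this trace: bdev_set_qd_sampling_period enables polling on Malloc_QD (period 10, echoed back as queue_depth_polling_period), and bdev_get_iostat reports the sampled queue_depth, io_time and weighted_io_time alongside the ordinary counters. Outside the harness the same data can be pulled with SPDK's rpc.py; a sketch, with the repo-relative script path assumed:

    ./scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10
    ./scripts/rpc.py bdev_get_iostat -b Malloc_QD \
        | jq '.bdevs[0] | {queue_depth, io_time, weighted_io_time}'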
00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:21:06.317 "tick_rate": 2100000000, 00:21:06.317 "ticks": 1510827263504, 00:21:06.317 "bdevs": [ 00:21:06.317 { 00:21:06.317 "name": "Malloc_QD", 00:21:06.317 "bytes_read": 2015400448, 00:21:06.317 "num_read_ops": 492035, 00:21:06.317 "bytes_written": 0, 00:21:06.317 "num_write_ops": 0, 00:21:06.317 "bytes_unmapped": 0, 00:21:06.317 "num_unmap_ops": 0, 00:21:06.317 "bytes_copied": 0, 00:21:06.317 "num_copy_ops": 0, 00:21:06.317 "read_latency_ticks": 2042495473268, 00:21:06.317 "max_read_latency_ticks": 6442648, 00:21:06.317 "min_read_latency_ticks": 458028, 00:21:06.317 "write_latency_ticks": 0, 00:21:06.317 "max_write_latency_ticks": 0, 00:21:06.317 "min_write_latency_ticks": 0, 00:21:06.317 "unmap_latency_ticks": 0, 00:21:06.317 "max_unmap_latency_ticks": 0, 00:21:06.317 "min_unmap_latency_ticks": 0, 00:21:06.317 "copy_latency_ticks": 0, 00:21:06.317 "max_copy_latency_ticks": 0, 00:21:06.317 "min_copy_latency_ticks": 0, 00:21:06.317 "io_error": {}, 00:21:06.317 "queue_depth_polling_period": 10, 00:21:06.317 "queue_depth": 512, 00:21:06.317 "io_time": 60, 00:21:06.317 "weighted_io_time": 30720 00:21:06.317 } 00:21:06.317 ] 00:21:06.317 }' 00:21:06.317 12:23:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:06.575 00:21:06.575 Latency(us) 00:21:06.575 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:06.575 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 
256, IO size: 4096) 00:21:06.575 Malloc_QD : 1.97 127288.34 497.22 0.00 0.00 2007.52 526.63 3526.46 00:21:06.575 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:21:06.575 Malloc_QD : 1.97 131277.25 512.80 0.00 0.00 1946.85 329.63 2793.08 00:21:06.575 =================================================================================================================== 00:21:06.575 Total : 258565.59 1010.02 0.00 0.00 1976.71 329.63 3526.46 00:21:06.575 0 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 197317 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@949 -- # '[' -z 197317 ']' 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # kill -0 197317 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # uname 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 197317 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # echo 'killing process with pid 197317' 00:21:06.575 killing process with pid 197317 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # kill 197317 00:21:06.575 Received shutdown signal, test time was about 2.049257 seconds 00:21:06.575 00:21:06.575 Latency(us) 00:21:06.575 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:06.575 =================================================================================================================== 00:21:06.575 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:06.575 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@973 -- # wait 197317 00:21:07.142 12:23:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:21:07.142 00:21:07.142 real 0m3.158s 00:21:07.142 user 0m5.808s 00:21:07.142 sys 0m0.455s 00:21:07.142 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:07.142 12:23:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:21:07.142 ************************************ 00:21:07.142 END TEST bdev_qd_sampling 00:21:07.142 ************************************ 00:21:07.142 12:23:30 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:21:07.142 12:23:30 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:21:07.142 12:23:30 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:07.142 12:23:30 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:21:07.142 ************************************ 00:21:07.142 START TEST bdev_error 00:21:07.142 ************************************ 00:21:07.142 12:23:30 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # error_test_suite '' 00:21:07.142 12:23:30 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:21:07.142 
12:23:30 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:21:07.142 12:23:30 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:21:07.142 12:23:30 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=197390 00:21:07.142 12:23:30 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 197390' 00:21:07.142 12:23:30 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:21:07.142 Process error testing pid: 197390 00:21:07.142 12:23:30 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 197390 00:21:07.142 12:23:30 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 197390 ']' 00:21:07.142 12:23:30 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:21:07.142 12:23:30 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:07.142 12:23:30 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:21:07.142 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:21:07.142 12:23:30 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:07.142 12:23:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:07.142 [2024-06-07 12:23:30.624296] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:21:07.143 [2024-06-07 12:23:30.624780] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197390 ] 00:21:07.143 [2024-06-07 12:23:30.774464] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:07.402 [2024-06-07 12:23:30.880252] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 Dev_1 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:21:08.337 12:23:31 
blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 [ 00:21:08.337 { 00:21:08.337 "name": "Dev_1", 00:21:08.337 "aliases": [ 00:21:08.337 "9a6f126b-afd3-4f41-819b-04f778c09ebc" 00:21:08.337 ], 00:21:08.337 "product_name": "Malloc disk", 00:21:08.337 "block_size": 512, 00:21:08.337 "num_blocks": 262144, 00:21:08.337 "uuid": "9a6f126b-afd3-4f41-819b-04f778c09ebc", 00:21:08.337 "assigned_rate_limits": { 00:21:08.337 "rw_ios_per_sec": 0, 00:21:08.337 "rw_mbytes_per_sec": 0, 00:21:08.337 "r_mbytes_per_sec": 0, 00:21:08.337 "w_mbytes_per_sec": 0 00:21:08.337 }, 00:21:08.337 "claimed": false, 00:21:08.337 "zoned": false, 00:21:08.337 "supported_io_types": { 00:21:08.337 "read": true, 00:21:08.337 "write": true, 00:21:08.337 "unmap": true, 00:21:08.337 "write_zeroes": true, 00:21:08.337 "flush": true, 00:21:08.337 "reset": true, 00:21:08.337 "compare": false, 00:21:08.337 "compare_and_write": false, 00:21:08.337 "abort": true, 00:21:08.337 "nvme_admin": false, 00:21:08.337 "nvme_io": false 00:21:08.337 }, 00:21:08.337 "memory_domains": [ 00:21:08.337 { 00:21:08.337 "dma_device_id": "system", 00:21:08.337 "dma_device_type": 1 00:21:08.337 }, 00:21:08.337 { 00:21:08.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.337 "dma_device_type": 2 00:21:08.337 } 00:21:08.337 ], 00:21:08.337 "driver_specific": {} 00:21:08.337 } 00:21:08.337 ] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 true 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 Dev_2 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- 
common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 [ 00:21:08.337 { 00:21:08.337 "name": "Dev_2", 00:21:08.337 "aliases": [ 00:21:08.337 "6a47daff-12ae-4672-bac3-6d97966f4890" 00:21:08.337 ], 00:21:08.337 "product_name": "Malloc disk", 00:21:08.337 "block_size": 512, 00:21:08.337 "num_blocks": 262144, 00:21:08.337 "uuid": "6a47daff-12ae-4672-bac3-6d97966f4890", 00:21:08.337 "assigned_rate_limits": { 00:21:08.337 "rw_ios_per_sec": 0, 00:21:08.337 "rw_mbytes_per_sec": 0, 00:21:08.337 "r_mbytes_per_sec": 0, 00:21:08.337 "w_mbytes_per_sec": 0 00:21:08.337 }, 00:21:08.337 "claimed": false, 00:21:08.337 "zoned": false, 00:21:08.337 "supported_io_types": { 00:21:08.337 "read": true, 00:21:08.337 "write": true, 00:21:08.337 "unmap": true, 00:21:08.337 "write_zeroes": true, 00:21:08.337 "flush": true, 00:21:08.337 "reset": true, 00:21:08.337 "compare": false, 00:21:08.337 "compare_and_write": false, 00:21:08.337 "abort": true, 00:21:08.337 "nvme_admin": false, 00:21:08.337 "nvme_io": false 00:21:08.337 }, 00:21:08.337 "memory_domains": [ 00:21:08.337 { 00:21:08.337 "dma_device_id": "system", 00:21:08.337 "dma_device_type": 1 00:21:08.337 }, 00:21:08.337 { 00:21:08.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.337 "dma_device_type": 2 00:21:08.337 } 00:21:08.337 ], 00:21:08.337 "driver_specific": {} 00:21:08.337 } 00:21:08.337 ] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:08.337 12:23:31 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:21:08.337 12:23:31 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:21:08.596 Running I/O for 5 seconds... 00:21:09.532 12:23:32 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 197390 00:21:09.532 12:23:32 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 197390' 00:21:09.532 Process is existed as continue on error is set. 
Pid: 197390 00:21:09.532 12:23:32 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:21:09.532 12:23:32 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:09.532 12:23:32 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:09.532 12:23:32 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:09.532 12:23:32 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:21:09.532 12:23:32 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:09.532 12:23:32 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:09.532 12:23:33 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:09.532 12:23:33 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:21:09.532 Timeout while waiting for response: 00:21:09.532 00:21:09.532 00:21:13.722 00:21:13.722 Latency(us) 00:21:13.722 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:13.722 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:21:13.722 EE_Dev_1 : 0.90 121299.05 473.82 5.58 0.00 131.04 90.70 458.36 00:21:13.722 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:21:13.722 Dev_2 : 5.00 265653.09 1037.71 0.00 0.00 59.38 45.35 38447.79 00:21:13.722 =================================================================================================================== 00:21:13.722 Total : 386952.14 1511.53 5.58 0.00 64.81 45.35 38447.79 00:21:14.657 12:23:38 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 197390 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@949 -- # '[' -z 197390 ']' 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # kill -0 197390 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # uname 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 197390 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # echo 'killing process with pid 197390' 00:21:14.657 killing process with pid 197390 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # kill 197390 00:21:14.657 Received shutdown signal, test time was about 5.000000 seconds 00:21:14.657 00:21:14.657 Latency(us) 00:21:14.657 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:14.657 =================================================================================================================== 00:21:14.657 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:14.657 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@973 -- # wait 197390 00:21:14.923 12:23:38 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=197498 00:21:14.923 12:23:38 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:21:14.923 12:23:38 
blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 197498' 00:21:14.923 Process error testing pid: 197498 00:21:14.923 12:23:38 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 197498 00:21:14.923 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 197498 ']' 00:21:14.923 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:21:14.923 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:14.923 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:21:14.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:21:14.924 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:14.924 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:14.924 [2024-06-07 12:23:38.534844] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:21:14.924 [2024-06-07 12:23:38.535410] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197498 ] 00:21:15.183 [2024-06-07 12:23:38.683579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:15.183 [2024-06-07 12:23:38.766894] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:21:15.442 12:23:38 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.442 Dev_1 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:15.442 12:23:38 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:21:15.442 12:23:38 blockdev_general.bdev_error -- 
common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.442 12:23:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.442 [ 00:21:15.442 { 00:21:15.442 "name": "Dev_1", 00:21:15.442 "aliases": [ 00:21:15.442 "6af1c2c4-46ba-4de3-90ef-16b33d61c316" 00:21:15.442 ], 00:21:15.442 "product_name": "Malloc disk", 00:21:15.442 "block_size": 512, 00:21:15.442 "num_blocks": 262144, 00:21:15.442 "uuid": "6af1c2c4-46ba-4de3-90ef-16b33d61c316", 00:21:15.442 "assigned_rate_limits": { 00:21:15.442 "rw_ios_per_sec": 0, 00:21:15.442 "rw_mbytes_per_sec": 0, 00:21:15.442 "r_mbytes_per_sec": 0, 00:21:15.442 "w_mbytes_per_sec": 0 00:21:15.442 }, 00:21:15.442 "claimed": false, 00:21:15.442 "zoned": false, 00:21:15.442 "supported_io_types": { 00:21:15.442 "read": true, 00:21:15.442 "write": true, 00:21:15.442 "unmap": true, 00:21:15.442 "write_zeroes": true, 00:21:15.442 "flush": true, 00:21:15.442 "reset": true, 00:21:15.442 "compare": false, 00:21:15.442 "compare_and_write": false, 00:21:15.442 "abort": true, 00:21:15.442 "nvme_admin": false, 00:21:15.442 "nvme_io": false 00:21:15.442 }, 00:21:15.442 "memory_domains": [ 00:21:15.442 { 00:21:15.442 "dma_device_id": "system", 00:21:15.442 "dma_device_type": 1 00:21:15.442 }, 00:21:15.442 { 00:21:15.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.442 "dma_device_type": 2 00:21:15.442 } 00:21:15.442 ], 00:21:15.442 "driver_specific": {} 00:21:15.442 } 00:21:15.442 ] 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:21:15.442 12:23:39 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.442 true 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:15.442 12:23:39 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.442 Dev_2 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:15.442 12:23:39 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.442 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.699 [ 00:21:15.699 { 00:21:15.699 "name": "Dev_2", 00:21:15.699 "aliases": [ 00:21:15.699 "eabd213f-c634-451d-be5b-8100b0b96801" 00:21:15.699 ], 00:21:15.699 "product_name": "Malloc disk", 00:21:15.699 "block_size": 512, 00:21:15.699 "num_blocks": 262144, 00:21:15.699 "uuid": "eabd213f-c634-451d-be5b-8100b0b96801", 00:21:15.699 "assigned_rate_limits": { 00:21:15.699 "rw_ios_per_sec": 0, 00:21:15.699 "rw_mbytes_per_sec": 0, 00:21:15.699 "r_mbytes_per_sec": 0, 00:21:15.699 "w_mbytes_per_sec": 0 00:21:15.699 }, 00:21:15.699 "claimed": false, 00:21:15.699 "zoned": false, 00:21:15.699 "supported_io_types": { 00:21:15.699 "read": true, 00:21:15.699 "write": true, 00:21:15.699 "unmap": true, 00:21:15.699 "write_zeroes": true, 00:21:15.699 "flush": true, 00:21:15.699 "reset": true, 00:21:15.699 "compare": false, 00:21:15.699 "compare_and_write": false, 00:21:15.699 "abort": true, 00:21:15.699 "nvme_admin": false, 00:21:15.699 "nvme_io": false 00:21:15.699 }, 00:21:15.699 "memory_domains": [ 00:21:15.699 { 00:21:15.699 "dma_device_id": "system", 00:21:15.699 "dma_device_type": 1 00:21:15.699 }, 00:21:15.699 { 00:21:15.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.699 "dma_device_type": 2 00:21:15.699 } 00:21:15.699 ], 00:21:15.699 "driver_specific": {} 00:21:15.699 } 00:21:15.699 ] 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:21:15.699 12:23:39 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:15.699 12:23:39 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 197498 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@649 -- # local es=0 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # valid_exec_arg wait 197498 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@637 -- # local arg=wait 00:21:15.699 12:23:39 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # type -t wait 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:15.699 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # wait 197498 00:21:15.699 Running I/O for 5 seconds... 
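The wrapping visible in the xtrace above (valid_exec_arg, type -t, and the es bookkeeping that follows the run) is the harness's NOT helper: it runs a command that is expected to fail and inverts the exit status, so the test step passes exactly when the bdevperf job exits non-zero. Stripped of the argument validation and exit-code normalization done by the real autotest_common.sh version, it behaves like this sketch:

    # succeed only if the wrapped command fails
    NOT() {
        if "$@"; then
            return 1    # command unexpectedly succeeded
        fi
        return 0
    }
    NOT wait 197498     # passes here, because the injected errors abort bdevperf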
00:21:15.699 task offset: 85200 on job bdev=EE_Dev_1 fails 00:21:15.699 00:21:15.699 Latency(us) 00:21:15.699 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:15.699 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:21:15.699 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:21:15.699 EE_Dev_1 : 0.00 53140.10 207.58 12077.29 0.00 193.20 76.07 364.74 00:21:15.699 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:21:15.699 Dev_2 : 0.00 61068.70 238.55 0.00 0.00 121.74 73.14 191.15 00:21:15.699 =================================================================================================================== 00:21:15.699 Total : 114208.80 446.13 12077.29 0.00 154.44 73.14 364.74 00:21:15.699 [2024-06-07 12:23:39.227518] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:21:15.699 request: 00:21:15.699 { 00:21:15.699 "method": "perform_tests", 00:21:15.699 "req_id": 1 00:21:15.699 } 00:21:15.699 Got JSON-RPC error response 00:21:15.699 response: 00:21:15.699 { 00:21:15.699 "code": -32603, 00:21:15.699 "message": "bdevperf failed with error Operation not permitted" 00:21:15.699 } 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # es=255 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # es=127 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # case "$es" in 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@669 -- # es=1 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:21:16.267 00:21:16.267 real 0m9.152s 00:21:16.267 user 0m9.235s 00:21:16.267 sys 0m0.949s 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:16.267 12:23:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:21:16.267 ************************************ 00:21:16.267 END TEST bdev_error 00:21:16.267 ************************************ 00:21:16.267 12:23:39 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:21:16.267 12:23:39 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:21:16.267 12:23:39 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:16.267 12:23:39 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:21:16.267 ************************************ 00:21:16.267 START TEST bdev_stat 00:21:16.267 ************************************ 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # stat_test_suite '' 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=197535 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 197535' 00:21:16.267 Process Bdev IO statistics testing pid: 197535 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:21:16.267 
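The failure above is the expected outcome, not a CI problem: bdev_error_create stacks an error-injection bdev named EE_Dev_1 on top of the Dev_1 malloc disk, bdev_error_inject_error arms it to fail the next 5 I/Os of any type, and the harness asserts that the triggered run aborts with the -32603 JSON-RPC response shown. The same sequence by hand, against a bdevperf started with -z, using the commands and sizes from this run (paths relative to the repo root):

    # 128 MB malloc disk with 512-byte blocks (262144 blocks, as in the dump above)
    scripts/rpc.py bdev_malloc_create -b Dev_1 128 512
    # stack an error bdev on it; the new bdev is exposed as EE_Dev_1
    scripts/rpc.py bdev_error_create Dev_1
    # fail the next 5 I/Os of every type
    scripts/rpc.py bdev_error_inject_error EE_Dev_1 all failure -n 5
    # release the queued job; it must exit non-zero for the test to pass
    examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests

The bdev_stat suite whose startup is logged just above reuses the same held-back bdevperf pattern with a fresh instance.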
12:23:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 197535 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- common/autotest_common.sh@830 -- # '[' -z 197535 ']' 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:21:16.267 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:16.267 12:23:39 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:16.267 [2024-06-07 12:23:39.847638] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:21:16.267 [2024-06-07 12:23:39.848122] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197535 ] 00:21:16.525 [2024-06-07 12:23:40.000471] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:21:16.525 [2024-06-07 12:23:40.101178] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:16.525 [2024-06-07 12:23:40.101179] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@863 -- # return 0 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:17.459 Malloc_STAT 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_STAT 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local i 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:17.459 
12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:17.459 [ 00:21:17.459 { 00:21:17.459 "name": "Malloc_STAT", 00:21:17.459 "aliases": [ 00:21:17.459 "b1edeb66-e678-4cc5-91d6-88037467c35b" 00:21:17.459 ], 00:21:17.459 "product_name": "Malloc disk", 00:21:17.459 "block_size": 512, 00:21:17.459 "num_blocks": 262144, 00:21:17.459 "uuid": "b1edeb66-e678-4cc5-91d6-88037467c35b", 00:21:17.459 "assigned_rate_limits": { 00:21:17.459 "rw_ios_per_sec": 0, 00:21:17.459 "rw_mbytes_per_sec": 0, 00:21:17.459 "r_mbytes_per_sec": 0, 00:21:17.459 "w_mbytes_per_sec": 0 00:21:17.459 }, 00:21:17.459 "claimed": false, 00:21:17.459 "zoned": false, 00:21:17.459 "supported_io_types": { 00:21:17.459 "read": true, 00:21:17.459 "write": true, 00:21:17.459 "unmap": true, 00:21:17.459 "write_zeroes": true, 00:21:17.459 "flush": true, 00:21:17.459 "reset": true, 00:21:17.459 "compare": false, 00:21:17.459 "compare_and_write": false, 00:21:17.459 "abort": true, 00:21:17.459 "nvme_admin": false, 00:21:17.459 "nvme_io": false 00:21:17.459 }, 00:21:17.459 "memory_domains": [ 00:21:17.459 { 00:21:17.459 "dma_device_id": "system", 00:21:17.459 "dma_device_type": 1 00:21:17.459 }, 00:21:17.459 { 00:21:17.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.459 "dma_device_type": 2 00:21:17.459 } 00:21:17.459 ], 00:21:17.459 "driver_specific": {} 00:21:17.459 } 00:21:17.459 ] 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # return 0 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:21:17.459 12:23:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:21:17.459 Running I/O for 10 seconds... 
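How the stat check works: bdevperf was started with -z (idle until triggered) and -m 0x3, so the Malloc_STAT bdev gets an I/O channel on each of the two reactors; after the 2-second settle, perform_tests starts the 10-second randread load and the suite samples counters three times, aggregate, then per-channel, then aggregate again. Because the per-channel sample is taken between the two aggregate snapshots, the summed per-channel read count must land between them. In this run: 274944 + 270336 = 545280, which lies between the first snapshot's 527107 reads and the second's 578307, so both bound checks below pass. The two RPC forms being compared:

    # aggregate counters for one bdev
    scripts/rpc.py bdev_get_iostat -b Malloc_STAT
    # per-channel breakdown (-c): one entry per reactor thread driving I/O
    scripts/rpc.py bdev_get_iostat -b Malloc_STAT -c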
00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:21:19.439 "tick_rate": 2100000000, 00:21:19.439 "ticks": 1538164978586, 00:21:19.439 "bdevs": [ 00:21:19.439 { 00:21:19.439 "name": "Malloc_STAT", 00:21:19.439 "bytes_read": 2159055360, 00:21:19.439 "num_read_ops": 527107, 00:21:19.439 "bytes_written": 0, 00:21:19.439 "num_write_ops": 0, 00:21:19.439 "bytes_unmapped": 0, 00:21:19.439 "num_unmap_ops": 0, 00:21:19.439 "bytes_copied": 0, 00:21:19.439 "num_copy_ops": 0, 00:21:19.439 "read_latency_ticks": 2032452083064, 00:21:19.439 "max_read_latency_ticks": 10346912, 00:21:19.439 "min_read_latency_ticks": 301542, 00:21:19.439 "write_latency_ticks": 0, 00:21:19.439 "max_write_latency_ticks": 0, 00:21:19.439 "min_write_latency_ticks": 0, 00:21:19.439 "unmap_latency_ticks": 0, 00:21:19.439 "max_unmap_latency_ticks": 0, 00:21:19.439 "min_unmap_latency_ticks": 0, 00:21:19.439 "copy_latency_ticks": 0, 00:21:19.439 "max_copy_latency_ticks": 0, 00:21:19.439 "min_copy_latency_ticks": 0, 00:21:19.439 "io_error": {} 00:21:19.439 } 00:21:19.439 ] 00:21:19.439 }' 00:21:19.439 12:23:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:21:19.439 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=527107 00:21:19.439 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:21:19.439 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:19.439 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:19.439 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:19.439 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:21:19.439 "tick_rate": 2100000000, 00:21:19.439 "ticks": 1538300441786, 00:21:19.439 "name": "Malloc_STAT", 00:21:19.439 "channels": [ 00:21:19.439 { 00:21:19.439 "thread_id": 2, 00:21:19.439 "bytes_read": 1126170624, 00:21:19.439 "num_read_ops": 274944, 00:21:19.439 "bytes_written": 0, 00:21:19.439 "num_write_ops": 0, 00:21:19.439 "bytes_unmapped": 0, 00:21:19.439 "num_unmap_ops": 0, 
00:21:19.439 "bytes_copied": 0, 00:21:19.439 "num_copy_ops": 0, 00:21:19.439 "read_latency_ticks": 1050705996820, 00:21:19.439 "max_read_latency_ticks": 10346912, 00:21:19.439 "min_read_latency_ticks": 2569006, 00:21:19.439 "write_latency_ticks": 0, 00:21:19.439 "max_write_latency_ticks": 0, 00:21:19.439 "min_write_latency_ticks": 0, 00:21:19.439 "unmap_latency_ticks": 0, 00:21:19.439 "max_unmap_latency_ticks": 0, 00:21:19.439 "min_unmap_latency_ticks": 0, 00:21:19.439 "copy_latency_ticks": 0, 00:21:19.439 "max_copy_latency_ticks": 0, 00:21:19.439 "min_copy_latency_ticks": 0 00:21:19.439 }, 00:21:19.439 { 00:21:19.439 "thread_id": 3, 00:21:19.439 "bytes_read": 1107296256, 00:21:19.439 "num_read_ops": 270336, 00:21:19.439 "bytes_written": 0, 00:21:19.439 "num_write_ops": 0, 00:21:19.439 "bytes_unmapped": 0, 00:21:19.439 "num_unmap_ops": 0, 00:21:19.439 "bytes_copied": 0, 00:21:19.439 "num_copy_ops": 0, 00:21:19.439 "read_latency_ticks": 1051551620424, 00:21:19.439 "max_read_latency_ticks": 5378748, 00:21:19.439 "min_read_latency_ticks": 1939810, 00:21:19.439 "write_latency_ticks": 0, 00:21:19.439 "max_write_latency_ticks": 0, 00:21:19.439 "min_write_latency_ticks": 0, 00:21:19.439 "unmap_latency_ticks": 0, 00:21:19.439 "max_unmap_latency_ticks": 0, 00:21:19.439 "min_unmap_latency_ticks": 0, 00:21:19.439 "copy_latency_ticks": 0, 00:21:19.439 "max_copy_latency_ticks": 0, 00:21:19.439 "min_copy_latency_ticks": 0 00:21:19.439 } 00:21:19.439 ] 00:21:19.439 }' 00:21:19.439 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=274944 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=274944 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=270336 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=545280 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:21:19.697 "tick_rate": 2100000000, 00:21:19.697 "ticks": 1538557432478, 00:21:19.697 "bdevs": [ 00:21:19.697 { 00:21:19.697 "name": "Malloc_STAT", 00:21:19.697 "bytes_read": 2368770560, 00:21:19.697 "num_read_ops": 578307, 00:21:19.697 "bytes_written": 0, 00:21:19.697 "num_write_ops": 0, 00:21:19.697 "bytes_unmapped": 0, 00:21:19.697 "num_unmap_ops": 0, 00:21:19.697 "bytes_copied": 0, 00:21:19.697 "num_copy_ops": 0, 00:21:19.697 "read_latency_ticks": 2232854879178, 00:21:19.697 "max_read_latency_ticks": 10346912, 00:21:19.697 "min_read_latency_ticks": 301542, 00:21:19.697 "write_latency_ticks": 0, 00:21:19.697 "max_write_latency_ticks": 0, 00:21:19.697 "min_write_latency_ticks": 0, 00:21:19.697 "unmap_latency_ticks": 0, 00:21:19.697 "max_unmap_latency_ticks": 0, 00:21:19.697 "min_unmap_latency_ticks": 0, 00:21:19.697 "copy_latency_ticks": 0, 00:21:19.697 "max_copy_latency_ticks": 0, 00:21:19.697 
"min_copy_latency_ticks": 0, 00:21:19.697 "io_error": {} 00:21:19.697 } 00:21:19.697 ] 00:21:19.697 }' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=578307 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 545280 -lt 527107 ']' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 545280 -gt 578307 ']' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:19.697 00:21:19.697 Latency(us) 00:21:19.697 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:19.697 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:21:19.697 Malloc_STAT : 2.13 139600.91 545.32 0.00 0.00 1830.66 659.26 4930.80 00:21:19.697 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:21:19.697 Malloc_STAT : 2.13 138277.51 540.15 0.00 0.00 1848.10 624.15 2574.63 00:21:19.697 =================================================================================================================== 00:21:19.697 Total : 277878.42 1085.46 0.00 0.00 1839.34 624.15 4930.80 00:21:19.697 0 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 197535 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@949 -- # '[' -z 197535 ']' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # kill -0 197535 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # uname 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 197535 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 197535' 00:21:19.697 killing process with pid 197535 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # kill 197535 00:21:19.697 Received shutdown signal, test time was about 2.212884 seconds 00:21:19.697 00:21:19.697 Latency(us) 00:21:19.697 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:19.697 =================================================================================================================== 00:21:19.697 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:19.697 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@973 -- # wait 197535 00:21:20.263 12:23:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:21:20.263 00:21:20.263 real 0m3.876s 00:21:20.263 user 0m7.561s 00:21:20.263 sys 0m0.483s 00:21:20.263 12:23:43 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:20.263 12:23:43 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:21:20.263 ************************************ 00:21:20.263 END TEST bdev_stat 00:21:20.263 ************************************ 00:21:20.263 12:23:43 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:21:20.263 12:23:43 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:21:20.263 12:23:43 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:21:20.264 12:23:43 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:21:20.264 12:23:43 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/aiofile 00:21:20.264 12:23:43 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:21:20.264 12:23:43 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:21:20.264 12:23:43 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:21:20.264 12:23:43 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:21:20.264 12:23:43 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:21:20.264 00:21:20.264 real 1m55.591s 00:21:20.264 user 5m3.967s 00:21:20.264 sys 0m24.941s 00:21:20.264 12:23:43 blockdev_general -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:20.264 12:23:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:21:20.264 ************************************ 00:21:20.264 END TEST blockdev_general 00:21:20.264 ************************************ 00:21:20.264 12:23:43 -- spdk/autotest.sh@190 -- # run_test bdev_raid /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh 00:21:20.264 12:23:43 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:21:20.264 12:23:43 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:20.264 12:23:43 -- common/autotest_common.sh@10 -- # set +x 00:21:20.264 ************************************ 00:21:20.264 START TEST bdev_raid 00:21:20.264 ************************************ 00:21:20.264 12:23:43 bdev_raid -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh 00:21:20.567 * Looking for test storage... 
00:21:20.567 * Found test storage at /home/vagrant/spdk_repo/spdk/test/bdev 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/nbd_common.sh 00:21:20.567 12:23:43 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:21:20.567 12:23:43 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:21:20.567 12:23:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:21:20.567 12:23:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:20.567 12:23:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:20.567 ************************************ 00:21:20.567 START TEST raid_function_test_raid0 00:21:20.567 ************************************ 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # raid_function_test raid0 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=197682 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:20.567 Process raid pid: 197682 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 197682' 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 197682 /var/tmp/spdk-raid.sock 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@830 -- # '[' -z 197682 ']' 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:20.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
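Overview of the raid_function_test flow that follows: a dedicated bdev_svc app listens on /var/tmp/spdk-raid.sock; configure_raid_bdev writes the creation RPCs into rpcs.txt and feeds them to rpc.py, producing a two-member raid0 (named raid) over the Base_1/Base_2 malloc bdevs, 131072 blocks of 512 bytes in total; the array is then exported as a kernel block device via the nbd module and exercised with dd, cmp and blkdiscard. The export half by hand, using only RPCs that appear in this log:

    # talk to the raid-test instance on its private socket
    rpc_py="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc_py bdev_raid_get_bdevs online       # should list: raid
    $rpc_py nbd_start_disk raid /dev/nbd0    # back /dev/nbd0 with the raid bdev
    $rpc_py nbd_get_disks                    # verify the single mapping
    $rpc_py nbd_stop_disk /dev/nbd0          # tear the mapping down afterwards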
00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:20.567 12:23:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:21:20.567 [2024-06-07 12:23:44.021346] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:21:20.567 [2024-06-07 12:23:44.022045] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:20.567 [2024-06-07 12:23:44.181017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:20.825 [2024-06-07 12:23:44.278273] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:20.825 [2024-06-07 12:23:44.362702] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:21.761 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:21.761 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@863 -- # return 0 00:21:21.761 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:21:21.761 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:21:21.761 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /home/vagrant/spdk_repo/spdk/test/bdev/rpcs.txt 00:21:21.761 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:21:21.761 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:21:21.761 [2024-06-07 12:23:45.389747] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:21:21.761 [2024-06-07 12:23:45.392794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:21:21.761 [2024-06-07 12:23:45.393078] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:21:21.761 [2024-06-07 12:23:45.393206] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:21:21.761 [2024-06-07 12:23:45.393489] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:21:21.761 [2024-06-07 12:23:45.394043] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:21:21.761 [2024-06-07 12:23:45.394183] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x616000006080 00:21:21.761 [2024-06-07 12:23:45.394539] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:21.762 Base_1 00:21:21.762 Base_2 00:21:22.020 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /home/vagrant/spdk_repo/spdk/test/bdev/rpcs.txt 00:21:22.020 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:22.020 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid /dev/nbd0 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:22.278 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:21:22.537 [2024-06-07 12:23:45.926826] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:21:22.537 /dev/nbd0 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local i 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # break 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:22.537 1+0 records in 00:21:22.537 1+0 records out 00:21:22.537 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000346911 s, 11.8 MB/s 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # size=4096 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # return 0 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count 
/var/tmp/spdk-raid.sock 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:22.537 12:23:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:21:22.796 { 00:21:22.796 "nbd_device": "/dev/nbd0", 00:21:22.796 "bdev_name": "raid" 00:21:22.796 } 00:21:22.796 ]' 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:21:22.796 { 00:21:22.796 "nbd_device": "/dev/nbd0", 00:21:22.796 "bdev_name": "raid" 00:21:22.796 } 00:21:22.796 ]' 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom 
of=/raidtest/raidrandtest bs=512 count=4096 00:21:22.796 4096+0 records in 00:21:22.796 4096+0 records out 00:21:22.796 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0224519 s, 93.4 MB/s 00:21:22.796 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:21:23.055 4096+0 records in 00:21:23.055 4096+0 records out 00:21:23.055 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.214755 s, 9.8 MB/s 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:21:23.055 128+0 records in 00:21:23.055 128+0 records out 00:21:23.055 65536 bytes (66 kB, 64 KiB) copied, 0.00103214 s, 63.5 MB/s 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:21:23.055 2035+0 records in 00:21:23.055 2035+0 records out 00:21:23.055 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0136054 s, 76.6 MB/s 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:21:23.055 456+0 records in 00:21:23.055 456+0 records out 00:21:23.055 233472 bytes (233 
kB, 228 KiB) copied, 0.00319273 s, 73.1 MB/s 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:23.055 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:23.623 [2024-06-07 12:23:46.982861] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:21:23.623 12:23:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:21:23.623 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.623 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep 
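The loop that just completed is raid_unmap_data_verify: 4096 random 512-byte blocks (2 MiB) are written to /raidtest/raidrandtest, copied onto /dev/nbd0, and compared; then for each block range, offsets (0, 1028, 321) with counts (128, 2035, 456), the range is zeroed in the reference file and discarded on the device, and a full-length cmp asserts the two still match, i.e. that unmapped regions of the raid0 read back as zeroes. The byte values in the blkdiscard calls are simply block numbers times 512: 1028*512 = 526336 and 2035*512 = 1041920, for instance. One iteration by hand, under the same paths:

    # zero blocks [1028, 1028+2035) in the reference file ...
    dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc
    # ... then discard the matching byte range on the exported device
    blkdiscard -o 526336 -l 1041920 /dev/nbd0
    blockdev --flushbufs /dev/nbd0
    # the whole 2 MiB must still compare equal
    cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0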
-c /dev/nbd 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 197682 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@949 -- # '[' -z 197682 ']' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # kill -0 197682 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # uname 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 197682 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # echo 'killing process with pid 197682' 00:21:23.882 killing process with pid 197682 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # kill 197682 00:21:23.882 [2024-06-07 12:23:47.383536] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:23.882 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@973 -- # wait 197682 00:21:23.882 [2024-06-07 12:23:47.383853] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:23.882 [2024-06-07 12:23:47.383966] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:23.882 [2024-06-07 12:23:47.384068] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name raid, state offline 00:21:23.882 [2024-06-07 12:23:47.428599] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:24.449 12:23:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:21:24.449 00:21:24.449 real 0m3.812s 00:21:24.449 user 0m4.990s 00:21:24.449 sys 0m1.304s 00:21:24.449 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:24.449 12:23:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:21:24.449 ************************************ 00:21:24.449 END TEST raid_function_test_raid0 00:21:24.449 ************************************ 00:21:24.449 12:23:47 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:21:24.450 12:23:47 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:21:24.450 12:23:47 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:24.450 12:23:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:24.450 ************************************ 00:21:24.450 START TEST raid_function_test_concat 00:21:24.450 ************************************ 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- 
common/autotest_common.sh@1124 -- # raid_function_test concat 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=197825 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 197825' 00:21:24.450 Process raid pid: 197825 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 197825 /var/tmp/spdk-raid.sock 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@830 -- # '[' -z 197825 ']' 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:24.450 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:24.450 12:23:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:21:24.450 [2024-06-07 12:23:47.901089] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
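The concat run starting here drives the identical harness with raid_level=concat; only the array layout differs. raid0 interleaves fixed-size strips across Base_1 and Base_2, while concat appends one base after the other, so the same dd/blkdiscard pattern exercises different base-bdev ranges. A worked address split, assuming two equal 65536-block members as implied by the 131072-block array in the raid0 run above:

    # concat: linear map; array block 70000 -> Base_2 block 70000 - 65536 = 4464
    # raid0 with a strip of S blocks: array block 70000 -> member (70000 / S) mod 2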
00:21:24.450 [2024-06-07 12:23:47.901804] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:24.450 [2024-06-07 12:23:48.054437] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:24.708 [2024-06-07 12:23:48.153288] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:24.708 [2024-06-07 12:23:48.238811] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.308 12:23:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:25.308 12:23:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@863 -- # return 0 00:21:25.308 12:23:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:21:25.308 12:23:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:21:25.308 12:23:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /home/vagrant/spdk_repo/spdk/test/bdev/rpcs.txt 00:21:25.308 12:23:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:21:25.308 12:23:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:21:25.876 [2024-06-07 12:23:49.231047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:21:25.876 [2024-06-07 12:23:49.233678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:21:25.876 [2024-06-07 12:23:49.233894] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:21:25.876 [2024-06-07 12:23:49.234011] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:21:25.876 [2024-06-07 12:23:49.234205] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:21:25.876 [2024-06-07 12:23:49.234650] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:21:25.876 [2024-06-07 12:23:49.234707] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x616000006080 00:21:25.876 [2024-06-07 12:23:49.235021] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.876 Base_1 00:21:25.876 Base_2 00:21:25.876 12:23:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /home/vagrant/spdk_repo/spdk/test/bdev/rpcs.txt 00:21:25.876 12:23:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:25.876 12:23:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:21:26.133 12:23:49 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:26.133 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:21:26.133 [2024-06-07 12:23:49.755844] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:21:26.133 /dev/nbd0 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local i 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # break 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:26.392 1+0 records in 00:21:26.392 1+0 records out 00:21:26.392 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000425215 s, 9.6 MB/s 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # size=4096 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # return 0 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:26.392 12:23:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:21:26.650 { 00:21:26.650 "nbd_device": "/dev/nbd0", 00:21:26.650 "bdev_name": "raid" 00:21:26.650 } 00:21:26.650 ]' 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:21:26.650 { 00:21:26.650 "nbd_device": "/dev/nbd0", 00:21:26.650 "bdev_name": "raid" 00:21:26.650 } 00:21:26.650 ]' 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:21:26.650 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:21:26.651 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:21:26.651 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:21:26.651 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:21:26.651 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:21:26.651 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:21:26.651 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:21:26.651 4096+0 records in 00:21:26.651 4096+0 records out 00:21:26.651 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0231427 s, 90.6 MB/s 00:21:26.651 12:23:50 
bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:21:26.909 4096+0 records in 00:21:26.909 4096+0 records out 00:21:26.909 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.259305 s, 8.1 MB/s 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:21:26.909 128+0 records in 00:21:26.909 128+0 records out 00:21:26.909 65536 bytes (66 kB, 64 KiB) copied, 0.00153145 s, 42.8 MB/s 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:21:26.909 2035+0 records in 00:21:26.909 2035+0 records out 00:21:26.909 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00887831 s, 117 MB/s 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:21:26.909 456+0 records in 00:21:26.909 456+0 records out 00:21:26.909 233472 bytes (233 kB, 228 KiB) copied, 0.00263285 s, 88.7 MB/s 00:21:26.909 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 
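This stretch of the trace is the core of raid_function_test_concat: the concat raid bdev is exported as a Linux block device over NBD, seeded with random data alongside a reference file, and then each test region is zeroed in the file while the matching byte range is discarded on the device, with cmp asserting the two stay identical (so discarded regions must read back as zeroes). The byte offsets in the log are simply block offset times block size, e.g. 1028 * 512 = 526336 and 2035 * 512 = 1041920. A minimal sketch of that flow, with socket, device and scratch paths taken from the log but otherwise illustrative (the poll interval in the wait loop is an assumption; the real waitfornbd helper retries up to 20 times):

    RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    nbd=/dev/nbd0
    ref=/raidtest/raidrandtest      # reference copy of the device contents
    blksize=512
    rw_blk_num=4096                 # 4096 * 512 B = 2 MiB under test

    # Export the raid bdev as /dev/nbd0 and wait for the node to appear,
    # polling /proc/partitions the way waitfornbd does.
    $RPC nbd_start_disk raid "$nbd"
    for i in $(seq 1 20); do
        grep -q -w nbd0 /proc/partitions && break
        sleep 0.1                   # poll interval is an assumption
    done

    # Seed the reference file and the device with identical random data.
    dd if=/dev/urandom of="$ref" bs=$blksize count=$rw_blk_num
    dd if="$ref" of="$nbd" bs=$blksize count=$rw_blk_num oflag=direct

    # For each (block offset, block count) pair: zero the file region,
    # discard the same byte range on the device, flush, and compare.
    for pair in "0 128" "1028 2035" "321 456"; do
        set -- $pair
        dd if=/dev/zero of="$ref" bs=$blksize seek=$1 count=$2 conv=notrunc
        blkdiscard -o $(( $1 * blksize )) -l $(( $2 * blksize )) "$nbd"
        blockdev --flushbufs "$nbd"  # drop cached pages before comparing
        cmp -b -n $(( rw_blk_num * blksize )) "$ref" "$nbd"
    done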
00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:27.168 [2024-06-07 12:23:50.773447] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:27.168 12:23:50 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:21:27.428 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:21:27.428 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:21:27.428 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:21:27.428 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:21:27.428 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:21:27.428 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:21:27.428 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:21:27.687 12:23:51 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 197825 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@949 -- # '[' -z 197825 ']' 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # kill -0 197825 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # uname 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 197825 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 197825' 00:21:27.687 killing process with pid 197825 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # kill 197825 00:21:27.687 [2024-06-07 12:23:51.109362] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:27.687 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@973 -- # wait 197825 00:21:27.688 [2024-06-07 12:23:51.109760] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:27.688 [2024-06-07 12:23:51.109901] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:27.688 [2024-06-07 12:23:51.110081] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name raid, state offline 00:21:27.688 [2024-06-07 12:23:51.153217] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:27.947 12:23:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:21:27.947 00:21:27.947 real 0m3.653s 00:21:27.947 user 0m4.679s 00:21:27.947 sys 0m1.279s 00:21:27.947 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:27.947 12:23:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:21:27.947 ************************************ 00:21:27.947 END TEST raid_function_test_concat 00:21:27.947 ************************************ 00:21:27.947 12:23:51 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:21:27.947 12:23:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:21:27.947 12:23:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:27.947 12:23:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:27.947 ************************************ 00:21:27.947 START TEST raid0_resize_test 00:21:27.947 ************************************ 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # raid0_resize_test 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local 
blksize=512 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=197954 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:27.947 Process raid pid: 197954 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 197954' 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 197954 /var/tmp/spdk-raid.sock 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@830 -- # '[' -z 197954 ']' 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:27.947 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:27.947 12:23:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:21:28.206 [2024-06-07 12:23:51.611957] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
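raid0_resize_test drives a dedicated bdev_svc app over the same Unix-domain RPC socket (the harness starts one app per test and later reaps it with killprocess). The sequence it is about to issue: create two 32 MiB null bdevs with 512-byte blocks (65536 blocks each), assemble a raid0 across them (131072 blocks), then grow each leg to 64 MiB and check that the raid only doubles once both legs have grown. A sketch of those RPC calls, matching the commands and sizes in the log:

    RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Two null bdevs: 32 MiB each at 512 B/block -> 65536 blocks apiece.
    $RPC bdev_null_create Base_1 32 512
    $RPC bdev_null_create Base_2 32 512

    # raid0 over both with a 64 KiB strip -> 2 * 65536 = 131072 blocks.
    $RPC bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid

    # Growing one leg alone must not change the raid size...
    $RPC bdev_null_resize Base_1 64
    $RPC bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # still 131072

    # ...only after the second leg grows does the raid double.
    $RPC bdev_null_resize Base_2 64
    $RPC bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # now 262144

The size check in the script converts blocks back to mebibytes as blkcnt * 512 / 1048576, i.e. 131072 blocks -> 64 MiB and 262144 blocks -> 128 MiB, which is exactly the '[' 64 '!=' 64 ']' and '[' 128 '!=' 128 ']' comparisons visible in the trace.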
00:21:28.206 [2024-06-07 12:23:51.612403] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:28.206 [2024-06-07 12:23:51.757247] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:28.465 [2024-06-07 12:23:51.853846] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:28.465 [2024-06-07 12:23:51.943197] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:29.032 12:23:52 bdev_raid.raid0_resize_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:29.032 12:23:52 bdev_raid.raid0_resize_test -- common/autotest_common.sh@863 -- # return 0 00:21:29.032 12:23:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:21:29.291 Base_1 00:21:29.291 12:23:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:21:29.553 Base_2 00:21:29.554 12:23:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:21:29.814 [2024-06-07 12:23:53.311779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:21:29.814 [2024-06-07 12:23:53.313927] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:21:29.814 [2024-06-07 12:23:53.314117] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:21:29.814 [2024-06-07 12:23:53.314205] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:21:29.814 [2024-06-07 12:23:53.314464] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000001f80 00:21:29.814 [2024-06-07 12:23:53.315046] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:21:29.814 [2024-06-07 12:23:53.315103] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x616000006080 00:21:29.814 [2024-06-07 12:23:53.315467] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.814 12:23:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:21:30.073 [2024-06-07 12:23:53.515793] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:21:30.074 [2024-06-07 12:23:53.516047] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:21:30.074 true 00:21:30.074 12:23:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:21:30.074 12:23:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:21:30.332 [2024-06-07 12:23:53.728040] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:30.332 12:23:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:21:30.332 12:23:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:21:30.332 12:23:53 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:21:30.332 12:23:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:21:30.590 [2024-06-07 12:23:54.011920] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:21:30.590 [2024-06-07 12:23:54.012179] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:21:30.590 [2024-06-07 12:23:54.012377] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:21:30.590 true 00:21:30.590 12:23:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:21:30.590 12:23:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:21:30.849 [2024-06-07 12:23:54.280045] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 197954 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@949 -- # '[' -z 197954 ']' 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # kill -0 197954 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # uname 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 197954 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 197954' 00:21:30.849 killing process with pid 197954 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # kill 197954 00:21:30.849 [2024-06-07 12:23:54.334278] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:30.849 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@973 -- # wait 197954 00:21:30.849 [2024-06-07 12:23:54.334568] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:30.849 [2024-06-07 12:23:54.334757] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:30.849 [2024-06-07 12:23:54.334845] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Raid, state offline 00:21:30.849 [2024-06-07 12:23:54.335435] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:31.108 12:23:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:21:31.108 00:21:31.108 real 0m3.113s 00:21:31.108 user 0m4.646s 00:21:31.108 sys 0m0.592s 00:21:31.108 12:23:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:31.108 12:23:54 
bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:21:31.108 ************************************ 00:21:31.108 END TEST raid0_resize_test 00:21:31.108 ************************************ 00:21:31.108 12:23:54 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:21:31.108 12:23:54 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:31.108 12:23:54 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:21:31.108 12:23:54 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:21:31.108 12:23:54 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:31.108 12:23:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:31.108 ************************************ 00:21:31.108 START TEST raid_state_function_test 00:21:31.108 ************************************ 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 false 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:31.108 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:31.367 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:31.367 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:31.367 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 
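raid_state_function_test asserts the raid bdev's lifecycle: creating a raid whose base bdevs do not exist yet leaves it in state "configuring"; once every base bdev is registered it flips to "online"; and deleting a base bdev from a raid0 (which has no redundancy) drops it to "offline". The repeated verify_raid_bdev_state checks below parse bdev_raid_get_bdevs output with jq; a condensed sketch of that check (check_state is a name invented here, not a helper from the repo):

    RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Pull one raid's entry from bdev_raid_get_bdevs and compare its state.
    check_state() {
        local name=$1 expected=$2 info
        info=$($RPC bdev_raid_get_bdevs all |
               jq -r ".[] | select(.name == \"$name\")")
        [ "$(jq -r '.state' <<< "$info")" = "$expected" ]
    }

    check_state Existed_Raid configuring   # base bdevs not created yet
    # ...bdev_malloc_create BaseBdev1 and BaseBdev2 -> raid assembles...
    check_state Existed_Raid online
    # ...bdev_malloc_delete BaseBdev1: raid0 cannot run degraded...
    check_state Existed_Raid offline

The same JSON carries num_base_bdevs_discovered and num_base_bdevs_operational, which the trace shows moving from 0/2 to 2/2 and finally 1/1 as the base bdevs come and go.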
00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=198043 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 198043' 00:21:31.368 Process raid pid: 198043 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 198043 /var/tmp/spdk-raid.sock 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 198043 ']' 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:31.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:31.368 12:23:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:31.368 [2024-06-07 12:23:54.794747] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:21:31.368 [2024-06-07 12:23:54.795261] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:31.368 [2024-06-07 12:23:54.943899] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:31.626 [2024-06-07 12:23:55.040653] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:31.626 [2024-06-07 12:23:55.125294] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:32.193 12:23:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:32.193 12:23:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:21:32.193 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:32.452 [2024-06-07 12:23:55.917297] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:32.452 [2024-06-07 12:23:55.917994] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:32.452 [2024-06-07 12:23:55.918141] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:32.452 [2024-06-07 12:23:55.918342] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:32.452 12:23:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:32.452 12:23:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.713 12:23:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.713 "name": "Existed_Raid", 00:21:32.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.713 "strip_size_kb": 64, 00:21:32.713 "state": "configuring", 00:21:32.713 "raid_level": "raid0", 00:21:32.713 "superblock": false, 00:21:32.713 "num_base_bdevs": 2, 00:21:32.713 "num_base_bdevs_discovered": 0, 00:21:32.713 "num_base_bdevs_operational": 2, 00:21:32.713 "base_bdevs_list": [ 00:21:32.713 { 00:21:32.713 "name": "BaseBdev1", 00:21:32.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.713 "is_configured": false, 00:21:32.713 "data_offset": 0, 00:21:32.713 "data_size": 0 00:21:32.713 }, 00:21:32.713 { 00:21:32.713 "name": "BaseBdev2", 00:21:32.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.713 "is_configured": false, 00:21:32.713 "data_offset": 0, 00:21:32.713 "data_size": 0 00:21:32.713 } 00:21:32.713 ] 00:21:32.713 }' 00:21:32.713 12:23:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.713 12:23:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:33.279 12:23:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:33.846 [2024-06-07 12:23:57.221532] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:33.846 [2024-06-07 12:23:57.221795] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:21:33.846 12:23:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:34.105 [2024-06-07 12:23:57.505662] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:34.105 [2024-06-07 12:23:57.505990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:34.105 [2024-06-07 12:23:57.506110] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:34.105 [2024-06-07 12:23:57.506182] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:34.105 12:23:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:34.364 [2024-06-07 12:23:57.813131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:34.364 BaseBdev1 00:21:34.364 12:23:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:34.364 12:23:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:21:34.364 12:23:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:34.364 12:23:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:21:34.364 12:23:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:34.364 12:23:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:34.364 12:23:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:34.622 12:23:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:34.881 [ 00:21:34.881 { 00:21:34.881 "name": "BaseBdev1", 00:21:34.881 "aliases": [ 00:21:34.881 "e52ce2cf-2e5e-4037-bba0-9a735a15faab" 00:21:34.881 ], 00:21:34.881 "product_name": "Malloc disk", 00:21:34.881 "block_size": 512, 00:21:34.881 "num_blocks": 65536, 00:21:34.881 "uuid": "e52ce2cf-2e5e-4037-bba0-9a735a15faab", 00:21:34.881 "assigned_rate_limits": { 00:21:34.881 "rw_ios_per_sec": 0, 00:21:34.881 "rw_mbytes_per_sec": 0, 00:21:34.881 "r_mbytes_per_sec": 0, 00:21:34.881 "w_mbytes_per_sec": 0 00:21:34.881 }, 00:21:34.881 "claimed": true, 00:21:34.881 "claim_type": "exclusive_write", 00:21:34.881 "zoned": false, 00:21:34.881 "supported_io_types": { 00:21:34.881 "read": true, 00:21:34.881 "write": true, 00:21:34.881 "unmap": true, 00:21:34.881 "write_zeroes": true, 00:21:34.881 "flush": true, 00:21:34.881 "reset": true, 00:21:34.881 "compare": false, 00:21:34.881 "compare_and_write": false, 00:21:34.881 "abort": true, 00:21:34.881 "nvme_admin": false, 00:21:34.881 "nvme_io": false 00:21:34.881 }, 00:21:34.881 "memory_domains": [ 00:21:34.881 { 00:21:34.881 "dma_device_id": "system", 00:21:34.881 "dma_device_type": 1 00:21:34.881 }, 00:21:34.881 { 00:21:34.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.881 "dma_device_type": 2 00:21:34.881 } 00:21:34.881 ], 00:21:34.881 "driver_specific": {} 00:21:34.881 } 00:21:34.881 ] 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:34.881 12:23:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.881 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:35.140 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.140 "name": "Existed_Raid", 00:21:35.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.140 "strip_size_kb": 64, 00:21:35.140 "state": "configuring", 00:21:35.140 "raid_level": "raid0", 00:21:35.140 "superblock": false, 00:21:35.140 "num_base_bdevs": 2, 00:21:35.140 "num_base_bdevs_discovered": 1, 00:21:35.140 "num_base_bdevs_operational": 2, 00:21:35.140 "base_bdevs_list": [ 00:21:35.140 { 00:21:35.140 "name": "BaseBdev1", 00:21:35.140 "uuid": "e52ce2cf-2e5e-4037-bba0-9a735a15faab", 00:21:35.140 "is_configured": true, 00:21:35.140 "data_offset": 0, 00:21:35.140 "data_size": 65536 00:21:35.140 }, 00:21:35.140 { 00:21:35.140 "name": "BaseBdev2", 00:21:35.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.140 "is_configured": false, 00:21:35.140 "data_offset": 0, 00:21:35.140 "data_size": 0 00:21:35.140 } 00:21:35.140 ] 00:21:35.140 }' 00:21:35.140 12:23:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.140 12:23:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:35.709 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:35.709 [2024-06-07 12:23:59.321429] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:35.709 [2024-06-07 12:23:59.321671] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:21:35.709 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:35.968 [2024-06-07 12:23:59.525556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:35.968 [2024-06-07 12:23:59.527787] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:35.968 [2024-06-07 12:23:59.527994] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.968 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:36.227 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.227 "name": "Existed_Raid", 00:21:36.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.227 "strip_size_kb": 64, 00:21:36.227 "state": "configuring", 00:21:36.227 "raid_level": "raid0", 00:21:36.228 "superblock": false, 00:21:36.228 "num_base_bdevs": 2, 00:21:36.228 "num_base_bdevs_discovered": 1, 00:21:36.228 "num_base_bdevs_operational": 2, 00:21:36.228 "base_bdevs_list": [ 00:21:36.228 { 00:21:36.228 "name": "BaseBdev1", 00:21:36.228 "uuid": "e52ce2cf-2e5e-4037-bba0-9a735a15faab", 00:21:36.228 "is_configured": true, 00:21:36.228 "data_offset": 0, 00:21:36.228 "data_size": 65536 00:21:36.228 }, 00:21:36.228 { 00:21:36.228 "name": "BaseBdev2", 00:21:36.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.228 "is_configured": false, 00:21:36.228 "data_offset": 0, 00:21:36.228 "data_size": 0 00:21:36.228 } 00:21:36.228 ] 00:21:36.228 }' 00:21:36.228 12:23:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.228 12:23:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:36.795 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:37.055 [2024-06-07 12:24:00.484034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:37.055 [2024-06-07 12:24:00.484302] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:21:37.055 [2024-06-07 12:24:00.484354] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:21:37.055 [2024-06-07 12:24:00.484606] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:21:37.055 [2024-06-07 12:24:00.485088] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:21:37.055 [2024-06-07 12:24:00.485253] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:21:37.055 [2024-06-07 12:24:00.485623] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:37.055 BaseBdev2 00:21:37.055 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:37.055 
12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:21:37.055 12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:37.055 12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:21:37.055 12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:37.055 12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:37.055 12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:37.314 [ 00:21:37.314 { 00:21:37.314 "name": "BaseBdev2", 00:21:37.314 "aliases": [ 00:21:37.314 "c0c0a90c-f88e-44ef-b108-dda48b28956e" 00:21:37.314 ], 00:21:37.314 "product_name": "Malloc disk", 00:21:37.314 "block_size": 512, 00:21:37.314 "num_blocks": 65536, 00:21:37.314 "uuid": "c0c0a90c-f88e-44ef-b108-dda48b28956e", 00:21:37.314 "assigned_rate_limits": { 00:21:37.314 "rw_ios_per_sec": 0, 00:21:37.314 "rw_mbytes_per_sec": 0, 00:21:37.314 "r_mbytes_per_sec": 0, 00:21:37.314 "w_mbytes_per_sec": 0 00:21:37.314 }, 00:21:37.314 "claimed": true, 00:21:37.314 "claim_type": "exclusive_write", 00:21:37.314 "zoned": false, 00:21:37.314 "supported_io_types": { 00:21:37.314 "read": true, 00:21:37.314 "write": true, 00:21:37.314 "unmap": true, 00:21:37.314 "write_zeroes": true, 00:21:37.314 "flush": true, 00:21:37.314 "reset": true, 00:21:37.314 "compare": false, 00:21:37.314 "compare_and_write": false, 00:21:37.314 "abort": true, 00:21:37.314 "nvme_admin": false, 00:21:37.314 "nvme_io": false 00:21:37.314 }, 00:21:37.314 "memory_domains": [ 00:21:37.314 { 00:21:37.314 "dma_device_id": "system", 00:21:37.314 "dma_device_type": 1 00:21:37.314 }, 00:21:37.314 { 00:21:37.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.314 "dma_device_type": 2 00:21:37.314 } 00:21:37.314 ], 00:21:37.314 "driver_specific": {} 00:21:37.314 } 00:21:37.314 ] 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.314 
12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.314 12:24:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:37.881 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.881 "name": "Existed_Raid", 00:21:37.881 "uuid": "c618f26c-dd83-4bec-88c6-4969928f2633", 00:21:37.881 "strip_size_kb": 64, 00:21:37.881 "state": "online", 00:21:37.881 "raid_level": "raid0", 00:21:37.881 "superblock": false, 00:21:37.881 "num_base_bdevs": 2, 00:21:37.881 "num_base_bdevs_discovered": 2, 00:21:37.881 "num_base_bdevs_operational": 2, 00:21:37.881 "base_bdevs_list": [ 00:21:37.881 { 00:21:37.881 "name": "BaseBdev1", 00:21:37.881 "uuid": "e52ce2cf-2e5e-4037-bba0-9a735a15faab", 00:21:37.881 "is_configured": true, 00:21:37.881 "data_offset": 0, 00:21:37.881 "data_size": 65536 00:21:37.881 }, 00:21:37.881 { 00:21:37.881 "name": "BaseBdev2", 00:21:37.881 "uuid": "c0c0a90c-f88e-44ef-b108-dda48b28956e", 00:21:37.881 "is_configured": true, 00:21:37.881 "data_offset": 0, 00:21:37.881 "data_size": 65536 00:21:37.881 } 00:21:37.881 ] 00:21:37.881 }' 00:21:37.881 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.881 12:24:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:38.139 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:38.398 [2024-06-07 12:24:01.976578] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:38.398 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:38.398 "name": "Existed_Raid", 00:21:38.398 "aliases": [ 00:21:38.398 "c618f26c-dd83-4bec-88c6-4969928f2633" 00:21:38.398 ], 00:21:38.398 "product_name": "Raid Volume", 00:21:38.398 "block_size": 512, 00:21:38.398 "num_blocks": 131072, 00:21:38.398 "uuid": "c618f26c-dd83-4bec-88c6-4969928f2633", 00:21:38.398 "assigned_rate_limits": { 00:21:38.398 "rw_ios_per_sec": 0, 00:21:38.398 "rw_mbytes_per_sec": 0, 00:21:38.398 "r_mbytes_per_sec": 0, 00:21:38.398 "w_mbytes_per_sec": 0 00:21:38.398 }, 00:21:38.398 "claimed": false, 00:21:38.398 "zoned": false, 00:21:38.398 "supported_io_types": { 00:21:38.398 "read": true, 00:21:38.398 "write": true, 00:21:38.398 
"unmap": true, 00:21:38.398 "write_zeroes": true, 00:21:38.398 "flush": true, 00:21:38.398 "reset": true, 00:21:38.398 "compare": false, 00:21:38.398 "compare_and_write": false, 00:21:38.398 "abort": false, 00:21:38.398 "nvme_admin": false, 00:21:38.398 "nvme_io": false 00:21:38.398 }, 00:21:38.398 "memory_domains": [ 00:21:38.398 { 00:21:38.398 "dma_device_id": "system", 00:21:38.398 "dma_device_type": 1 00:21:38.398 }, 00:21:38.398 { 00:21:38.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.398 "dma_device_type": 2 00:21:38.398 }, 00:21:38.398 { 00:21:38.398 "dma_device_id": "system", 00:21:38.398 "dma_device_type": 1 00:21:38.398 }, 00:21:38.398 { 00:21:38.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.398 "dma_device_type": 2 00:21:38.398 } 00:21:38.398 ], 00:21:38.398 "driver_specific": { 00:21:38.398 "raid": { 00:21:38.398 "uuid": "c618f26c-dd83-4bec-88c6-4969928f2633", 00:21:38.398 "strip_size_kb": 64, 00:21:38.398 "state": "online", 00:21:38.398 "raid_level": "raid0", 00:21:38.398 "superblock": false, 00:21:38.398 "num_base_bdevs": 2, 00:21:38.398 "num_base_bdevs_discovered": 2, 00:21:38.398 "num_base_bdevs_operational": 2, 00:21:38.398 "base_bdevs_list": [ 00:21:38.398 { 00:21:38.398 "name": "BaseBdev1", 00:21:38.398 "uuid": "e52ce2cf-2e5e-4037-bba0-9a735a15faab", 00:21:38.398 "is_configured": true, 00:21:38.398 "data_offset": 0, 00:21:38.398 "data_size": 65536 00:21:38.398 }, 00:21:38.398 { 00:21:38.398 "name": "BaseBdev2", 00:21:38.398 "uuid": "c0c0a90c-f88e-44ef-b108-dda48b28956e", 00:21:38.398 "is_configured": true, 00:21:38.398 "data_offset": 0, 00:21:38.398 "data_size": 65536 00:21:38.398 } 00:21:38.398 ] 00:21:38.398 } 00:21:38.398 } 00:21:38.398 }' 00:21:38.398 12:24:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:38.657 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:38.657 BaseBdev2' 00:21:38.657 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:38.657 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:38.657 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:38.915 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:38.915 "name": "BaseBdev1", 00:21:38.915 "aliases": [ 00:21:38.915 "e52ce2cf-2e5e-4037-bba0-9a735a15faab" 00:21:38.915 ], 00:21:38.915 "product_name": "Malloc disk", 00:21:38.915 "block_size": 512, 00:21:38.915 "num_blocks": 65536, 00:21:38.915 "uuid": "e52ce2cf-2e5e-4037-bba0-9a735a15faab", 00:21:38.915 "assigned_rate_limits": { 00:21:38.915 "rw_ios_per_sec": 0, 00:21:38.915 "rw_mbytes_per_sec": 0, 00:21:38.915 "r_mbytes_per_sec": 0, 00:21:38.915 "w_mbytes_per_sec": 0 00:21:38.915 }, 00:21:38.915 "claimed": true, 00:21:38.915 "claim_type": "exclusive_write", 00:21:38.915 "zoned": false, 00:21:38.915 "supported_io_types": { 00:21:38.915 "read": true, 00:21:38.915 "write": true, 00:21:38.915 "unmap": true, 00:21:38.915 "write_zeroes": true, 00:21:38.915 "flush": true, 00:21:38.916 "reset": true, 00:21:38.916 "compare": false, 00:21:38.916 "compare_and_write": false, 00:21:38.916 "abort": true, 00:21:38.916 "nvme_admin": false, 00:21:38.916 "nvme_io": false 00:21:38.916 }, 00:21:38.916 "memory_domains": [ 
00:21:38.916 { 00:21:38.916 "dma_device_id": "system", 00:21:38.916 "dma_device_type": 1 00:21:38.916 }, 00:21:38.916 { 00:21:38.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.916 "dma_device_type": 2 00:21:38.916 } 00:21:38.916 ], 00:21:38.916 "driver_specific": {} 00:21:38.916 }' 00:21:38.916 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.916 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.916 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:38.916 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.916 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:39.175 12:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:39.434 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:39.434 "name": "BaseBdev2", 00:21:39.434 "aliases": [ 00:21:39.434 "c0c0a90c-f88e-44ef-b108-dda48b28956e" 00:21:39.434 ], 00:21:39.434 "product_name": "Malloc disk", 00:21:39.434 "block_size": 512, 00:21:39.434 "num_blocks": 65536, 00:21:39.434 "uuid": "c0c0a90c-f88e-44ef-b108-dda48b28956e", 00:21:39.434 "assigned_rate_limits": { 00:21:39.434 "rw_ios_per_sec": 0, 00:21:39.434 "rw_mbytes_per_sec": 0, 00:21:39.434 "r_mbytes_per_sec": 0, 00:21:39.434 "w_mbytes_per_sec": 0 00:21:39.434 }, 00:21:39.434 "claimed": true, 00:21:39.434 "claim_type": "exclusive_write", 00:21:39.434 "zoned": false, 00:21:39.434 "supported_io_types": { 00:21:39.434 "read": true, 00:21:39.434 "write": true, 00:21:39.434 "unmap": true, 00:21:39.434 "write_zeroes": true, 00:21:39.434 "flush": true, 00:21:39.434 "reset": true, 00:21:39.434 "compare": false, 00:21:39.434 "compare_and_write": false, 00:21:39.434 "abort": true, 00:21:39.434 "nvme_admin": false, 00:21:39.434 "nvme_io": false 00:21:39.434 }, 00:21:39.434 "memory_domains": [ 00:21:39.434 { 00:21:39.434 "dma_device_id": "system", 00:21:39.434 "dma_device_type": 1 00:21:39.434 }, 00:21:39.434 { 00:21:39.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:39.434 "dma_device_type": 2 00:21:39.434 } 00:21:39.434 ], 00:21:39.434 "driver_specific": {} 00:21:39.434 }' 00:21:39.434 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
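The @205-@208 jq checks running through this stretch are verify_raid_bdev_properties: it reads the raid volume's descriptor and, for each claimed base bdev, asserts that block_size, md_size, md_interleave and dif_type all match between the two, hence the repeated [[ 512 == 512 ]] and [[ null == null ]] comparisons. A condensed sketch of that comparison, under the assumption that it is a plain field-for-field equality check:

    RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    raid_json=$($RPC bdev_get_bdevs -b Existed_Raid)
    for base in BaseBdev1 BaseBdev2; do
        base_json=$($RPC bdev_get_bdevs -b "$base")
        for field in block_size md_size md_interleave dif_type; do
            r=$(jq ".[].$field" <<< "$raid_json")
            b=$(jq ".[].$field" <<< "$base_json")
            [ "$r" = "$b" ] || echo "$field mismatch: raid=$r base=$b"
        done
    done

Fields absent from the descriptor (md_size, md_interleave, dif_type on plain malloc bdevs) come back from jq as null on both sides, which is why the null == null checks still count as matches.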
00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.694 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.951 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:39.951 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:40.208 [2024-06-07 12:24:03.608776] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:40.208 [2024-06-07 12:24:03.609050] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:40.208 [2024-06-07 12:24:03.609294] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:40.208 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:21:40.466 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.466 "name": "Existed_Raid", 00:21:40.466 "uuid": "c618f26c-dd83-4bec-88c6-4969928f2633", 00:21:40.466 "strip_size_kb": 64, 00:21:40.466 "state": "offline", 00:21:40.466 "raid_level": "raid0", 00:21:40.466 "superblock": false, 00:21:40.466 "num_base_bdevs": 2, 00:21:40.466 "num_base_bdevs_discovered": 1, 00:21:40.466 "num_base_bdevs_operational": 1, 00:21:40.466 "base_bdevs_list": [ 00:21:40.466 { 00:21:40.466 "name": null, 00:21:40.466 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.466 "is_configured": false, 00:21:40.466 "data_offset": 0, 00:21:40.466 "data_size": 65536 00:21:40.466 }, 00:21:40.466 { 00:21:40.466 "name": "BaseBdev2", 00:21:40.466 "uuid": "c0c0a90c-f88e-44ef-b108-dda48b28956e", 00:21:40.466 "is_configured": true, 00:21:40.466 "data_offset": 0, 00:21:40.466 "data_size": 65536 00:21:40.466 } 00:21:40.466 ] 00:21:40.466 }' 00:21:40.467 12:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.467 12:24:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:41.033 12:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:41.033 12:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:41.033 12:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.033 12:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:41.291 12:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:41.291 12:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:41.291 12:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:41.548 [2024-06-07 12:24:05.043020] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:41.548 [2024-06-07 12:24:05.043347] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:21:41.548 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:41.548 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:41.548 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:41.548 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.806 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:41.806 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:41.806 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 198043 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 198043 ']' 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 198043 00:21:41.807 12:24:05 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 198043 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 198043' 00:21:41.807 killing process with pid 198043 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 198043 00:21:41.807 [2024-06-07 12:24:05.405629] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:41.807 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 198043 00:21:41.807 [2024-06-07 12:24:05.405878] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:42.373 00:21:42.373 real 0m11.003s 00:21:42.373 user 0m19.423s 00:21:42.373 sys 0m1.843s 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.373 ************************************ 00:21:42.373 END TEST raid_state_function_test 00:21:42.373 ************************************ 00:21:42.373 12:24:05 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:21:42.373 12:24:05 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:21:42.373 12:24:05 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:42.373 12:24:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:42.373 ************************************ 00:21:42.373 START TEST raid_state_function_test_sb 00:21:42.373 ************************************ 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 true 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:42.373 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=198405 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 198405' 00:21:42.374 Process raid pid: 198405 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 198405 /var/tmp/spdk-raid.sock 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 198405 ']' 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:42.374 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:42.374 12:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:42.374 [2024-06-07 12:24:05.861344] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
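The only functional difference from the non-superblock run above is the -s passed through superblock_create_arg: with a superblock, the first 2048 blocks of each base bdev are reserved for metadata, which is why the JSON dumps in this test report data_offset 2048 and data_size 63488 instead of the 0 and 65536 seen earlier. Condensed to the bare RPC calls the trace below walks through (socket and script paths as in the trace), a sketch:

    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Creating the raid before its base bdevs exist is accepted; the bdev
    # sits in the "configuring" state until both bases are discovered.
    $RPC bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

    # 32 MiB malloc disks with 512-byte blocks: 65536 blocks each, of
    # which 2048 go to the superblock, leaving data_size 63488.
    $RPC bdev_malloc_create 32 512 -b BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2

    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'
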
00:21:42.374 [2024-06-07 12:24:05.861703] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:42.374 [2024-06-07 12:24:06.005529] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:42.633 [2024-06-07 12:24:06.102238] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:42.633 [2024-06-07 12:24:06.186979] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.200 12:24:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:43.200 12:24:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:21:43.200 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:43.458 [2024-06-07 12:24:06.962322] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:43.458 [2024-06-07 12:24:06.962667] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:43.458 [2024-06-07 12:24:06.962822] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:43.458 [2024-06-07 12:24:06.962906] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.458 12:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:43.779 12:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.779 "name": "Existed_Raid", 00:21:43.779 "uuid": "e1483db6-2348-48d2-b6de-6888d4b9406c", 00:21:43.779 "strip_size_kb": 64, 00:21:43.779 "state": "configuring", 00:21:43.779 "raid_level": "raid0", 00:21:43.779 "superblock": true, 00:21:43.779 "num_base_bdevs": 2, 00:21:43.779 "num_base_bdevs_discovered": 0, 00:21:43.779 "num_base_bdevs_operational": 2, 
00:21:43.779 "base_bdevs_list": [ 00:21:43.779 { 00:21:43.779 "name": "BaseBdev1", 00:21:43.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.779 "is_configured": false, 00:21:43.779 "data_offset": 0, 00:21:43.779 "data_size": 0 00:21:43.779 }, 00:21:43.779 { 00:21:43.779 "name": "BaseBdev2", 00:21:43.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.779 "is_configured": false, 00:21:43.779 "data_offset": 0, 00:21:43.779 "data_size": 0 00:21:43.779 } 00:21:43.779 ] 00:21:43.779 }' 00:21:43.779 12:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.779 12:24:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:44.381 12:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:44.639 [2024-06-07 12:24:08.042313] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:44.639 [2024-06-07 12:24:08.042590] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:21:44.639 12:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:44.925 [2024-06-07 12:24:08.326378] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:44.925 [2024-06-07 12:24:08.326708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:44.925 [2024-06-07 12:24:08.326839] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:44.925 [2024-06-07 12:24:08.326918] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:44.925 12:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:44.925 [2024-06-07 12:24:08.566172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:44.925 BaseBdev1 00:21:45.240 12:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:45.240 12:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:21:45.240 12:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:45.240 12:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:21:45.240 12:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:45.240 12:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:45.240 12:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:45.603 12:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:45.603 [ 00:21:45.603 { 00:21:45.603 "name": "BaseBdev1", 00:21:45.604 "aliases": [ 00:21:45.604 "3f3cd46b-be57-458f-bb97-0a410c94aa25" 00:21:45.604 ], 00:21:45.604 
"product_name": "Malloc disk", 00:21:45.604 "block_size": 512, 00:21:45.604 "num_blocks": 65536, 00:21:45.604 "uuid": "3f3cd46b-be57-458f-bb97-0a410c94aa25", 00:21:45.604 "assigned_rate_limits": { 00:21:45.604 "rw_ios_per_sec": 0, 00:21:45.604 "rw_mbytes_per_sec": 0, 00:21:45.604 "r_mbytes_per_sec": 0, 00:21:45.604 "w_mbytes_per_sec": 0 00:21:45.604 }, 00:21:45.604 "claimed": true, 00:21:45.604 "claim_type": "exclusive_write", 00:21:45.604 "zoned": false, 00:21:45.604 "supported_io_types": { 00:21:45.604 "read": true, 00:21:45.604 "write": true, 00:21:45.604 "unmap": true, 00:21:45.604 "write_zeroes": true, 00:21:45.604 "flush": true, 00:21:45.604 "reset": true, 00:21:45.604 "compare": false, 00:21:45.604 "compare_and_write": false, 00:21:45.604 "abort": true, 00:21:45.604 "nvme_admin": false, 00:21:45.604 "nvme_io": false 00:21:45.604 }, 00:21:45.604 "memory_domains": [ 00:21:45.604 { 00:21:45.604 "dma_device_id": "system", 00:21:45.604 "dma_device_type": 1 00:21:45.604 }, 00:21:45.604 { 00:21:45.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.604 "dma_device_type": 2 00:21:45.604 } 00:21:45.604 ], 00:21:45.604 "driver_specific": {} 00:21:45.604 } 00:21:45.604 ] 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.604 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.863 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.863 "name": "Existed_Raid", 00:21:45.863 "uuid": "a4cd36c5-e653-4b90-8d6b-890b8764939f", 00:21:45.863 "strip_size_kb": 64, 00:21:45.863 "state": "configuring", 00:21:45.863 "raid_level": "raid0", 00:21:45.863 "superblock": true, 00:21:45.863 "num_base_bdevs": 2, 00:21:45.863 "num_base_bdevs_discovered": 1, 00:21:45.863 "num_base_bdevs_operational": 2, 00:21:45.863 "base_bdevs_list": [ 00:21:45.863 { 00:21:45.863 "name": "BaseBdev1", 00:21:45.863 "uuid": "3f3cd46b-be57-458f-bb97-0a410c94aa25", 00:21:45.863 "is_configured": true, 00:21:45.863 "data_offset": 2048, 00:21:45.863 "data_size": 63488 00:21:45.863 }, 00:21:45.863 { 
00:21:45.863 "name": "BaseBdev2", 00:21:45.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.863 "is_configured": false, 00:21:45.863 "data_offset": 0, 00:21:45.863 "data_size": 0 00:21:45.863 } 00:21:45.863 ] 00:21:45.863 }' 00:21:45.863 12:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.863 12:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:46.435 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:46.695 [2024-06-07 12:24:10.310490] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:46.695 [2024-06-07 12:24:10.310807] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:21:46.695 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:46.954 [2024-06-07 12:24:10.522597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:46.954 [2024-06-07 12:24:10.524771] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:46.954 [2024-06-07 12:24:10.524949] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.954 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.218 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.218 "name": "Existed_Raid", 00:21:47.218 "uuid": "d00e7291-fc95-4660-9d97-444e0bca1dff", 00:21:47.218 "strip_size_kb": 64, 00:21:47.218 "state": "configuring", 00:21:47.218 
"raid_level": "raid0", 00:21:47.218 "superblock": true, 00:21:47.218 "num_base_bdevs": 2, 00:21:47.218 "num_base_bdevs_discovered": 1, 00:21:47.218 "num_base_bdevs_operational": 2, 00:21:47.218 "base_bdevs_list": [ 00:21:47.218 { 00:21:47.218 "name": "BaseBdev1", 00:21:47.218 "uuid": "3f3cd46b-be57-458f-bb97-0a410c94aa25", 00:21:47.218 "is_configured": true, 00:21:47.218 "data_offset": 2048, 00:21:47.218 "data_size": 63488 00:21:47.218 }, 00:21:47.218 { 00:21:47.218 "name": "BaseBdev2", 00:21:47.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.218 "is_configured": false, 00:21:47.218 "data_offset": 0, 00:21:47.218 "data_size": 0 00:21:47.218 } 00:21:47.218 ] 00:21:47.218 }' 00:21:47.218 12:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.218 12:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:47.796 12:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:48.362 [2024-06-07 12:24:11.708637] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:48.362 [2024-06-07 12:24:11.709193] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:21:48.362 [2024-06-07 12:24:11.709384] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:21:48.362 [2024-06-07 12:24:11.709591] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:21:48.362 [2024-06-07 12:24:11.710054] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:21:48.362 [2024-06-07 12:24:11.710193] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:21:48.362 [2024-06-07 12:24:11.710518] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:48.362 BaseBdev2 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:48.362 12:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:48.621 [ 00:21:48.621 { 00:21:48.621 "name": "BaseBdev2", 00:21:48.621 "aliases": [ 00:21:48.621 "3b916f84-c4fe-4bd9-8bb7-7bc7656d63fe" 00:21:48.621 ], 00:21:48.621 "product_name": "Malloc disk", 00:21:48.621 "block_size": 512, 00:21:48.621 "num_blocks": 65536, 00:21:48.621 "uuid": "3b916f84-c4fe-4bd9-8bb7-7bc7656d63fe", 00:21:48.621 "assigned_rate_limits": { 00:21:48.621 "rw_ios_per_sec": 0, 00:21:48.621 "rw_mbytes_per_sec": 0, 
00:21:48.621 "r_mbytes_per_sec": 0, 00:21:48.621 "w_mbytes_per_sec": 0 00:21:48.621 }, 00:21:48.621 "claimed": true, 00:21:48.621 "claim_type": "exclusive_write", 00:21:48.621 "zoned": false, 00:21:48.621 "supported_io_types": { 00:21:48.621 "read": true, 00:21:48.621 "write": true, 00:21:48.621 "unmap": true, 00:21:48.621 "write_zeroes": true, 00:21:48.621 "flush": true, 00:21:48.621 "reset": true, 00:21:48.621 "compare": false, 00:21:48.621 "compare_and_write": false, 00:21:48.621 "abort": true, 00:21:48.621 "nvme_admin": false, 00:21:48.621 "nvme_io": false 00:21:48.621 }, 00:21:48.621 "memory_domains": [ 00:21:48.621 { 00:21:48.621 "dma_device_id": "system", 00:21:48.621 "dma_device_type": 1 00:21:48.621 }, 00:21:48.621 { 00:21:48.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.621 "dma_device_type": 2 00:21:48.621 } 00:21:48.621 ], 00:21:48.621 "driver_specific": {} 00:21:48.621 } 00:21:48.621 ] 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.621 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.879 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.879 "name": "Existed_Raid", 00:21:48.879 "uuid": "d00e7291-fc95-4660-9d97-444e0bca1dff", 00:21:48.879 "strip_size_kb": 64, 00:21:48.879 "state": "online", 00:21:48.879 "raid_level": "raid0", 00:21:48.879 "superblock": true, 00:21:48.879 "num_base_bdevs": 2, 00:21:48.879 "num_base_bdevs_discovered": 2, 00:21:48.879 "num_base_bdevs_operational": 2, 00:21:48.879 "base_bdevs_list": [ 00:21:48.879 { 00:21:48.879 "name": "BaseBdev1", 00:21:48.879 "uuid": "3f3cd46b-be57-458f-bb97-0a410c94aa25", 00:21:48.879 "is_configured": true, 00:21:48.879 "data_offset": 2048, 00:21:48.879 "data_size": 63488 00:21:48.879 }, 00:21:48.879 { 00:21:48.879 "name": "BaseBdev2", 00:21:48.879 "uuid": 
"3b916f84-c4fe-4bd9-8bb7-7bc7656d63fe", 00:21:48.879 "is_configured": true, 00:21:48.879 "data_offset": 2048, 00:21:48.879 "data_size": 63488 00:21:48.879 } 00:21:48.879 ] 00:21:48.879 }' 00:21:48.879 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.879 12:24:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:49.447 12:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:49.704 [2024-06-07 12:24:13.145057] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:49.704 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:49.704 "name": "Existed_Raid", 00:21:49.704 "aliases": [ 00:21:49.704 "d00e7291-fc95-4660-9d97-444e0bca1dff" 00:21:49.704 ], 00:21:49.704 "product_name": "Raid Volume", 00:21:49.704 "block_size": 512, 00:21:49.704 "num_blocks": 126976, 00:21:49.704 "uuid": "d00e7291-fc95-4660-9d97-444e0bca1dff", 00:21:49.704 "assigned_rate_limits": { 00:21:49.704 "rw_ios_per_sec": 0, 00:21:49.704 "rw_mbytes_per_sec": 0, 00:21:49.704 "r_mbytes_per_sec": 0, 00:21:49.704 "w_mbytes_per_sec": 0 00:21:49.704 }, 00:21:49.704 "claimed": false, 00:21:49.704 "zoned": false, 00:21:49.704 "supported_io_types": { 00:21:49.704 "read": true, 00:21:49.704 "write": true, 00:21:49.704 "unmap": true, 00:21:49.704 "write_zeroes": true, 00:21:49.704 "flush": true, 00:21:49.704 "reset": true, 00:21:49.705 "compare": false, 00:21:49.705 "compare_and_write": false, 00:21:49.705 "abort": false, 00:21:49.705 "nvme_admin": false, 00:21:49.705 "nvme_io": false 00:21:49.705 }, 00:21:49.705 "memory_domains": [ 00:21:49.705 { 00:21:49.705 "dma_device_id": "system", 00:21:49.705 "dma_device_type": 1 00:21:49.705 }, 00:21:49.705 { 00:21:49.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.705 "dma_device_type": 2 00:21:49.705 }, 00:21:49.705 { 00:21:49.705 "dma_device_id": "system", 00:21:49.705 "dma_device_type": 1 00:21:49.705 }, 00:21:49.705 { 00:21:49.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.705 "dma_device_type": 2 00:21:49.705 } 00:21:49.705 ], 00:21:49.705 "driver_specific": { 00:21:49.705 "raid": { 00:21:49.705 "uuid": "d00e7291-fc95-4660-9d97-444e0bca1dff", 00:21:49.705 "strip_size_kb": 64, 00:21:49.705 "state": "online", 00:21:49.705 "raid_level": "raid0", 00:21:49.705 "superblock": true, 00:21:49.705 "num_base_bdevs": 2, 00:21:49.705 "num_base_bdevs_discovered": 2, 00:21:49.705 "num_base_bdevs_operational": 2, 00:21:49.705 "base_bdevs_list": [ 00:21:49.705 { 00:21:49.705 "name": "BaseBdev1", 00:21:49.705 "uuid": 
"3f3cd46b-be57-458f-bb97-0a410c94aa25", 00:21:49.705 "is_configured": true, 00:21:49.705 "data_offset": 2048, 00:21:49.705 "data_size": 63488 00:21:49.705 }, 00:21:49.705 { 00:21:49.705 "name": "BaseBdev2", 00:21:49.705 "uuid": "3b916f84-c4fe-4bd9-8bb7-7bc7656d63fe", 00:21:49.705 "is_configured": true, 00:21:49.705 "data_offset": 2048, 00:21:49.705 "data_size": 63488 00:21:49.705 } 00:21:49.705 ] 00:21:49.705 } 00:21:49.705 } 00:21:49.705 }' 00:21:49.705 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:49.705 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:49.705 BaseBdev2' 00:21:49.705 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:49.705 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:49.705 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:49.964 "name": "BaseBdev1", 00:21:49.964 "aliases": [ 00:21:49.964 "3f3cd46b-be57-458f-bb97-0a410c94aa25" 00:21:49.964 ], 00:21:49.964 "product_name": "Malloc disk", 00:21:49.964 "block_size": 512, 00:21:49.964 "num_blocks": 65536, 00:21:49.964 "uuid": "3f3cd46b-be57-458f-bb97-0a410c94aa25", 00:21:49.964 "assigned_rate_limits": { 00:21:49.964 "rw_ios_per_sec": 0, 00:21:49.964 "rw_mbytes_per_sec": 0, 00:21:49.964 "r_mbytes_per_sec": 0, 00:21:49.964 "w_mbytes_per_sec": 0 00:21:49.964 }, 00:21:49.964 "claimed": true, 00:21:49.964 "claim_type": "exclusive_write", 00:21:49.964 "zoned": false, 00:21:49.964 "supported_io_types": { 00:21:49.964 "read": true, 00:21:49.964 "write": true, 00:21:49.964 "unmap": true, 00:21:49.964 "write_zeroes": true, 00:21:49.964 "flush": true, 00:21:49.964 "reset": true, 00:21:49.964 "compare": false, 00:21:49.964 "compare_and_write": false, 00:21:49.964 "abort": true, 00:21:49.964 "nvme_admin": false, 00:21:49.964 "nvme_io": false 00:21:49.964 }, 00:21:49.964 "memory_domains": [ 00:21:49.964 { 00:21:49.964 "dma_device_id": "system", 00:21:49.964 "dma_device_type": 1 00:21:49.964 }, 00:21:49.964 { 00:21:49.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.964 "dma_device_type": 2 00:21:49.964 } 00:21:49.964 ], 00:21:49.964 "driver_specific": {} 00:21:49.964 }' 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:49.964 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.304 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.304 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:21:50.304 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.304 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.304 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:50.305 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:50.305 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:50.305 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:50.564 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:50.564 "name": "BaseBdev2", 00:21:50.564 "aliases": [ 00:21:50.564 "3b916f84-c4fe-4bd9-8bb7-7bc7656d63fe" 00:21:50.564 ], 00:21:50.564 "product_name": "Malloc disk", 00:21:50.564 "block_size": 512, 00:21:50.564 "num_blocks": 65536, 00:21:50.564 "uuid": "3b916f84-c4fe-4bd9-8bb7-7bc7656d63fe", 00:21:50.564 "assigned_rate_limits": { 00:21:50.564 "rw_ios_per_sec": 0, 00:21:50.564 "rw_mbytes_per_sec": 0, 00:21:50.564 "r_mbytes_per_sec": 0, 00:21:50.564 "w_mbytes_per_sec": 0 00:21:50.564 }, 00:21:50.564 "claimed": true, 00:21:50.564 "claim_type": "exclusive_write", 00:21:50.564 "zoned": false, 00:21:50.564 "supported_io_types": { 00:21:50.564 "read": true, 00:21:50.564 "write": true, 00:21:50.564 "unmap": true, 00:21:50.564 "write_zeroes": true, 00:21:50.564 "flush": true, 00:21:50.564 "reset": true, 00:21:50.564 "compare": false, 00:21:50.564 "compare_and_write": false, 00:21:50.564 "abort": true, 00:21:50.564 "nvme_admin": false, 00:21:50.564 "nvme_io": false 00:21:50.564 }, 00:21:50.564 "memory_domains": [ 00:21:50.564 { 00:21:50.564 "dma_device_id": "system", 00:21:50.564 "dma_device_type": 1 00:21:50.564 }, 00:21:50.564 { 00:21:50.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.564 "dma_device_type": 2 00:21:50.564 } 00:21:50.564 ], 00:21:50.564 "driver_specific": {} 00:21:50.564 }' 00:21:50.564 12:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:50.564 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.823 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.823 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:50.823 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:50.823 [2024-06-07 12:24:14.453154] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:50.823 [2024-06-07 12:24:14.453209] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:50.823 [2024-06-07 12:24:14.453271] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:51.082 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.083 "name": "Existed_Raid", 00:21:51.083 "uuid": "d00e7291-fc95-4660-9d97-444e0bca1dff", 00:21:51.083 "strip_size_kb": 64, 00:21:51.083 "state": "offline", 00:21:51.083 "raid_level": "raid0", 00:21:51.083 "superblock": true, 00:21:51.083 "num_base_bdevs": 2, 00:21:51.083 "num_base_bdevs_discovered": 1, 00:21:51.083 "num_base_bdevs_operational": 1, 00:21:51.083 "base_bdevs_list": [ 00:21:51.083 { 00:21:51.083 "name": null, 00:21:51.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.083 "is_configured": false, 00:21:51.083 "data_offset": 2048, 00:21:51.083 "data_size": 63488 00:21:51.083 }, 00:21:51.083 { 00:21:51.083 "name": "BaseBdev2", 00:21:51.083 "uuid": "3b916f84-c4fe-4bd9-8bb7-7bc7656d63fe", 00:21:51.083 "is_configured": true, 00:21:51.083 "data_offset": 2048, 00:21:51.083 "data_size": 63488 00:21:51.083 } 00:21:51.083 ] 00:21:51.083 }' 00:21:51.083 12:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.083 12:24:14 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:52.020 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:52.020 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:52.020 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:52.020 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.020 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:52.020 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:52.020 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:52.279 [2024-06-07 12:24:15.766820] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:52.279 [2024-06-07 12:24:15.766913] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:21:52.279 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:52.279 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:52.279 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.279 12:24:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 198405 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 198405 ']' 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 198405 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 198405 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:52.538 killing process with pid 198405 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 198405' 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 198405 00:21:52.538 [2024-06-07 12:24:16.050627] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:52.538 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 
-- # wait 198405 00:21:52.538 [2024-06-07 12:24:16.050764] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:52.848 12:24:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:52.848 00:21:52.848 real 0m10.590s 00:21:52.848 user 0m18.753s 00:21:52.848 sys 0m1.777s 00:21:52.848 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:52.848 12:24:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:52.848 ************************************ 00:21:52.848 END TEST raid_state_function_test_sb 00:21:52.848 ************************************ 00:21:52.848 12:24:16 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:21:52.848 12:24:16 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:21:52.848 12:24:16 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:52.848 12:24:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:52.848 ************************************ 00:21:52.848 START TEST raid_superblock_test 00:21:52.848 ************************************ 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 2 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:21:52.848 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=198778 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 198778 /var/tmp/spdk-raid.sock 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 198778 ']' 00:21:53.110 12:24:16 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:53.110 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:53.110 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:53.110 [2024-06-07 12:24:16.512595] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:21:53.110 [2024-06-07 12:24:16.512968] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid198778 ] 00:21:53.110 [2024-06-07 12:24:16.660468] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:53.369 [2024-06-07 12:24:16.756834] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:53.369 [2024-06-07 12:24:16.842049] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:53.369 12:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:53.628 malloc1 00:21:53.628 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:53.887 [2024-06-07 12:24:17.440883] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:53.887 [2024-06-07 12:24:17.441044] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.887 [2024-06-07 12:24:17.441094] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:21:53.887 [2024-06-07 12:24:17.441170] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.887 [2024-06-07 12:24:17.443771] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.887 [2024-06-07 12:24:17.443864] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:53.887 pt1 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:53.887 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:54.145 malloc2 00:21:54.145 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:54.403 [2024-06-07 12:24:17.892745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:54.403 [2024-06-07 12:24:17.892857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:54.403 [2024-06-07 12:24:17.892907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:21:54.403 [2024-06-07 12:24:17.892959] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:54.403 [2024-06-07 12:24:17.895322] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:54.403 [2024-06-07 12:24:17.895388] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:54.403 pt2 00:21:54.403 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:54.403 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:54.403 12:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:21:54.661 [2024-06-07 12:24:18.112832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:54.661 [2024-06-07 12:24:18.114967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:54.661 [2024-06-07 12:24:18.115142] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006c80 00:21:54.661 [2024-06-07 12:24:18.115157] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:21:54.661 [2024-06-07 12:24:18.115340] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:21:54.661 [2024-06-07 12:24:18.115678] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006c80 00:21:54.661 [2024-06-07 12:24:18.115699] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is 
created with name raid_bdev1, raid_bdev 0x616000006c80 00:21:54.661 [2024-06-07 12:24:18.115836] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.661 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.919 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.919 "name": "raid_bdev1", 00:21:54.919 "uuid": "ec784e5e-4320-423b-9aa5-5daa78c97e2d", 00:21:54.919 "strip_size_kb": 64, 00:21:54.919 "state": "online", 00:21:54.919 "raid_level": "raid0", 00:21:54.919 "superblock": true, 00:21:54.919 "num_base_bdevs": 2, 00:21:54.919 "num_base_bdevs_discovered": 2, 00:21:54.919 "num_base_bdevs_operational": 2, 00:21:54.919 "base_bdevs_list": [ 00:21:54.919 { 00:21:54.919 "name": "pt1", 00:21:54.919 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:54.919 "is_configured": true, 00:21:54.919 "data_offset": 2048, 00:21:54.919 "data_size": 63488 00:21:54.919 }, 00:21:54.919 { 00:21:54.919 "name": "pt2", 00:21:54.919 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:54.919 "is_configured": true, 00:21:54.919 "data_offset": 2048, 00:21:54.919 "data_size": 63488 00:21:54.919 } 00:21:54.919 ] 00:21:54.919 }' 00:21:54.919 12:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.919 12:24:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
raid_bdev1 00:21:55.486 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:55.745 [2024-06-07 12:24:19.181041] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:55.745 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:55.745 "name": "raid_bdev1", 00:21:55.745 "aliases": [ 00:21:55.745 "ec784e5e-4320-423b-9aa5-5daa78c97e2d" 00:21:55.745 ], 00:21:55.745 "product_name": "Raid Volume", 00:21:55.746 "block_size": 512, 00:21:55.746 "num_blocks": 126976, 00:21:55.746 "uuid": "ec784e5e-4320-423b-9aa5-5daa78c97e2d", 00:21:55.746 "assigned_rate_limits": { 00:21:55.746 "rw_ios_per_sec": 0, 00:21:55.746 "rw_mbytes_per_sec": 0, 00:21:55.746 "r_mbytes_per_sec": 0, 00:21:55.746 "w_mbytes_per_sec": 0 00:21:55.746 }, 00:21:55.746 "claimed": false, 00:21:55.746 "zoned": false, 00:21:55.746 "supported_io_types": { 00:21:55.746 "read": true, 00:21:55.746 "write": true, 00:21:55.746 "unmap": true, 00:21:55.746 "write_zeroes": true, 00:21:55.746 "flush": true, 00:21:55.746 "reset": true, 00:21:55.746 "compare": false, 00:21:55.746 "compare_and_write": false, 00:21:55.746 "abort": false, 00:21:55.746 "nvme_admin": false, 00:21:55.746 "nvme_io": false 00:21:55.746 }, 00:21:55.746 "memory_domains": [ 00:21:55.746 { 00:21:55.746 "dma_device_id": "system", 00:21:55.746 "dma_device_type": 1 00:21:55.746 }, 00:21:55.746 { 00:21:55.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.746 "dma_device_type": 2 00:21:55.746 }, 00:21:55.746 { 00:21:55.746 "dma_device_id": "system", 00:21:55.746 "dma_device_type": 1 00:21:55.746 }, 00:21:55.746 { 00:21:55.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.746 "dma_device_type": 2 00:21:55.746 } 00:21:55.746 ], 00:21:55.746 "driver_specific": { 00:21:55.746 "raid": { 00:21:55.746 "uuid": "ec784e5e-4320-423b-9aa5-5daa78c97e2d", 00:21:55.746 "strip_size_kb": 64, 00:21:55.746 "state": "online", 00:21:55.746 "raid_level": "raid0", 00:21:55.746 "superblock": true, 00:21:55.746 "num_base_bdevs": 2, 00:21:55.746 "num_base_bdevs_discovered": 2, 00:21:55.746 "num_base_bdevs_operational": 2, 00:21:55.746 "base_bdevs_list": [ 00:21:55.746 { 00:21:55.746 "name": "pt1", 00:21:55.746 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:55.746 "is_configured": true, 00:21:55.746 "data_offset": 2048, 00:21:55.746 "data_size": 63488 00:21:55.746 }, 00:21:55.746 { 00:21:55.746 "name": "pt2", 00:21:55.746 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:55.746 "is_configured": true, 00:21:55.746 "data_offset": 2048, 00:21:55.746 "data_size": 63488 00:21:55.746 } 00:21:55.746 ] 00:21:55.746 } 00:21:55.746 } 00:21:55.746 }' 00:21:55.746 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:55.746 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:55.746 pt2' 00:21:55.746 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:55.746 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:55.746 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:56.005 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:56.005 "name": "pt1", 00:21:56.005 "aliases": [ 00:21:56.005 
"00000000-0000-0000-0000-000000000001" 00:21:56.005 ], 00:21:56.005 "product_name": "passthru", 00:21:56.005 "block_size": 512, 00:21:56.005 "num_blocks": 65536, 00:21:56.005 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:56.005 "assigned_rate_limits": { 00:21:56.005 "rw_ios_per_sec": 0, 00:21:56.005 "rw_mbytes_per_sec": 0, 00:21:56.005 "r_mbytes_per_sec": 0, 00:21:56.005 "w_mbytes_per_sec": 0 00:21:56.005 }, 00:21:56.005 "claimed": true, 00:21:56.005 "claim_type": "exclusive_write", 00:21:56.005 "zoned": false, 00:21:56.005 "supported_io_types": { 00:21:56.005 "read": true, 00:21:56.005 "write": true, 00:21:56.005 "unmap": true, 00:21:56.005 "write_zeroes": true, 00:21:56.005 "flush": true, 00:21:56.005 "reset": true, 00:21:56.005 "compare": false, 00:21:56.005 "compare_and_write": false, 00:21:56.005 "abort": true, 00:21:56.005 "nvme_admin": false, 00:21:56.005 "nvme_io": false 00:21:56.005 }, 00:21:56.005 "memory_domains": [ 00:21:56.005 { 00:21:56.005 "dma_device_id": "system", 00:21:56.005 "dma_device_type": 1 00:21:56.005 }, 00:21:56.005 { 00:21:56.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.005 "dma_device_type": 2 00:21:56.005 } 00:21:56.005 ], 00:21:56.005 "driver_specific": { 00:21:56.005 "passthru": { 00:21:56.005 "name": "pt1", 00:21:56.005 "base_bdev_name": "malloc1" 00:21:56.005 } 00:21:56.005 } 00:21:56.005 }' 00:21:56.005 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.005 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.005 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:56.005 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.005 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:56.278 12:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:56.537 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:56.537 "name": "pt2", 00:21:56.537 "aliases": [ 00:21:56.537 "00000000-0000-0000-0000-000000000002" 00:21:56.537 ], 00:21:56.537 "product_name": "passthru", 00:21:56.537 "block_size": 512, 00:21:56.537 "num_blocks": 65536, 00:21:56.537 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:56.537 "assigned_rate_limits": { 00:21:56.537 "rw_ios_per_sec": 0, 00:21:56.537 "rw_mbytes_per_sec": 0, 00:21:56.537 "r_mbytes_per_sec": 0, 00:21:56.537 "w_mbytes_per_sec": 0 00:21:56.537 }, 00:21:56.537 "claimed": true, 
00:21:56.537 "claim_type": "exclusive_write", 00:21:56.537 "zoned": false, 00:21:56.537 "supported_io_types": { 00:21:56.537 "read": true, 00:21:56.537 "write": true, 00:21:56.537 "unmap": true, 00:21:56.537 "write_zeroes": true, 00:21:56.537 "flush": true, 00:21:56.537 "reset": true, 00:21:56.537 "compare": false, 00:21:56.537 "compare_and_write": false, 00:21:56.537 "abort": true, 00:21:56.537 "nvme_admin": false, 00:21:56.537 "nvme_io": false 00:21:56.537 }, 00:21:56.537 "memory_domains": [ 00:21:56.537 { 00:21:56.537 "dma_device_id": "system", 00:21:56.537 "dma_device_type": 1 00:21:56.537 }, 00:21:56.537 { 00:21:56.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.537 "dma_device_type": 2 00:21:56.537 } 00:21:56.537 ], 00:21:56.537 "driver_specific": { 00:21:56.537 "passthru": { 00:21:56.537 "name": "pt2", 00:21:56.537 "base_bdev_name": "malloc2" 00:21:56.537 } 00:21:56.537 } 00:21:56.537 }' 00:21:56.537 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.537 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.795 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.055 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:57.055 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:57.055 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:57.313 [2024-06-07 12:24:20.813935] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:57.313 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ec784e5e-4320-423b-9aa5-5daa78c97e2d 00:21:57.313 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ec784e5e-4320-423b-9aa5-5daa78c97e2d ']' 00:21:57.313 12:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:57.572 [2024-06-07 12:24:21.205799] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:57.572 [2024-06-07 12:24:21.206104] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:57.572 [2024-06-07 12:24:21.206323] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:57.572 [2024-06-07 12:24:21.206420] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:57.572 [2024-06-07 12:24:21.206610] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006c80 name raid_bdev1, state offline 00:21:57.831 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.831 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:58.090 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:58.090 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:58.090 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:58.090 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:58.348 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:58.348 12:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:58.607 12:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:58.607 12:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:21:58.866 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:21:59.454 [2024-06-07 12:24:22.793991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is 
claimed 00:21:59.454 [2024-06-07 12:24:22.796307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:59.454 [2024-06-07 12:24:22.796494] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:59.454 [2024-06-07 12:24:22.796697] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:59.454 [2024-06-07 12:24:22.796823] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:59.454 [2024-06-07 12:24:22.796864] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state configuring 00:21:59.454 request: 00:21:59.454 { 00:21:59.454 "name": "raid_bdev1", 00:21:59.454 "raid_level": "raid0", 00:21:59.454 "base_bdevs": [ 00:21:59.454 "malloc1", 00:21:59.454 "malloc2" 00:21:59.454 ], 00:21:59.454 "superblock": false, 00:21:59.454 "strip_size_kb": 64, 00:21:59.454 "method": "bdev_raid_create", 00:21:59.454 "req_id": 1 00:21:59.454 } 00:21:59.454 Got JSON-RPC error response 00:21:59.454 response: 00:21:59.454 { 00:21:59.454 "code": -17, 00:21:59.454 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:59.454 } 00:21:59.454 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:21:59.454 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:21:59.454 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:21:59.454 12:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:21:59.454 12:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.454 12:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:59.454 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:59.454 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:59.454 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:59.713 [2024-06-07 12:24:23.282036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:59.713 [2024-06-07 12:24:23.282400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:59.713 [2024-06-07 12:24:23.282482] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:21:59.713 [2024-06-07 12:24:23.282770] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:59.713 [2024-06-07 12:24:23.285137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:59.713 [2024-06-07 12:24:23.285330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:59.713 [2024-06-07 12:24:23.285504] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:59.713 [2024-06-07 12:24:23.285607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:59.713 pt1 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:21:59.713 12:24:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.713 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.971 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.971 "name": "raid_bdev1", 00:21:59.971 "uuid": "ec784e5e-4320-423b-9aa5-5daa78c97e2d", 00:21:59.971 "strip_size_kb": 64, 00:21:59.971 "state": "configuring", 00:21:59.971 "raid_level": "raid0", 00:21:59.971 "superblock": true, 00:21:59.971 "num_base_bdevs": 2, 00:21:59.971 "num_base_bdevs_discovered": 1, 00:21:59.971 "num_base_bdevs_operational": 2, 00:21:59.971 "base_bdevs_list": [ 00:21:59.971 { 00:21:59.971 "name": "pt1", 00:21:59.971 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:59.971 "is_configured": true, 00:21:59.971 "data_offset": 2048, 00:21:59.971 "data_size": 63488 00:21:59.971 }, 00:21:59.971 { 00:21:59.971 "name": null, 00:21:59.971 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:59.971 "is_configured": false, 00:21:59.971 "data_offset": 2048, 00:21:59.971 "data_size": 63488 00:21:59.971 } 00:21:59.971 ] 00:21:59.971 }' 00:21:59.971 12:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.971 12:24:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:00.905 [2024-06-07 12:24:24.386199] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:00.905 [2024-06-07 12:24:24.386573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.905 [2024-06-07 12:24:24.386646] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008180 00:22:00.905 [2024-06-07 12:24:24.386789] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.905 [2024-06-07 12:24:24.387206] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:22:00.905 [2024-06-07 12:24:24.387368] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:00.905 [2024-06-07 12:24:24.387553] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:00.905 [2024-06-07 12:24:24.387669] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:00.905 [2024-06-07 12:24:24.387791] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:22:00.905 [2024-06-07 12:24:24.387927] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:00.905 [2024-06-07 12:24:24.388024] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002530 00:22:00.905 [2024-06-07 12:24:24.388451] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:22:00.905 [2024-06-07 12:24:24.388566] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:22:00.905 [2024-06-07 12:24:24.388718] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:00.905 pt2 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.905 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.164 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.164 "name": "raid_bdev1", 00:22:01.164 "uuid": "ec784e5e-4320-423b-9aa5-5daa78c97e2d", 00:22:01.164 "strip_size_kb": 64, 00:22:01.164 "state": "online", 00:22:01.164 "raid_level": "raid0", 00:22:01.164 "superblock": true, 00:22:01.164 "num_base_bdevs": 2, 00:22:01.164 "num_base_bdevs_discovered": 2, 00:22:01.164 "num_base_bdevs_operational": 2, 00:22:01.164 "base_bdevs_list": [ 00:22:01.164 { 00:22:01.164 "name": "pt1", 00:22:01.164 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:01.164 "is_configured": true, 00:22:01.164 "data_offset": 2048, 00:22:01.164 "data_size": 63488 00:22:01.164 }, 00:22:01.164 { 00:22:01.164 "name": "pt2", 00:22:01.164 
"uuid": "00000000-0000-0000-0000-000000000002", 00:22:01.164 "is_configured": true, 00:22:01.164 "data_offset": 2048, 00:22:01.164 "data_size": 63488 00:22:01.164 } 00:22:01.164 ] 00:22:01.164 }' 00:22:01.164 12:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.164 12:24:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:01.943 [2024-06-07 12:24:25.454508] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:01.943 "name": "raid_bdev1", 00:22:01.943 "aliases": [ 00:22:01.943 "ec784e5e-4320-423b-9aa5-5daa78c97e2d" 00:22:01.943 ], 00:22:01.943 "product_name": "Raid Volume", 00:22:01.943 "block_size": 512, 00:22:01.943 "num_blocks": 126976, 00:22:01.943 "uuid": "ec784e5e-4320-423b-9aa5-5daa78c97e2d", 00:22:01.943 "assigned_rate_limits": { 00:22:01.943 "rw_ios_per_sec": 0, 00:22:01.943 "rw_mbytes_per_sec": 0, 00:22:01.943 "r_mbytes_per_sec": 0, 00:22:01.943 "w_mbytes_per_sec": 0 00:22:01.943 }, 00:22:01.943 "claimed": false, 00:22:01.943 "zoned": false, 00:22:01.943 "supported_io_types": { 00:22:01.943 "read": true, 00:22:01.943 "write": true, 00:22:01.943 "unmap": true, 00:22:01.943 "write_zeroes": true, 00:22:01.943 "flush": true, 00:22:01.943 "reset": true, 00:22:01.943 "compare": false, 00:22:01.943 "compare_and_write": false, 00:22:01.943 "abort": false, 00:22:01.943 "nvme_admin": false, 00:22:01.943 "nvme_io": false 00:22:01.943 }, 00:22:01.943 "memory_domains": [ 00:22:01.943 { 00:22:01.943 "dma_device_id": "system", 00:22:01.943 "dma_device_type": 1 00:22:01.943 }, 00:22:01.943 { 00:22:01.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.943 "dma_device_type": 2 00:22:01.943 }, 00:22:01.943 { 00:22:01.943 "dma_device_id": "system", 00:22:01.943 "dma_device_type": 1 00:22:01.943 }, 00:22:01.943 { 00:22:01.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.943 "dma_device_type": 2 00:22:01.943 } 00:22:01.943 ], 00:22:01.943 "driver_specific": { 00:22:01.943 "raid": { 00:22:01.943 "uuid": "ec784e5e-4320-423b-9aa5-5daa78c97e2d", 00:22:01.943 "strip_size_kb": 64, 00:22:01.943 "state": "online", 00:22:01.943 "raid_level": "raid0", 00:22:01.943 "superblock": true, 00:22:01.943 "num_base_bdevs": 2, 00:22:01.943 "num_base_bdevs_discovered": 2, 00:22:01.943 "num_base_bdevs_operational": 2, 00:22:01.943 "base_bdevs_list": [ 00:22:01.943 { 00:22:01.943 "name": "pt1", 00:22:01.943 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:01.943 "is_configured": true, 00:22:01.943 
"data_offset": 2048, 00:22:01.943 "data_size": 63488 00:22:01.943 }, 00:22:01.943 { 00:22:01.943 "name": "pt2", 00:22:01.943 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:01.943 "is_configured": true, 00:22:01.943 "data_offset": 2048, 00:22:01.943 "data_size": 63488 00:22:01.943 } 00:22:01.943 ] 00:22:01.943 } 00:22:01.943 } 00:22:01.943 }' 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:01.943 pt2' 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:01.943 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.202 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.202 "name": "pt1", 00:22:02.202 "aliases": [ 00:22:02.202 "00000000-0000-0000-0000-000000000001" 00:22:02.202 ], 00:22:02.202 "product_name": "passthru", 00:22:02.202 "block_size": 512, 00:22:02.202 "num_blocks": 65536, 00:22:02.202 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:02.202 "assigned_rate_limits": { 00:22:02.202 "rw_ios_per_sec": 0, 00:22:02.202 "rw_mbytes_per_sec": 0, 00:22:02.202 "r_mbytes_per_sec": 0, 00:22:02.202 "w_mbytes_per_sec": 0 00:22:02.202 }, 00:22:02.202 "claimed": true, 00:22:02.202 "claim_type": "exclusive_write", 00:22:02.202 "zoned": false, 00:22:02.202 "supported_io_types": { 00:22:02.202 "read": true, 00:22:02.202 "write": true, 00:22:02.202 "unmap": true, 00:22:02.202 "write_zeroes": true, 00:22:02.202 "flush": true, 00:22:02.202 "reset": true, 00:22:02.202 "compare": false, 00:22:02.202 "compare_and_write": false, 00:22:02.202 "abort": true, 00:22:02.202 "nvme_admin": false, 00:22:02.202 "nvme_io": false 00:22:02.202 }, 00:22:02.202 "memory_domains": [ 00:22:02.202 { 00:22:02.202 "dma_device_id": "system", 00:22:02.202 "dma_device_type": 1 00:22:02.202 }, 00:22:02.202 { 00:22:02.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.202 "dma_device_type": 2 00:22:02.202 } 00:22:02.202 ], 00:22:02.202 "driver_specific": { 00:22:02.202 "passthru": { 00:22:02.202 "name": "pt1", 00:22:02.202 "base_bdev_name": "malloc1" 00:22:02.202 } 00:22:02.202 } 00:22:02.202 }' 00:22:02.202 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.202 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.460 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:02.460 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.460 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.460 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:02.460 12:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.460 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.460 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:02.460 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:22:02.460 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.720 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:02.720 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.720 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.720 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.987 "name": "pt2", 00:22:02.987 "aliases": [ 00:22:02.987 "00000000-0000-0000-0000-000000000002" 00:22:02.987 ], 00:22:02.987 "product_name": "passthru", 00:22:02.987 "block_size": 512, 00:22:02.987 "num_blocks": 65536, 00:22:02.987 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:02.987 "assigned_rate_limits": { 00:22:02.987 "rw_ios_per_sec": 0, 00:22:02.987 "rw_mbytes_per_sec": 0, 00:22:02.987 "r_mbytes_per_sec": 0, 00:22:02.987 "w_mbytes_per_sec": 0 00:22:02.987 }, 00:22:02.987 "claimed": true, 00:22:02.987 "claim_type": "exclusive_write", 00:22:02.987 "zoned": false, 00:22:02.987 "supported_io_types": { 00:22:02.987 "read": true, 00:22:02.987 "write": true, 00:22:02.987 "unmap": true, 00:22:02.987 "write_zeroes": true, 00:22:02.987 "flush": true, 00:22:02.987 "reset": true, 00:22:02.987 "compare": false, 00:22:02.987 "compare_and_write": false, 00:22:02.987 "abort": true, 00:22:02.987 "nvme_admin": false, 00:22:02.987 "nvme_io": false 00:22:02.987 }, 00:22:02.987 "memory_domains": [ 00:22:02.987 { 00:22:02.987 "dma_device_id": "system", 00:22:02.987 "dma_device_type": 1 00:22:02.987 }, 00:22:02.987 { 00:22:02.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.987 "dma_device_type": 2 00:22:02.987 } 00:22:02.987 ], 00:22:02.987 "driver_specific": { 00:22:02.987 "passthru": { 00:22:02.987 "name": "pt2", 00:22:02.987 "base_bdev_name": "malloc2" 00:22:02.987 } 00:22:02.987 } 00:22:02.987 }' 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:02.987 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.267 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.267 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:03.267 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.267 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.267 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:03.267 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:03.267 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b raid_bdev1 00:22:03.526 [2024-06-07 12:24:26.922766] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ec784e5e-4320-423b-9aa5-5daa78c97e2d '!=' ec784e5e-4320-423b-9aa5-5daa78c97e2d ']' 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 198778 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 198778 ']' 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 198778 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 198778 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:03.526 killing process with pid 198778 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 198778' 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 198778 00:22:03.526 [2024-06-07 12:24:26.975019] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:03.526 12:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 198778 00:22:03.526 [2024-06-07 12:24:26.975129] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:03.526 [2024-06-07 12:24:26.975178] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:03.526 [2024-06-07 12:24:26.975190] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:22:03.526 [2024-06-07 12:24:27.016966] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:03.785 12:24:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:03.785 00:22:03.785 real 0m10.891s 00:22:03.785 user 0m19.686s 00:22:03.785 sys 0m1.871s 00:22:03.785 12:24:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:03.785 ************************************ 00:22:03.785 END TEST raid_superblock_test 00:22:03.785 ************************************ 00:22:03.785 12:24:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:03.785 12:24:27 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:22:03.785 12:24:27 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:22:03.785 12:24:27 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:03.785 12:24:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:03.785 ************************************ 00:22:03.785 START TEST raid_read_error_test 00:22:03.785 ************************************ 
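The raid_superblock_test that just finished above reduces to a short JSON-RPC sequence against the bdev_svc app. Below is a minimal sketch of that sequence, assuming an app is already listening on /var/tmp/spdk-raid.sock; every rpc.py invocation is taken verbatim from the trace above, and only the ordering comments are added:

RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

# Two passthru bdevs with fixed UUIDs on top of malloc bdevs act as the base devices.
$RPC -s $SOCK bdev_malloc_create 32 512 -b malloc1
$RPC -s $SOCK bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$RPC -s $SOCK bdev_malloc_create 32 512 -b malloc2
$RPC -s $SOCK bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# Assemble raid0 with a 64 KiB strip size; -s writes an on-disk superblock.
$RPC -s $SOCK bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s
$RPC -s $SOCK bdev_raid_get_bdevs all    # "state" should report "online"

# Delete the array and its passthru bdevs; the superblock stays on the malloc media.
$RPC -s $SOCK bdev_raid_delete raid_bdev1
$RPC -s $SOCK bdev_passthru_delete pt1
$RPC -s $SOCK bdev_passthru_delete pt2

# A fresh create on the raw malloc bdevs is rejected with code -17 "File exists":
# "Superblock of a different raid bdev found on bdev malloc1".
$RPC -s $SOCK bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1

# Re-registering the passthru bdevs lets examine() find the superblock again
# ("raid superblock found on bdev pt1"), and raid_bdev1 reassembles on its own,
# moving from "configuring" (1 base bdev discovered) back to "online".
$RPC -s $SOCK bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$RPC -s $SOCK bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002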
00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 read 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:03.785 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:04.044 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:04.044 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.HRtZ7MTBk0 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=199137 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 199137 /var/tmp/spdk-raid.sock 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 199137 ']' 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:04.045 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
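Once bdevperf is listening, raid_io_error_test builds an error-injection stack under each base device before assembling the array. A minimal sketch of the read-failure case follows, assuming the bdevperf app launched above is up on /var/tmp/spdk-raid.sock; the RPCs are the ones echoed further down in this trace:

RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

# Each base device is stacked as malloc -> error bdev (EE_ prefix) -> passthru,
# so failures can be injected below the RAID layer.
$RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev1_malloc
$RPC -s $SOCK bdev_error_create BaseBdev1_malloc
$RPC -s $SOCK bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
$RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev2_malloc
$RPC -s $SOCK bdev_error_create BaseBdev2_malloc
$RPC -s $SOCK bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2

$RPC -s $SOCK bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s

# Inject failures on reads hitting the first base device, then drive I/O.
$RPC -s $SOCK bdev_error_inject_error EE_BaseBdev1_malloc read failure
/home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s $SOCK perform_tests

# raid0 has no redundancy, so the injected read errors must surface as a
# non-zero failure rate in the bdevperf log (the "0.44 != 0.00" check at the
# end of this test).
grep -v Job /raidtest/tmp.HRtZ7MTBk0 | grep raid_bdev1 | awk '{print $6}'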
00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:04.045 12:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:04.045 [2024-06-07 12:24:27.480916] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:22:04.045 [2024-06-07 12:24:27.481180] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199137 ] 00:22:04.045 [2024-06-07 12:24:27.631335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:04.303 [2024-06-07 12:24:27.709555] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:04.303 [2024-06-07 12:24:27.789051] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:04.871 12:24:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:04.871 12:24:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:22:04.871 12:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:04.871 12:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:05.130 BaseBdev1_malloc 00:22:05.130 12:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:05.389 true 00:22:05.389 12:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:05.648 [2024-06-07 12:24:29.131932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:05.648 [2024-06-07 12:24:29.132487] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:05.648 [2024-06-07 12:24:29.132607] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:22:05.648 [2024-06-07 12:24:29.132708] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:05.648 [2024-06-07 12:24:29.135155] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:05.648 [2024-06-07 12:24:29.135321] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:05.648 BaseBdev1 00:22:05.648 12:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:05.648 12:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:05.907 BaseBdev2_malloc 00:22:05.907 12:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:06.167 true 00:22:06.167 12:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:06.426 [2024-06-07 12:24:29.995009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:06.426 [2024-06-07 12:24:29.995304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:06.426 [2024-06-07 12:24:29.995406] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:22:06.426 [2024-06-07 12:24:29.995517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:06.426 [2024-06-07 12:24:29.997752] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:06.426 [2024-06-07 12:24:29.997859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:06.426 BaseBdev2 00:22:06.426 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:22:06.685 [2024-06-07 12:24:30.203166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:06.685 [2024-06-07 12:24:30.205223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:06.685 [2024-06-07 12:24:30.205477] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007280 00:22:06.685 [2024-06-07 12:24:30.205494] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:06.685 [2024-06-07 12:24:30.205669] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:22:06.685 [2024-06-07 12:24:30.206016] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007280 00:22:06.685 [2024-06-07 12:24:30.206032] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007280 00:22:06.685 [2024-06-07 12:24:30.206156] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.685 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.944 12:24:30 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.944 "name": "raid_bdev1", 00:22:06.944 "uuid": "651a8b3b-e05b-45d5-9570-69cea9a39de4", 00:22:06.944 "strip_size_kb": 64, 00:22:06.944 "state": "online", 00:22:06.944 "raid_level": "raid0", 00:22:06.944 "superblock": true, 00:22:06.944 "num_base_bdevs": 2, 00:22:06.944 "num_base_bdevs_discovered": 2, 00:22:06.944 "num_base_bdevs_operational": 2, 00:22:06.944 "base_bdevs_list": [ 00:22:06.944 { 00:22:06.944 "name": "BaseBdev1", 00:22:06.944 "uuid": "cd331934-a3ae-5579-b531-3e94c632226f", 00:22:06.944 "is_configured": true, 00:22:06.944 "data_offset": 2048, 00:22:06.944 "data_size": 63488 00:22:06.944 }, 00:22:06.944 { 00:22:06.944 "name": "BaseBdev2", 00:22:06.944 "uuid": "c09aa87f-47c5-56b2-b98f-00508049e967", 00:22:06.944 "is_configured": true, 00:22:06.944 "data_offset": 2048, 00:22:06.944 "data_size": 63488 00:22:06.944 } 00:22:06.944 ] 00:22:06.944 }' 00:22:06.944 12:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.944 12:24:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.511 12:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:07.511 12:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:07.511 [2024-06-07 12:24:31.096359] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:22:08.447 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:08.706 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:08.706 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:22:08.706 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:22:08.706 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:22:08.706 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:08.706 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:08.706 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.707 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.966 12:24:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.966 "name": "raid_bdev1", 00:22:08.966 "uuid": "651a8b3b-e05b-45d5-9570-69cea9a39de4", 00:22:08.966 "strip_size_kb": 64, 00:22:08.966 "state": "online", 00:22:08.966 "raid_level": "raid0", 00:22:08.966 "superblock": true, 00:22:08.966 "num_base_bdevs": 2, 00:22:08.966 "num_base_bdevs_discovered": 2, 00:22:08.966 "num_base_bdevs_operational": 2, 00:22:08.966 "base_bdevs_list": [ 00:22:08.966 { 00:22:08.966 "name": "BaseBdev1", 00:22:08.966 "uuid": "cd331934-a3ae-5579-b531-3e94c632226f", 00:22:08.966 "is_configured": true, 00:22:08.966 "data_offset": 2048, 00:22:08.966 "data_size": 63488 00:22:08.966 }, 00:22:08.966 { 00:22:08.966 "name": "BaseBdev2", 00:22:08.966 "uuid": "c09aa87f-47c5-56b2-b98f-00508049e967", 00:22:08.966 "is_configured": true, 00:22:08.966 "data_offset": 2048, 00:22:08.966 "data_size": 63488 00:22:08.966 } 00:22:08.966 ] 00:22:08.966 }' 00:22:08.966 12:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.966 12:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:09.535 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:09.796 [2024-06-07 12:24:33.364850] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:09.796 [2024-06-07 12:24:33.364908] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:09.796 [2024-06-07 12:24:33.366036] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:09.796 [2024-06-07 12:24:33.366081] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.796 [2024-06-07 12:24:33.366107] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:09.796 [2024-06-07 12:24:33.366115] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state offline 00:22:09.796 0 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 199137 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 199137 ']' 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 199137 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 199137 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 199137' 00:22:09.796 killing process with pid 199137 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 199137 00:22:09.796 [2024-06-07 12:24:33.431421] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:09.796 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 199137 00:22:10.056 [2024-06-07 12:24:33.460789] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.HRtZ7MTBk0 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:22:10.316 00:22:10.316 real 0m6.412s 00:22:10.316 user 0m9.778s 00:22:10.316 sys 0m1.057s 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:10.316 12:24:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.316 ************************************ 00:22:10.316 END TEST raid_read_error_test 00:22:10.316 ************************************ 00:22:10.316 12:24:33 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:22:10.316 12:24:33 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:22:10.316 12:24:33 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:10.316 12:24:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:10.316 ************************************ 00:22:10.316 START TEST raid_write_error_test 00:22:10.316 ************************************ 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 write 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:10.316 12:24:33 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.IQOBOvvYk6 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=199321 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 199321 /var/tmp/spdk-raid.sock 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 199321 ']' 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:10.316 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:10.316 12:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.316 [2024-06-07 12:24:33.945388] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:22:10.316 [2024-06-07 12:24:33.945620] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199321 ] 00:22:10.623 [2024-06-07 12:24:34.084888] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:10.623 [2024-06-07 12:24:34.174473] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:10.623 [2024-06-07 12:24:34.253996] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:10.881 12:24:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:10.881 12:24:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:22:10.881 12:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:10.881 12:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:11.139 BaseBdev1_malloc 00:22:11.139 12:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:11.398 true 00:22:11.399 12:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:11.658 [2024-06-07 12:24:35.123103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:11.658 [2024-06-07 12:24:35.123684] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.658 [2024-06-07 12:24:35.123821] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:22:11.658 [2024-06-07 12:24:35.123936] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.658 [2024-06-07 12:24:35.126402] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.658 [2024-06-07 12:24:35.126564] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:11.658 BaseBdev1 00:22:11.658 12:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:11.658 12:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:11.918 BaseBdev2_malloc 00:22:11.918 12:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:12.178 true 00:22:12.178 12:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:12.438 [2024-06-07 12:24:35.878701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:12.438 [2024-06-07 12:24:35.879035] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:12.438 [2024-06-07 12:24:35.879139] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:22:12.438 [2024-06-07 12:24:35.879237] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:12.438 [2024-06-07 12:24:35.881558] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:12.438 [2024-06-07 12:24:35.881695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:12.438 BaseBdev2 00:22:12.438 12:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:22:12.697 [2024-06-07 12:24:36.086826] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:12.697 [2024-06-07 12:24:36.088978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:12.697 [2024-06-07 12:24:36.089174] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007280 00:22:12.697 [2024-06-07 12:24:36.089189] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:12.697 [2024-06-07 12:24:36.089389] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:22:12.697 [2024-06-07 12:24:36.089727] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007280 00:22:12.697 [2024-06-07 12:24:36.089747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007280 00:22:12.697 [2024-06-07 12:24:36.089883] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.697 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.956 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.956 "name": "raid_bdev1", 00:22:12.956 "uuid": "c9ddd6d0-9e16-43cb-bb64-278616966c36", 00:22:12.956 "strip_size_kb": 64, 00:22:12.956 "state": "online", 00:22:12.956 "raid_level": "raid0", 00:22:12.956 "superblock": true, 00:22:12.956 "num_base_bdevs": 2, 00:22:12.956 "num_base_bdevs_discovered": 2, 00:22:12.956 "num_base_bdevs_operational": 2, 00:22:12.956 "base_bdevs_list": [ 00:22:12.956 { 00:22:12.956 "name": 
"BaseBdev1", 00:22:12.956 "uuid": "bdf8312a-3fcf-5df2-9429-9b7676ca3255", 00:22:12.956 "is_configured": true, 00:22:12.956 "data_offset": 2048, 00:22:12.956 "data_size": 63488 00:22:12.956 }, 00:22:12.956 { 00:22:12.956 "name": "BaseBdev2", 00:22:12.956 "uuid": "4642fc20-925c-514e-b29f-bdeb95f45071", 00:22:12.956 "is_configured": true, 00:22:12.956 "data_offset": 2048, 00:22:12.956 "data_size": 63488 00:22:12.956 } 00:22:12.956 ] 00:22:12.956 }' 00:22:12.956 12:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.956 12:24:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:13.525 12:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:13.525 12:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:13.525 [2024-06-07 12:24:37.119219] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:22:14.463 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.722 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.981 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.981 "name": "raid_bdev1", 00:22:14.981 "uuid": "c9ddd6d0-9e16-43cb-bb64-278616966c36", 00:22:14.981 "strip_size_kb": 64, 00:22:14.981 "state": "online", 00:22:14.981 "raid_level": "raid0", 00:22:14.981 "superblock": true, 00:22:14.981 "num_base_bdevs": 2, 00:22:14.981 "num_base_bdevs_discovered": 2, 00:22:14.981 "num_base_bdevs_operational": 2, 00:22:14.981 "base_bdevs_list": [ 00:22:14.981 { 00:22:14.981 "name": 
"BaseBdev1", 00:22:14.981 "uuid": "bdf8312a-3fcf-5df2-9429-9b7676ca3255", 00:22:14.981 "is_configured": true, 00:22:14.981 "data_offset": 2048, 00:22:14.981 "data_size": 63488 00:22:14.981 }, 00:22:14.981 { 00:22:14.981 "name": "BaseBdev2", 00:22:14.981 "uuid": "4642fc20-925c-514e-b29f-bdeb95f45071", 00:22:14.981 "is_configured": true, 00:22:14.981 "data_offset": 2048, 00:22:14.981 "data_size": 63488 00:22:14.981 } 00:22:14.981 ] 00:22:14.981 }' 00:22:14.981 12:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.981 12:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:15.548 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:15.893 [2024-06-07 12:24:39.471048] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:15.893 [2024-06-07 12:24:39.471105] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:15.893 [2024-06-07 12:24:39.472323] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:15.893 [2024-06-07 12:24:39.472373] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:15.893 [2024-06-07 12:24:39.472399] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:15.893 [2024-06-07 12:24:39.472409] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state offline 00:22:15.893 0 00:22:15.893 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 199321 00:22:15.893 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 199321 ']' 00:22:15.893 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 199321 00:22:15.893 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:22:15.893 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:16.160 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 199321 00:22:16.160 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:16.160 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:16.160 killing process with pid 199321 00:22:16.160 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 199321' 00:22:16.160 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 199321 00:22:16.160 [2024-06-07 12:24:39.531015] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:16.160 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 199321 00:22:16.160 [2024-06-07 12:24:39.560300] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.IQOBOvvYk6 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.43 00:22:16.420 
12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.43 != \0\.\0\0 ]] 00:22:16.420 00:22:16.420 real 0m6.044s 00:22:16.420 user 0m9.636s 00:22:16.420 sys 0m0.960s 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:16.420 12:24:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:16.420 ************************************ 00:22:16.420 END TEST raid_write_error_test 00:22:16.420 ************************************ 00:22:16.420 12:24:39 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:22:16.420 12:24:39 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:22:16.420 12:24:39 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:22:16.420 12:24:39 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:16.420 12:24:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:16.420 ************************************ 00:22:16.420 START TEST raid_state_function_test 00:22:16.420 ************************************ 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 false 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 
00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=199500 00:22:16.420 Process raid pid: 199500 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 199500' 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 199500 /var/tmp/spdk-raid.sock 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 199500 ']' 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:16.420 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:16.420 12:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:16.421 [2024-06-07 12:24:40.051312] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:22:16.421 [2024-06-07 12:24:40.052279] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:16.680 [2024-06-07 12:24:40.196143] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:16.680 [2024-06-07 12:24:40.288337] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:16.939 [2024-06-07 12:24:40.373969] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:17.533 12:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:17.533 12:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:22:17.533 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:17.795 [2024-06-07 12:24:41.309023] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:17.795 [2024-06-07 12:24:41.309775] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:17.795 [2024-06-07 12:24:41.309915] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:17.795 [2024-06-07 12:24:41.310045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:17.795 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.058 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.058 "name": "Existed_Raid", 00:22:18.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.058 "strip_size_kb": 64, 00:22:18.058 "state": "configuring", 00:22:18.058 "raid_level": "concat", 00:22:18.058 "superblock": false, 00:22:18.058 "num_base_bdevs": 2, 00:22:18.058 "num_base_bdevs_discovered": 0, 00:22:18.058 "num_base_bdevs_operational": 2, 00:22:18.058 "base_bdevs_list": [ 
00:22:18.058 { 00:22:18.058 "name": "BaseBdev1", 00:22:18.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.058 "is_configured": false, 00:22:18.058 "data_offset": 0, 00:22:18.058 "data_size": 0 00:22:18.058 }, 00:22:18.058 { 00:22:18.058 "name": "BaseBdev2", 00:22:18.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.058 "is_configured": false, 00:22:18.058 "data_offset": 0, 00:22:18.058 "data_size": 0 00:22:18.058 } 00:22:18.058 ] 00:22:18.058 }' 00:22:18.058 12:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.058 12:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:18.992 12:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:18.992 [2024-06-07 12:24:42.577111] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:18.993 [2024-06-07 12:24:42.577424] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:22:18.993 12:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:19.251 [2024-06-07 12:24:42.821180] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:19.251 [2024-06-07 12:24:42.821974] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:19.251 [2024-06-07 12:24:42.822120] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:19.251 [2024-06-07 12:24:42.822303] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:19.251 12:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:19.508 [2024-06-07 12:24:43.049342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:19.508 BaseBdev1 00:22:19.508 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:19.508 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:22:19.508 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:22:19.508 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:22:19.508 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:22:19.508 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:22:19.508 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:19.766 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:20.023 [ 00:22:20.023 { 00:22:20.023 "name": "BaseBdev1", 00:22:20.023 "aliases": [ 00:22:20.023 "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1" 00:22:20.023 ], 00:22:20.023 "product_name": "Malloc disk", 00:22:20.023 "block_size": 512, 00:22:20.023 
"num_blocks": 65536, 00:22:20.023 "uuid": "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1", 00:22:20.023 "assigned_rate_limits": { 00:22:20.023 "rw_ios_per_sec": 0, 00:22:20.023 "rw_mbytes_per_sec": 0, 00:22:20.023 "r_mbytes_per_sec": 0, 00:22:20.023 "w_mbytes_per_sec": 0 00:22:20.023 }, 00:22:20.023 "claimed": true, 00:22:20.023 "claim_type": "exclusive_write", 00:22:20.023 "zoned": false, 00:22:20.023 "supported_io_types": { 00:22:20.023 "read": true, 00:22:20.023 "write": true, 00:22:20.023 "unmap": true, 00:22:20.023 "write_zeroes": true, 00:22:20.023 "flush": true, 00:22:20.023 "reset": true, 00:22:20.023 "compare": false, 00:22:20.023 "compare_and_write": false, 00:22:20.023 "abort": true, 00:22:20.023 "nvme_admin": false, 00:22:20.023 "nvme_io": false 00:22:20.023 }, 00:22:20.023 "memory_domains": [ 00:22:20.023 { 00:22:20.023 "dma_device_id": "system", 00:22:20.023 "dma_device_type": 1 00:22:20.023 }, 00:22:20.023 { 00:22:20.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.023 "dma_device_type": 2 00:22:20.023 } 00:22:20.023 ], 00:22:20.023 "driver_specific": {} 00:22:20.023 } 00:22:20.023 ] 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:20.023 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.289 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.289 "name": "Existed_Raid", 00:22:20.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.289 "strip_size_kb": 64, 00:22:20.289 "state": "configuring", 00:22:20.289 "raid_level": "concat", 00:22:20.289 "superblock": false, 00:22:20.289 "num_base_bdevs": 2, 00:22:20.289 "num_base_bdevs_discovered": 1, 00:22:20.289 "num_base_bdevs_operational": 2, 00:22:20.289 "base_bdevs_list": [ 00:22:20.289 { 00:22:20.289 "name": "BaseBdev1", 00:22:20.289 "uuid": "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1", 00:22:20.289 "is_configured": true, 00:22:20.289 "data_offset": 0, 00:22:20.289 "data_size": 65536 00:22:20.289 }, 00:22:20.289 { 00:22:20.289 "name": "BaseBdev2", 00:22:20.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.289 
"is_configured": false, 00:22:20.289 "data_offset": 0, 00:22:20.289 "data_size": 0 00:22:20.289 } 00:22:20.289 ] 00:22:20.289 }' 00:22:20.289 12:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.289 12:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:20.995 12:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:21.254 [2024-06-07 12:24:44.745641] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:21.254 [2024-06-07 12:24:44.745960] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:22:21.254 12:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:21.514 [2024-06-07 12:24:45.009742] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:21.514 [2024-06-07 12:24:45.012012] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:21.514 [2024-06-07 12:24:45.012666] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.514 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:21.774 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.774 "name": "Existed_Raid", 00:22:21.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.774 "strip_size_kb": 64, 00:22:21.774 "state": "configuring", 00:22:21.774 "raid_level": "concat", 00:22:21.774 "superblock": false, 00:22:21.774 "num_base_bdevs": 2, 00:22:21.774 "num_base_bdevs_discovered": 1, 00:22:21.774 
"num_base_bdevs_operational": 2, 00:22:21.774 "base_bdevs_list": [ 00:22:21.774 { 00:22:21.774 "name": "BaseBdev1", 00:22:21.774 "uuid": "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1", 00:22:21.774 "is_configured": true, 00:22:21.774 "data_offset": 0, 00:22:21.774 "data_size": 65536 00:22:21.774 }, 00:22:21.774 { 00:22:21.774 "name": "BaseBdev2", 00:22:21.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.774 "is_configured": false, 00:22:21.774 "data_offset": 0, 00:22:21.774 "data_size": 0 00:22:21.774 } 00:22:21.774 ] 00:22:21.774 }' 00:22:21.774 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.774 12:24:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.341 12:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:22.600 [2024-06-07 12:24:46.079588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:22.600 [2024-06-07 12:24:46.079824] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:22:22.600 [2024-06-07 12:24:46.079935] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:22:22.600 [2024-06-07 12:24:46.080121] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:22:22.600 [2024-06-07 12:24:46.080491] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:22:22.600 [2024-06-07 12:24:46.080609] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:22:22.600 [2024-06-07 12:24:46.080923] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.600 BaseBdev2 00:22:22.600 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:22.600 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:22:22.600 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:22:22.600 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:22:22.600 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:22:22.600 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:22:22.600 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:22.882 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:23.142 [ 00:22:23.142 { 00:22:23.142 "name": "BaseBdev2", 00:22:23.142 "aliases": [ 00:22:23.142 "99f4272b-8a02-4fc1-81e7-8f2241222e2c" 00:22:23.142 ], 00:22:23.142 "product_name": "Malloc disk", 00:22:23.142 "block_size": 512, 00:22:23.142 "num_blocks": 65536, 00:22:23.142 "uuid": "99f4272b-8a02-4fc1-81e7-8f2241222e2c", 00:22:23.142 "assigned_rate_limits": { 00:22:23.142 "rw_ios_per_sec": 0, 00:22:23.142 "rw_mbytes_per_sec": 0, 00:22:23.142 "r_mbytes_per_sec": 0, 00:22:23.142 "w_mbytes_per_sec": 0 00:22:23.142 }, 00:22:23.142 "claimed": true, 00:22:23.142 "claim_type": "exclusive_write", 00:22:23.142 "zoned": 
false, 00:22:23.142 "supported_io_types": { 00:22:23.142 "read": true, 00:22:23.142 "write": true, 00:22:23.142 "unmap": true, 00:22:23.142 "write_zeroes": true, 00:22:23.142 "flush": true, 00:22:23.142 "reset": true, 00:22:23.142 "compare": false, 00:22:23.142 "compare_and_write": false, 00:22:23.142 "abort": true, 00:22:23.142 "nvme_admin": false, 00:22:23.142 "nvme_io": false 00:22:23.142 }, 00:22:23.142 "memory_domains": [ 00:22:23.142 { 00:22:23.142 "dma_device_id": "system", 00:22:23.142 "dma_device_type": 1 00:22:23.142 }, 00:22:23.142 { 00:22:23.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.142 "dma_device_type": 2 00:22:23.142 } 00:22:23.142 ], 00:22:23.142 "driver_specific": {} 00:22:23.142 } 00:22:23.142 ] 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.142 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:23.402 12:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.402 "name": "Existed_Raid", 00:22:23.402 "uuid": "d7436a2e-c35e-48cd-b53c-af1df6bf8679", 00:22:23.402 "strip_size_kb": 64, 00:22:23.402 "state": "online", 00:22:23.402 "raid_level": "concat", 00:22:23.402 "superblock": false, 00:22:23.402 "num_base_bdevs": 2, 00:22:23.402 "num_base_bdevs_discovered": 2, 00:22:23.402 "num_base_bdevs_operational": 2, 00:22:23.402 "base_bdevs_list": [ 00:22:23.402 { 00:22:23.402 "name": "BaseBdev1", 00:22:23.402 "uuid": "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1", 00:22:23.402 "is_configured": true, 00:22:23.402 "data_offset": 0, 00:22:23.402 "data_size": 65536 00:22:23.402 }, 00:22:23.402 { 00:22:23.402 "name": "BaseBdev2", 00:22:23.402 "uuid": "99f4272b-8a02-4fc1-81e7-8f2241222e2c", 00:22:23.402 "is_configured": true, 00:22:23.402 "data_offset": 0, 00:22:23.402 "data_size": 65536 00:22:23.402 } 00:22:23.402 ] 00:22:23.402 }' 00:22:23.402 12:24:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.402 12:24:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:24.336 12:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:24.594 [2024-06-07 12:24:48.042550] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:24.594 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:24.594 "name": "Existed_Raid", 00:22:24.594 "aliases": [ 00:22:24.594 "d7436a2e-c35e-48cd-b53c-af1df6bf8679" 00:22:24.594 ], 00:22:24.594 "product_name": "Raid Volume", 00:22:24.594 "block_size": 512, 00:22:24.594 "num_blocks": 131072, 00:22:24.594 "uuid": "d7436a2e-c35e-48cd-b53c-af1df6bf8679", 00:22:24.594 "assigned_rate_limits": { 00:22:24.594 "rw_ios_per_sec": 0, 00:22:24.594 "rw_mbytes_per_sec": 0, 00:22:24.594 "r_mbytes_per_sec": 0, 00:22:24.594 "w_mbytes_per_sec": 0 00:22:24.594 }, 00:22:24.594 "claimed": false, 00:22:24.594 "zoned": false, 00:22:24.594 "supported_io_types": { 00:22:24.594 "read": true, 00:22:24.594 "write": true, 00:22:24.594 "unmap": true, 00:22:24.594 "write_zeroes": true, 00:22:24.594 "flush": true, 00:22:24.594 "reset": true, 00:22:24.594 "compare": false, 00:22:24.594 "compare_and_write": false, 00:22:24.594 "abort": false, 00:22:24.594 "nvme_admin": false, 00:22:24.594 "nvme_io": false 00:22:24.594 }, 00:22:24.595 "memory_domains": [ 00:22:24.595 { 00:22:24.595 "dma_device_id": "system", 00:22:24.595 "dma_device_type": 1 00:22:24.595 }, 00:22:24.595 { 00:22:24.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.595 "dma_device_type": 2 00:22:24.595 }, 00:22:24.595 { 00:22:24.595 "dma_device_id": "system", 00:22:24.595 "dma_device_type": 1 00:22:24.595 }, 00:22:24.595 { 00:22:24.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.595 "dma_device_type": 2 00:22:24.595 } 00:22:24.595 ], 00:22:24.595 "driver_specific": { 00:22:24.595 "raid": { 00:22:24.595 "uuid": "d7436a2e-c35e-48cd-b53c-af1df6bf8679", 00:22:24.595 "strip_size_kb": 64, 00:22:24.595 "state": "online", 00:22:24.595 "raid_level": "concat", 00:22:24.595 "superblock": false, 00:22:24.595 "num_base_bdevs": 2, 00:22:24.595 "num_base_bdevs_discovered": 2, 00:22:24.595 "num_base_bdevs_operational": 2, 00:22:24.595 "base_bdevs_list": [ 00:22:24.595 { 00:22:24.595 "name": "BaseBdev1", 00:22:24.595 "uuid": "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1", 00:22:24.595 "is_configured": true, 00:22:24.595 "data_offset": 0, 00:22:24.595 "data_size": 65536 00:22:24.595 }, 00:22:24.595 { 00:22:24.595 "name": "BaseBdev2", 00:22:24.595 "uuid": "99f4272b-8a02-4fc1-81e7-8f2241222e2c", 00:22:24.595 "is_configured": 
true, 00:22:24.595 "data_offset": 0, 00:22:24.595 "data_size": 65536 00:22:24.595 } 00:22:24.595 ] 00:22:24.595 } 00:22:24.595 } 00:22:24.595 }' 00:22:24.595 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:24.595 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:24.595 BaseBdev2' 00:22:24.595 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.595 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:24.595 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.854 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.854 "name": "BaseBdev1", 00:22:24.854 "aliases": [ 00:22:24.854 "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1" 00:22:24.854 ], 00:22:24.854 "product_name": "Malloc disk", 00:22:24.854 "block_size": 512, 00:22:24.854 "num_blocks": 65536, 00:22:24.854 "uuid": "45f153f7-7ef3-47e7-9c22-ebf78f61c0f1", 00:22:24.854 "assigned_rate_limits": { 00:22:24.854 "rw_ios_per_sec": 0, 00:22:24.854 "rw_mbytes_per_sec": 0, 00:22:24.854 "r_mbytes_per_sec": 0, 00:22:24.854 "w_mbytes_per_sec": 0 00:22:24.854 }, 00:22:24.854 "claimed": true, 00:22:24.854 "claim_type": "exclusive_write", 00:22:24.854 "zoned": false, 00:22:24.854 "supported_io_types": { 00:22:24.854 "read": true, 00:22:24.854 "write": true, 00:22:24.854 "unmap": true, 00:22:24.854 "write_zeroes": true, 00:22:24.854 "flush": true, 00:22:24.854 "reset": true, 00:22:24.854 "compare": false, 00:22:24.854 "compare_and_write": false, 00:22:24.854 "abort": true, 00:22:24.854 "nvme_admin": false, 00:22:24.854 "nvme_io": false 00:22:24.854 }, 00:22:24.854 "memory_domains": [ 00:22:24.855 { 00:22:24.855 "dma_device_id": "system", 00:22:24.855 "dma_device_type": 1 00:22:24.855 }, 00:22:24.855 { 00:22:24.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.855 "dma_device_type": 2 00:22:24.855 } 00:22:24.855 ], 00:22:24.855 "driver_specific": {} 00:22:24.855 }' 00:22:24.855 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.855 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.855 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:24.855 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.114 12:24:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:25.114 12:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.682 "name": "BaseBdev2", 00:22:25.682 "aliases": [ 00:22:25.682 "99f4272b-8a02-4fc1-81e7-8f2241222e2c" 00:22:25.682 ], 00:22:25.682 "product_name": "Malloc disk", 00:22:25.682 "block_size": 512, 00:22:25.682 "num_blocks": 65536, 00:22:25.682 "uuid": "99f4272b-8a02-4fc1-81e7-8f2241222e2c", 00:22:25.682 "assigned_rate_limits": { 00:22:25.682 "rw_ios_per_sec": 0, 00:22:25.682 "rw_mbytes_per_sec": 0, 00:22:25.682 "r_mbytes_per_sec": 0, 00:22:25.682 "w_mbytes_per_sec": 0 00:22:25.682 }, 00:22:25.682 "claimed": true, 00:22:25.682 "claim_type": "exclusive_write", 00:22:25.682 "zoned": false, 00:22:25.682 "supported_io_types": { 00:22:25.682 "read": true, 00:22:25.682 "write": true, 00:22:25.682 "unmap": true, 00:22:25.682 "write_zeroes": true, 00:22:25.682 "flush": true, 00:22:25.682 "reset": true, 00:22:25.682 "compare": false, 00:22:25.682 "compare_and_write": false, 00:22:25.682 "abort": true, 00:22:25.682 "nvme_admin": false, 00:22:25.682 "nvme_io": false 00:22:25.682 }, 00:22:25.682 "memory_domains": [ 00:22:25.682 { 00:22:25.682 "dma_device_id": "system", 00:22:25.682 "dma_device_type": 1 00:22:25.682 }, 00:22:25.682 { 00:22:25.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.682 "dma_device_type": 2 00:22:25.682 } 00:22:25.682 ], 00:22:25.682 "driver_specific": {} 00:22:25.682 }' 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.682 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.940 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.940 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.940 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:26.199 [2024-06-07 12:24:49.662751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:26.199 [2024-06-07 12:24:49.663030] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:26.199 [2024-06-07 12:24:49.663276] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.199 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:26.458 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:26.458 "name": "Existed_Raid", 00:22:26.458 "uuid": "d7436a2e-c35e-48cd-b53c-af1df6bf8679", 00:22:26.458 "strip_size_kb": 64, 00:22:26.458 "state": "offline", 00:22:26.458 "raid_level": "concat", 00:22:26.458 "superblock": false, 00:22:26.458 "num_base_bdevs": 2, 00:22:26.458 "num_base_bdevs_discovered": 1, 00:22:26.458 "num_base_bdevs_operational": 1, 00:22:26.458 "base_bdevs_list": [ 00:22:26.458 { 00:22:26.458 "name": null, 00:22:26.458 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.458 "is_configured": false, 00:22:26.458 "data_offset": 0, 00:22:26.458 "data_size": 65536 00:22:26.458 }, 00:22:26.458 { 00:22:26.458 "name": "BaseBdev2", 00:22:26.458 "uuid": "99f4272b-8a02-4fc1-81e7-8f2241222e2c", 00:22:26.458 "is_configured": true, 00:22:26.458 "data_offset": 0, 00:22:26.458 "data_size": 65536 00:22:26.458 } 00:22:26.458 ] 00:22:26.458 }' 00:22:26.458 12:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:26.458 12:24:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:27.024 12:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:27.025 12:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:27.025 12:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:27.025 12:24:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.282 12:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:27.282 12:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:27.282 12:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:27.540 [2024-06-07 12:24:51.156653] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:27.540 [2024-06-07 12:24:51.157008] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 199500 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 199500 ']' 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 199500 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:27.798 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 199500 00:22:28.056 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:28.056 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:28.056 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 199500' 00:22:28.056 killing process with pid 199500 00:22:28.056 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 199500 00:22:28.056 [2024-06-07 12:24:51.451907] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:28.056 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 199500 00:22:28.056 [2024-06-07 12:24:51.452190] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:28.315 00:22:28.315 real 0m11.798s 00:22:28.315 user 0m20.898s 00:22:28.315 sys 0m1.930s 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:22:28.315 ************************************ 00:22:28.315 END TEST raid_state_function_test 00:22:28.315 ************************************ 00:22:28.315 12:24:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:22:28.315 12:24:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:22:28.315 12:24:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:28.315 12:24:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:28.315 ************************************ 00:22:28.315 START TEST raid_state_function_test_sb 00:22:28.315 ************************************ 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 true 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=199874 00:22:28.315 
12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:28.315 Process raid pid: 199874 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 199874' 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 199874 /var/tmp/spdk-raid.sock 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 199874 ']' 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:28.315 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:28.315 12:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:28.315 [2024-06-07 12:24:51.915295] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:22:28.315 [2024-06-07 12:24:51.916125] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:28.574 [2024-06-07 12:24:52.051031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:28.574 [2024-06-07 12:24:52.138663] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:28.831 [2024-06-07 12:24:52.220059] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:29.398 12:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:29.398 12:24:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:22:29.398 12:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:29.659 [2024-06-07 12:24:53.218952] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:29.659 [2024-06-07 12:24:53.219358] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:29.659 [2024-06-07 12:24:53.219492] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:29.659 [2024-06-07 12:24:53.219558] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:29.659 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:22:29.659 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:29.659 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:29.659 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.660 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:30.226 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.226 "name": "Existed_Raid", 00:22:30.226 "uuid": "cd2c4043-1672-47fa-a301-c50502cccf53", 00:22:30.226 "strip_size_kb": 64, 00:22:30.226 "state": "configuring", 00:22:30.226 "raid_level": "concat", 00:22:30.226 "superblock": true, 00:22:30.226 "num_base_bdevs": 2, 00:22:30.226 "num_base_bdevs_discovered": 0, 00:22:30.226 "num_base_bdevs_operational": 2, 00:22:30.226 "base_bdevs_list": [ 00:22:30.226 { 00:22:30.226 "name": "BaseBdev1", 00:22:30.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.226 "is_configured": false, 00:22:30.226 "data_offset": 0, 00:22:30.226 "data_size": 0 00:22:30.226 }, 00:22:30.226 { 00:22:30.226 "name": "BaseBdev2", 00:22:30.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.226 "is_configured": false, 00:22:30.226 "data_offset": 0, 00:22:30.226 "data_size": 0 00:22:30.226 } 00:22:30.226 ] 00:22:30.226 }' 00:22:30.226 12:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.226 12:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:30.792 12:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:31.049 [2024-06-07 12:24:54.534989] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:31.049 [2024-06-07 12:24:54.535221] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:22:31.049 12:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:31.308 [2024-06-07 12:24:54.763065] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:31.308 [2024-06-07 12:24:54.763402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:31.308 [2024-06-07 12:24:54.763515] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:31.308 [2024-06-07 12:24:54.763636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:31.308 12:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:31.567 [2024-06-07 12:24:54.990404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:31.567 BaseBdev1 00:22:31.567 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:31.567 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:22:31.567 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:22:31.567 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:22:31.567 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:22:31.567 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:22:31.567 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:31.826 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:32.087 [ 00:22:32.087 { 00:22:32.087 "name": "BaseBdev1", 00:22:32.087 "aliases": [ 00:22:32.087 "b28b4cda-634d-40b7-a874-47b4d8f23bc8" 00:22:32.087 ], 00:22:32.087 "product_name": "Malloc disk", 00:22:32.087 "block_size": 512, 00:22:32.087 "num_blocks": 65536, 00:22:32.087 "uuid": "b28b4cda-634d-40b7-a874-47b4d8f23bc8", 00:22:32.087 "assigned_rate_limits": { 00:22:32.087 "rw_ios_per_sec": 0, 00:22:32.087 "rw_mbytes_per_sec": 0, 00:22:32.087 "r_mbytes_per_sec": 0, 00:22:32.087 "w_mbytes_per_sec": 0 00:22:32.087 }, 00:22:32.087 "claimed": true, 00:22:32.087 "claim_type": "exclusive_write", 00:22:32.087 "zoned": false, 00:22:32.087 "supported_io_types": { 00:22:32.087 "read": true, 00:22:32.087 "write": true, 00:22:32.087 "unmap": true, 00:22:32.087 "write_zeroes": true, 00:22:32.087 "flush": true, 00:22:32.087 "reset": true, 00:22:32.087 "compare": false, 00:22:32.087 "compare_and_write": false, 00:22:32.087 "abort": true, 00:22:32.087 "nvme_admin": false, 00:22:32.087 "nvme_io": false 00:22:32.087 }, 00:22:32.087 "memory_domains": [ 00:22:32.087 { 00:22:32.087 "dma_device_id": "system", 00:22:32.087 "dma_device_type": 1 00:22:32.087 }, 00:22:32.087 { 00:22:32.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:32.087 "dma_device_type": 2 00:22:32.087 } 00:22:32.087 ], 00:22:32.087 "driver_specific": {} 00:22:32.087 } 00:22:32.087 ] 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
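The verify_raid_bdev_state trace that brackets this point can be read straight from the xtrace: the helper records the expected name, state, raid level, strip size, and operational base bdev count as locals, fetches all raid bdevs over RPC, and isolates the one under test with jq before comparing fields. A condensed sketch of that flow, assuming the same RPC socket; the name verify_state and the reduced argument list are illustrative stand-ins, and the real helper in bdev_raid.sh asserts more fields (discovered and operational base bdev counts) than shown here:

    # Simplified stand-in for bdev_raid.sh's verify_raid_bdev_state (illustrative).
    verify_state() {
        local name=$1 expected_state=$2 raid_level=$3 strip_size=$4
        local info
        info=$(/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
            bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
        # Each field of the dumped JSON must match what the test expects.
        [[ $(jq -r .state <<<"$info") == "$expected_state" ]] &&
        [[ $(jq -r .raid_level <<<"$info") == "$raid_level" ]] &&
        [[ $(jq -r .strip_size_kb <<<"$info") == "$strip_size" ]]
    }

    # e.g. verify_state Existed_Raid configuring concat 64
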
00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.087 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:32.346 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.346 "name": "Existed_Raid", 00:22:32.346 "uuid": "0d323cb6-e86d-4fa2-9676-6b297db2bbbb", 00:22:32.346 "strip_size_kb": 64, 00:22:32.346 "state": "configuring", 00:22:32.346 "raid_level": "concat", 00:22:32.346 "superblock": true, 00:22:32.346 "num_base_bdevs": 2, 00:22:32.346 "num_base_bdevs_discovered": 1, 00:22:32.346 "num_base_bdevs_operational": 2, 00:22:32.346 "base_bdevs_list": [ 00:22:32.346 { 00:22:32.346 "name": "BaseBdev1", 00:22:32.346 "uuid": "b28b4cda-634d-40b7-a874-47b4d8f23bc8", 00:22:32.346 "is_configured": true, 00:22:32.346 "data_offset": 2048, 00:22:32.346 "data_size": 63488 00:22:32.346 }, 00:22:32.346 { 00:22:32.346 "name": "BaseBdev2", 00:22:32.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.346 "is_configured": false, 00:22:32.346 "data_offset": 0, 00:22:32.346 "data_size": 0 00:22:32.346 } 00:22:32.346 ] 00:22:32.346 }' 00:22:32.346 12:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.346 12:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:32.913 12:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:33.171 [2024-06-07 12:24:56.666743] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:33.171 [2024-06-07 12:24:56.666864] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:22:33.171 12:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:33.429 [2024-06-07 12:24:57.014905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:33.429 [2024-06-07 12:24:57.017018] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:33.429 [2024-06-07 12:24:57.017090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:33.429 
12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.429 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.430 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.430 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.430 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.430 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:33.995 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.995 "name": "Existed_Raid", 00:22:33.995 "uuid": "27b3d41b-ba51-4849-b676-e478d7c9fbaa", 00:22:33.995 "strip_size_kb": 64, 00:22:33.995 "state": "configuring", 00:22:33.995 "raid_level": "concat", 00:22:33.995 "superblock": true, 00:22:33.995 "num_base_bdevs": 2, 00:22:33.995 "num_base_bdevs_discovered": 1, 00:22:33.995 "num_base_bdevs_operational": 2, 00:22:33.995 "base_bdevs_list": [ 00:22:33.995 { 00:22:33.995 "name": "BaseBdev1", 00:22:33.995 "uuid": "b28b4cda-634d-40b7-a874-47b4d8f23bc8", 00:22:33.995 "is_configured": true, 00:22:33.995 "data_offset": 2048, 00:22:33.995 "data_size": 63488 00:22:33.995 }, 00:22:33.995 { 00:22:33.995 "name": "BaseBdev2", 00:22:33.995 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.995 "is_configured": false, 00:22:33.995 "data_offset": 0, 00:22:33.995 "data_size": 0 00:22:33.995 } 00:22:33.995 ] 00:22:33.995 }' 00:22:33.995 12:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.995 12:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:34.561 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:34.852 [2024-06-07 12:24:58.319961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:34.852 [2024-06-07 12:24:58.320199] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:22:34.852 [2024-06-07 12:24:58.320236] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:34.852 [2024-06-07 12:24:58.320406] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:22:34.852 [2024-06-07 12:24:58.320781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:22:34.852 [2024-06-07 12:24:58.320808] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:22:34.852 [2024-06-07 12:24:58.320947] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:34.852 BaseBdev2 
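The waitforbdev BaseBdev2 sequence that follows leans on bdev_get_bdevs accepting a timeout: rather than polling from the shell, the RPC itself blocks for up to the given number of milliseconds until the named bdev registers. A minimal sketch of that usage, with the socket path, bdev name, and 2000 ms timeout taken from the trace:

    # Block for up to 2000 ms until BaseBdev2 appears, then print its JSON info.
    /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_get_bdevs -b BaseBdev2 -t 2000 | jq '.[]'
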
00:22:34.852 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:34.852 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:22:34.852 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:22:34.852 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:22:34.852 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:22:34.852 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:22:34.852 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:35.110 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:35.369 [ 00:22:35.369 { 00:22:35.369 "name": "BaseBdev2", 00:22:35.369 "aliases": [ 00:22:35.369 "9bc29cfa-5b26-4d98-8c67-26fb0993bad9" 00:22:35.369 ], 00:22:35.369 "product_name": "Malloc disk", 00:22:35.369 "block_size": 512, 00:22:35.369 "num_blocks": 65536, 00:22:35.369 "uuid": "9bc29cfa-5b26-4d98-8c67-26fb0993bad9", 00:22:35.369 "assigned_rate_limits": { 00:22:35.369 "rw_ios_per_sec": 0, 00:22:35.369 "rw_mbytes_per_sec": 0, 00:22:35.369 "r_mbytes_per_sec": 0, 00:22:35.369 "w_mbytes_per_sec": 0 00:22:35.369 }, 00:22:35.369 "claimed": true, 00:22:35.369 "claim_type": "exclusive_write", 00:22:35.369 "zoned": false, 00:22:35.369 "supported_io_types": { 00:22:35.369 "read": true, 00:22:35.369 "write": true, 00:22:35.369 "unmap": true, 00:22:35.369 "write_zeroes": true, 00:22:35.369 "flush": true, 00:22:35.369 "reset": true, 00:22:35.369 "compare": false, 00:22:35.369 "compare_and_write": false, 00:22:35.369 "abort": true, 00:22:35.369 "nvme_admin": false, 00:22:35.369 "nvme_io": false 00:22:35.369 }, 00:22:35.369 "memory_domains": [ 00:22:35.369 { 00:22:35.369 "dma_device_id": "system", 00:22:35.369 "dma_device_type": 1 00:22:35.369 }, 00:22:35.369 { 00:22:35.369 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.369 "dma_device_type": 2 00:22:35.369 } 00:22:35.369 ], 00:22:35.369 "driver_specific": {} 00:22:35.369 } 00:22:35.369 ] 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:35.369 12:24:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.369 12:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:35.629 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.629 "name": "Existed_Raid", 00:22:35.629 "uuid": "27b3d41b-ba51-4849-b676-e478d7c9fbaa", 00:22:35.629 "strip_size_kb": 64, 00:22:35.629 "state": "online", 00:22:35.629 "raid_level": "concat", 00:22:35.629 "superblock": true, 00:22:35.629 "num_base_bdevs": 2, 00:22:35.629 "num_base_bdevs_discovered": 2, 00:22:35.629 "num_base_bdevs_operational": 2, 00:22:35.629 "base_bdevs_list": [ 00:22:35.629 { 00:22:35.629 "name": "BaseBdev1", 00:22:35.629 "uuid": "b28b4cda-634d-40b7-a874-47b4d8f23bc8", 00:22:35.629 "is_configured": true, 00:22:35.629 "data_offset": 2048, 00:22:35.629 "data_size": 63488 00:22:35.629 }, 00:22:35.629 { 00:22:35.629 "name": "BaseBdev2", 00:22:35.629 "uuid": "9bc29cfa-5b26-4d98-8c67-26fb0993bad9", 00:22:35.629 "is_configured": true, 00:22:35.629 "data_offset": 2048, 00:22:35.629 "data_size": 63488 00:22:35.629 } 00:22:35.629 ] 00:22:35.629 }' 00:22:35.629 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.629 12:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:36.198 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:36.456 [2024-06-07 12:24:59.916536] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:36.456 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:36.456 "name": "Existed_Raid", 00:22:36.456 "aliases": [ 00:22:36.456 "27b3d41b-ba51-4849-b676-e478d7c9fbaa" 00:22:36.456 ], 00:22:36.456 "product_name": "Raid Volume", 00:22:36.456 "block_size": 512, 00:22:36.456 "num_blocks": 126976, 00:22:36.456 "uuid": "27b3d41b-ba51-4849-b676-e478d7c9fbaa", 00:22:36.456 "assigned_rate_limits": { 00:22:36.456 "rw_ios_per_sec": 0, 00:22:36.456 
"rw_mbytes_per_sec": 0, 00:22:36.456 "r_mbytes_per_sec": 0, 00:22:36.456 "w_mbytes_per_sec": 0 00:22:36.456 }, 00:22:36.456 "claimed": false, 00:22:36.456 "zoned": false, 00:22:36.456 "supported_io_types": { 00:22:36.456 "read": true, 00:22:36.456 "write": true, 00:22:36.456 "unmap": true, 00:22:36.456 "write_zeroes": true, 00:22:36.456 "flush": true, 00:22:36.456 "reset": true, 00:22:36.456 "compare": false, 00:22:36.456 "compare_and_write": false, 00:22:36.456 "abort": false, 00:22:36.456 "nvme_admin": false, 00:22:36.456 "nvme_io": false 00:22:36.456 }, 00:22:36.456 "memory_domains": [ 00:22:36.456 { 00:22:36.456 "dma_device_id": "system", 00:22:36.456 "dma_device_type": 1 00:22:36.456 }, 00:22:36.456 { 00:22:36.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.456 "dma_device_type": 2 00:22:36.456 }, 00:22:36.456 { 00:22:36.456 "dma_device_id": "system", 00:22:36.456 "dma_device_type": 1 00:22:36.456 }, 00:22:36.456 { 00:22:36.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.456 "dma_device_type": 2 00:22:36.456 } 00:22:36.456 ], 00:22:36.456 "driver_specific": { 00:22:36.456 "raid": { 00:22:36.456 "uuid": "27b3d41b-ba51-4849-b676-e478d7c9fbaa", 00:22:36.456 "strip_size_kb": 64, 00:22:36.456 "state": "online", 00:22:36.456 "raid_level": "concat", 00:22:36.456 "superblock": true, 00:22:36.456 "num_base_bdevs": 2, 00:22:36.456 "num_base_bdevs_discovered": 2, 00:22:36.456 "num_base_bdevs_operational": 2, 00:22:36.456 "base_bdevs_list": [ 00:22:36.456 { 00:22:36.456 "name": "BaseBdev1", 00:22:36.456 "uuid": "b28b4cda-634d-40b7-a874-47b4d8f23bc8", 00:22:36.456 "is_configured": true, 00:22:36.456 "data_offset": 2048, 00:22:36.456 "data_size": 63488 00:22:36.456 }, 00:22:36.456 { 00:22:36.456 "name": "BaseBdev2", 00:22:36.456 "uuid": "9bc29cfa-5b26-4d98-8c67-26fb0993bad9", 00:22:36.456 "is_configured": true, 00:22:36.456 "data_offset": 2048, 00:22:36.456 "data_size": 63488 00:22:36.456 } 00:22:36.456 ] 00:22:36.456 } 00:22:36.456 } 00:22:36.456 }' 00:22:36.456 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:36.456 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:36.456 BaseBdev2' 00:22:36.456 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:36.456 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:36.456 12:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:36.714 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:36.714 "name": "BaseBdev1", 00:22:36.714 "aliases": [ 00:22:36.714 "b28b4cda-634d-40b7-a874-47b4d8f23bc8" 00:22:36.714 ], 00:22:36.714 "product_name": "Malloc disk", 00:22:36.714 "block_size": 512, 00:22:36.714 "num_blocks": 65536, 00:22:36.714 "uuid": "b28b4cda-634d-40b7-a874-47b4d8f23bc8", 00:22:36.714 "assigned_rate_limits": { 00:22:36.714 "rw_ios_per_sec": 0, 00:22:36.714 "rw_mbytes_per_sec": 0, 00:22:36.714 "r_mbytes_per_sec": 0, 00:22:36.714 "w_mbytes_per_sec": 0 00:22:36.714 }, 00:22:36.714 "claimed": true, 00:22:36.714 "claim_type": "exclusive_write", 00:22:36.714 "zoned": false, 00:22:36.714 "supported_io_types": { 00:22:36.714 "read": true, 00:22:36.714 "write": true, 00:22:36.714 "unmap": true, 
00:22:36.714 "write_zeroes": true, 00:22:36.714 "flush": true, 00:22:36.714 "reset": true, 00:22:36.714 "compare": false, 00:22:36.714 "compare_and_write": false, 00:22:36.714 "abort": true, 00:22:36.714 "nvme_admin": false, 00:22:36.714 "nvme_io": false 00:22:36.714 }, 00:22:36.714 "memory_domains": [ 00:22:36.714 { 00:22:36.714 "dma_device_id": "system", 00:22:36.714 "dma_device_type": 1 00:22:36.714 }, 00:22:36.714 { 00:22:36.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.714 "dma_device_type": 2 00:22:36.714 } 00:22:36.714 ], 00:22:36.714 "driver_specific": {} 00:22:36.714 }' 00:22:36.714 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.714 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:36.714 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:36.714 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:36.971 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:37.230 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:37.230 "name": "BaseBdev2", 00:22:37.230 "aliases": [ 00:22:37.230 "9bc29cfa-5b26-4d98-8c67-26fb0993bad9" 00:22:37.230 ], 00:22:37.230 "product_name": "Malloc disk", 00:22:37.230 "block_size": 512, 00:22:37.230 "num_blocks": 65536, 00:22:37.230 "uuid": "9bc29cfa-5b26-4d98-8c67-26fb0993bad9", 00:22:37.230 "assigned_rate_limits": { 00:22:37.230 "rw_ios_per_sec": 0, 00:22:37.230 "rw_mbytes_per_sec": 0, 00:22:37.230 "r_mbytes_per_sec": 0, 00:22:37.230 "w_mbytes_per_sec": 0 00:22:37.230 }, 00:22:37.230 "claimed": true, 00:22:37.230 "claim_type": "exclusive_write", 00:22:37.230 "zoned": false, 00:22:37.230 "supported_io_types": { 00:22:37.230 "read": true, 00:22:37.230 "write": true, 00:22:37.230 "unmap": true, 00:22:37.230 "write_zeroes": true, 00:22:37.230 "flush": true, 00:22:37.230 "reset": true, 00:22:37.230 "compare": false, 00:22:37.230 "compare_and_write": false, 00:22:37.230 "abort": true, 00:22:37.230 "nvme_admin": false, 00:22:37.230 "nvme_io": false 00:22:37.230 }, 00:22:37.230 "memory_domains": [ 00:22:37.230 { 00:22:37.230 "dma_device_id": "system", 00:22:37.230 "dma_device_type": 1 00:22:37.230 }, 00:22:37.230 { 00:22:37.230 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:22:37.230 "dma_device_type": 2 00:22:37.230 } 00:22:37.230 ], 00:22:37.230 "driver_specific": {} 00:22:37.230 }' 00:22:37.230 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:37.230 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:37.489 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:37.489 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:37.489 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:37.489 12:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:37.489 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:37.489 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:37.489 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:37.489 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:37.489 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:37.746 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:37.746 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:38.004 [2024-06-07 12:25:01.404654] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:38.004 [2024-06-07 12:25:01.404701] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:38.004 [2024-06-07 12:25:01.404777] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.004 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:38.262 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.262 "name": "Existed_Raid", 00:22:38.262 "uuid": "27b3d41b-ba51-4849-b676-e478d7c9fbaa", 00:22:38.262 "strip_size_kb": 64, 00:22:38.262 "state": "offline", 00:22:38.262 "raid_level": "concat", 00:22:38.262 "superblock": true, 00:22:38.262 "num_base_bdevs": 2, 00:22:38.262 "num_base_bdevs_discovered": 1, 00:22:38.262 "num_base_bdevs_operational": 1, 00:22:38.262 "base_bdevs_list": [ 00:22:38.262 { 00:22:38.262 "name": null, 00:22:38.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.262 "is_configured": false, 00:22:38.262 "data_offset": 2048, 00:22:38.262 "data_size": 63488 00:22:38.262 }, 00:22:38.262 { 00:22:38.262 "name": "BaseBdev2", 00:22:38.262 "uuid": "9bc29cfa-5b26-4d98-8c67-26fb0993bad9", 00:22:38.262 "is_configured": true, 00:22:38.262 "data_offset": 2048, 00:22:38.262 "data_size": 63488 00:22:38.262 } 00:22:38.262 ] 00:22:38.262 }' 00:22:38.262 12:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.262 12:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:38.828 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:38.828 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:38.828 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.828 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:39.087 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:39.087 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:39.087 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:39.346 [2024-06-07 12:25:02.822795] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:39.346 [2024-06-07 12:25:02.822934] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:22:39.346 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:39.346 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:39.346 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.346 12:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 
-- # '[' -n '' ']' 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 199874 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 199874 ']' 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 199874 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 199874 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 199874' 00:22:39.605 killing process with pid 199874 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 199874 00:22:39.605 [2024-06-07 12:25:03.161557] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:39.605 [2024-06-07 12:25:03.161657] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:39.605 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 199874 00:22:40.174 12:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:40.174 00:22:40.174 real 0m11.639s 00:22:40.174 user 0m20.675s 00:22:40.174 sys 0m1.936s 00:22:40.174 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:40.174 12:25:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:40.174 ************************************ 00:22:40.174 END TEST raid_state_function_test_sb 00:22:40.174 ************************************ 00:22:40.174 12:25:03 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:22:40.174 12:25:03 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:22:40.174 12:25:03 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:40.174 12:25:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:40.174 ************************************ 00:22:40.174 START TEST raid_superblock_test 00:22:40.174 ************************************ 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 2 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=200254 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 200254 /var/tmp/spdk-raid.sock 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 200254 ']' 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:40.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:40.174 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:40.174 [2024-06-07 12:25:03.613443] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
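The superblock test constructs each base bdev as a passthru ("pt") device layered on a malloc disk with a fixed UUID, as the next lines of the trace show. A sketch of that two-step construction, with the commands and UUID copied from the trace; the rpc and sock shorthand variables are illustrative:

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py   # illustrative shorthand
    sock=/var/tmp/spdk-raid.sock

    # 32 MiB malloc disk with 512-byte blocks, wrapped in passthru bdev pt1
    # using the fixed UUID from the test's base_bdevs_pt_uuid list.
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b malloc1
    "$rpc" -s "$sock" bdev_passthru_create -b malloc1 -p pt1 \
        -u 00000000-0000-0000-0000-000000000001
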
00:22:40.174 [2024-06-07 12:25:03.613664] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200254 ] 00:22:40.174 [2024-06-07 12:25:03.749619] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:40.433 [2024-06-07 12:25:03.840549] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:40.433 [2024-06-07 12:25:03.919096] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:40.433 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:40.434 12:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:40.692 malloc1 00:22:40.692 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:40.951 [2024-06-07 12:25:04.503253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:40.951 [2024-06-07 12:25:04.503418] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.951 [2024-06-07 12:25:04.503481] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:22:40.951 [2024-06-07 12:25:04.503560] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.951 [2024-06-07 12:25:04.505966] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.951 [2024-06-07 12:25:04.506050] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:40.951 pt1 00:22:40.951 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:40.951 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:40.952 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:40.952 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:40.952 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:40.952 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:22:40.952 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:40.952 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:40.952 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:41.210 malloc2 00:22:41.210 12:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:41.469 [2024-06-07 12:25:05.011465] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:41.469 [2024-06-07 12:25:05.011589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:41.469 [2024-06-07 12:25:05.011635] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:22:41.469 [2024-06-07 12:25:05.011702] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:41.469 [2024-06-07 12:25:05.014013] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:41.469 [2024-06-07 12:25:05.014067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:41.469 pt2 00:22:41.469 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:41.469 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:41.469 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:22:41.727 [2024-06-07 12:25:05.303522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:41.727 [2024-06-07 12:25:05.305597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:41.727 [2024-06-07 12:25:05.305756] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006c80 00:22:41.727 [2024-06-07 12:25:05.305769] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:41.727 [2024-06-07 12:25:05.305908] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:22:41.727 [2024-06-07 12:25:05.306260] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006c80 00:22:41.727 [2024-06-07 12:25:05.306278] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000006c80 00:22:41.727 [2024-06-07 12:25:05.306400] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.727 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.985 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.985 "name": "raid_bdev1", 00:22:41.985 "uuid": "7650b3a7-8a54-4ab5-bc15-986e05a72b53", 00:22:41.985 "strip_size_kb": 64, 00:22:41.985 "state": "online", 00:22:41.985 "raid_level": "concat", 00:22:41.985 "superblock": true, 00:22:41.985 "num_base_bdevs": 2, 00:22:41.985 "num_base_bdevs_discovered": 2, 00:22:41.985 "num_base_bdevs_operational": 2, 00:22:41.985 "base_bdevs_list": [ 00:22:41.985 { 00:22:41.985 "name": "pt1", 00:22:41.985 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:41.985 "is_configured": true, 00:22:41.985 "data_offset": 2048, 00:22:41.985 "data_size": 63488 00:22:41.985 }, 00:22:41.985 { 00:22:41.985 "name": "pt2", 00:22:41.985 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:41.985 "is_configured": true, 00:22:41.985 "data_offset": 2048, 00:22:41.985 "data_size": 63488 00:22:41.985 } 00:22:41.985 ] 00:22:41.985 }' 00:22:41.985 12:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.985 12:25:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:42.550 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:42.808 [2024-06-07 12:25:06.395765] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:42.808 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:42.808 "name": "raid_bdev1", 00:22:42.808 "aliases": [ 00:22:42.808 "7650b3a7-8a54-4ab5-bc15-986e05a72b53" 00:22:42.808 ], 00:22:42.808 "product_name": "Raid Volume", 00:22:42.808 "block_size": 512, 00:22:42.808 "num_blocks": 126976, 00:22:42.808 "uuid": "7650b3a7-8a54-4ab5-bc15-986e05a72b53", 00:22:42.808 "assigned_rate_limits": { 00:22:42.808 "rw_ios_per_sec": 0, 00:22:42.808 "rw_mbytes_per_sec": 0, 00:22:42.808 "r_mbytes_per_sec": 0, 00:22:42.808 "w_mbytes_per_sec": 0 00:22:42.808 }, 
00:22:42.808 "claimed": false, 00:22:42.808 "zoned": false, 00:22:42.808 "supported_io_types": { 00:22:42.808 "read": true, 00:22:42.808 "write": true, 00:22:42.808 "unmap": true, 00:22:42.808 "write_zeroes": true, 00:22:42.808 "flush": true, 00:22:42.808 "reset": true, 00:22:42.808 "compare": false, 00:22:42.808 "compare_and_write": false, 00:22:42.808 "abort": false, 00:22:42.808 "nvme_admin": false, 00:22:42.808 "nvme_io": false 00:22:42.808 }, 00:22:42.808 "memory_domains": [ 00:22:42.808 { 00:22:42.808 "dma_device_id": "system", 00:22:42.808 "dma_device_type": 1 00:22:42.808 }, 00:22:42.808 { 00:22:42.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.808 "dma_device_type": 2 00:22:42.808 }, 00:22:42.808 { 00:22:42.808 "dma_device_id": "system", 00:22:42.808 "dma_device_type": 1 00:22:42.808 }, 00:22:42.808 { 00:22:42.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.809 "dma_device_type": 2 00:22:42.809 } 00:22:42.809 ], 00:22:42.809 "driver_specific": { 00:22:42.809 "raid": { 00:22:42.809 "uuid": "7650b3a7-8a54-4ab5-bc15-986e05a72b53", 00:22:42.809 "strip_size_kb": 64, 00:22:42.809 "state": "online", 00:22:42.809 "raid_level": "concat", 00:22:42.809 "superblock": true, 00:22:42.809 "num_base_bdevs": 2, 00:22:42.809 "num_base_bdevs_discovered": 2, 00:22:42.809 "num_base_bdevs_operational": 2, 00:22:42.809 "base_bdevs_list": [ 00:22:42.809 { 00:22:42.809 "name": "pt1", 00:22:42.809 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:42.809 "is_configured": true, 00:22:42.809 "data_offset": 2048, 00:22:42.809 "data_size": 63488 00:22:42.809 }, 00:22:42.809 { 00:22:42.809 "name": "pt2", 00:22:42.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:42.809 "is_configured": true, 00:22:42.809 "data_offset": 2048, 00:22:42.809 "data_size": 63488 00:22:42.809 } 00:22:42.809 ] 00:22:42.809 } 00:22:42.809 } 00:22:42.809 }' 00:22:42.809 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:43.066 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:43.066 pt2' 00:22:43.066 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:43.066 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:43.066 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:43.066 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:43.066 "name": "pt1", 00:22:43.066 "aliases": [ 00:22:43.066 "00000000-0000-0000-0000-000000000001" 00:22:43.066 ], 00:22:43.066 "product_name": "passthru", 00:22:43.066 "block_size": 512, 00:22:43.066 "num_blocks": 65536, 00:22:43.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:43.066 "assigned_rate_limits": { 00:22:43.066 "rw_ios_per_sec": 0, 00:22:43.066 "rw_mbytes_per_sec": 0, 00:22:43.066 "r_mbytes_per_sec": 0, 00:22:43.066 "w_mbytes_per_sec": 0 00:22:43.066 }, 00:22:43.066 "claimed": true, 00:22:43.066 "claim_type": "exclusive_write", 00:22:43.067 "zoned": false, 00:22:43.067 "supported_io_types": { 00:22:43.067 "read": true, 00:22:43.067 "write": true, 00:22:43.067 "unmap": true, 00:22:43.067 "write_zeroes": true, 00:22:43.067 "flush": true, 00:22:43.067 "reset": true, 00:22:43.067 "compare": false, 00:22:43.067 "compare_and_write": false, 00:22:43.067 "abort": true, 00:22:43.067 
"nvme_admin": false, 00:22:43.067 "nvme_io": false 00:22:43.067 }, 00:22:43.067 "memory_domains": [ 00:22:43.067 { 00:22:43.067 "dma_device_id": "system", 00:22:43.067 "dma_device_type": 1 00:22:43.067 }, 00:22:43.067 { 00:22:43.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.067 "dma_device_type": 2 00:22:43.067 } 00:22:43.067 ], 00:22:43.067 "driver_specific": { 00:22:43.067 "passthru": { 00:22:43.067 "name": "pt1", 00:22:43.067 "base_bdev_name": "malloc1" 00:22:43.067 } 00:22:43.067 } 00:22:43.067 }' 00:22:43.067 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:43.336 12:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.595 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.595 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:43.595 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:43.595 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:43.595 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:43.852 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:43.852 "name": "pt2", 00:22:43.852 "aliases": [ 00:22:43.852 "00000000-0000-0000-0000-000000000002" 00:22:43.852 ], 00:22:43.852 "product_name": "passthru", 00:22:43.852 "block_size": 512, 00:22:43.852 "num_blocks": 65536, 00:22:43.852 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:43.852 "assigned_rate_limits": { 00:22:43.852 "rw_ios_per_sec": 0, 00:22:43.852 "rw_mbytes_per_sec": 0, 00:22:43.852 "r_mbytes_per_sec": 0, 00:22:43.852 "w_mbytes_per_sec": 0 00:22:43.852 }, 00:22:43.852 "claimed": true, 00:22:43.852 "claim_type": "exclusive_write", 00:22:43.852 "zoned": false, 00:22:43.852 "supported_io_types": { 00:22:43.852 "read": true, 00:22:43.852 "write": true, 00:22:43.852 "unmap": true, 00:22:43.852 "write_zeroes": true, 00:22:43.852 "flush": true, 00:22:43.852 "reset": true, 00:22:43.852 "compare": false, 00:22:43.852 "compare_and_write": false, 00:22:43.852 "abort": true, 00:22:43.852 "nvme_admin": false, 00:22:43.852 "nvme_io": false 00:22:43.852 }, 00:22:43.852 "memory_domains": [ 00:22:43.852 { 00:22:43.852 "dma_device_id": "system", 00:22:43.852 "dma_device_type": 1 00:22:43.852 }, 00:22:43.852 { 00:22:43.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.852 "dma_device_type": 2 00:22:43.852 } 00:22:43.852 ], 00:22:43.852 "driver_specific": { 00:22:43.852 "passthru": { 00:22:43.852 "name": "pt2", 00:22:43.852 
"base_bdev_name": "malloc2" 00:22:43.852 } 00:22:43.852 } 00:22:43.852 }' 00:22:43.852 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.852 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.852 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:43.852 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.852 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:44.111 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:44.369 [2024-06-07 12:25:07.880000] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:44.369 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7650b3a7-8a54-4ab5-bc15-986e05a72b53 00:22:44.369 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7650b3a7-8a54-4ab5-bc15-986e05a72b53 ']' 00:22:44.369 12:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:44.628 [2024-06-07 12:25:08.099817] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:44.628 [2024-06-07 12:25:08.100096] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:44.628 [2024-06-07 12:25:08.100307] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:44.628 [2024-06-07 12:25:08.100511] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:44.628 [2024-06-07 12:25:08.100591] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006c80 name raid_bdev1, state offline 00:22:44.628 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.628 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:44.886 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:44.886 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:44.886 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:44.886 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 
00:22:45.144 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:45.144 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:45.402 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:45.402 12:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:22:45.664 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:22:45.922 [2024-06-07 12:25:09.428030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:45.922 [2024-06-07 12:25:09.430359] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:45.922 [2024-06-07 12:25:09.430563] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:45.922 [2024-06-07 12:25:09.430765] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:45.922 [2024-06-07 12:25:09.430921] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:45.922 [2024-06-07 12:25:09.431031] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state configuring 00:22:45.922 request: 00:22:45.922 { 00:22:45.922 "name": "raid_bdev1", 00:22:45.922 "raid_level": "concat", 
00:22:45.922 "base_bdevs": [ 00:22:45.922 "malloc1", 00:22:45.922 "malloc2" 00:22:45.922 ], 00:22:45.922 "superblock": false, 00:22:45.922 "strip_size_kb": 64, 00:22:45.922 "method": "bdev_raid_create", 00:22:45.922 "req_id": 1 00:22:45.922 } 00:22:45.922 Got JSON-RPC error response 00:22:45.922 response: 00:22:45.922 { 00:22:45.922 "code": -17, 00:22:45.922 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:45.922 } 00:22:45.922 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:22:45.922 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:22:45.922 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:22:45.922 12:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:22:45.922 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.922 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:46.180 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:46.180 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:46.180 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:46.438 [2024-06-07 12:25:09.900045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:46.438 [2024-06-07 12:25:09.900461] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:46.438 [2024-06-07 12:25:09.900627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:22:46.438 [2024-06-07 12:25:09.900778] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:46.438 [2024-06-07 12:25:09.903271] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:46.438 [2024-06-07 12:25:09.903483] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:46.438 [2024-06-07 12:25:09.903696] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:46.438 [2024-06-07 12:25:09.903891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:46.438 pt1 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.438 12:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.696 12:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:46.696 "name": "raid_bdev1", 00:22:46.696 "uuid": "7650b3a7-8a54-4ab5-bc15-986e05a72b53", 00:22:46.696 "strip_size_kb": 64, 00:22:46.696 "state": "configuring", 00:22:46.696 "raid_level": "concat", 00:22:46.696 "superblock": true, 00:22:46.696 "num_base_bdevs": 2, 00:22:46.696 "num_base_bdevs_discovered": 1, 00:22:46.696 "num_base_bdevs_operational": 2, 00:22:46.696 "base_bdevs_list": [ 00:22:46.696 { 00:22:46.696 "name": "pt1", 00:22:46.696 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:46.696 "is_configured": true, 00:22:46.696 "data_offset": 2048, 00:22:46.696 "data_size": 63488 00:22:46.696 }, 00:22:46.696 { 00:22:46.696 "name": null, 00:22:46.696 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:46.696 "is_configured": false, 00:22:46.696 "data_offset": 2048, 00:22:46.696 "data_size": 63488 00:22:46.696 } 00:22:46.696 ] 00:22:46.696 }' 00:22:46.696 12:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:46.696 12:25:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:47.262 12:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:22:47.262 12:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:47.262 12:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:47.262 12:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:47.520 [2024-06-07 12:25:11.068392] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:47.520 [2024-06-07 12:25:11.068755] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.520 [2024-06-07 12:25:11.068906] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008180 00:22:47.520 [2024-06-07 12:25:11.069015] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.520 [2024-06-07 12:25:11.069493] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.520 [2024-06-07 12:25:11.069654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:47.520 [2024-06-07 12:25:11.069821] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:47.520 [2024-06-07 12:25:11.069930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:47.520 [2024-06-07 12:25:11.070112] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:22:47.520 [2024-06-07 12:25:11.070211] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:47.520 [2024-06-07 12:25:11.070374] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002530 00:22:47.520 [2024-06-07 12:25:11.070718] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:22:47.520 [2024-06-07 12:25:11.070825] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:22:47.520 [2024-06-07 12:25:11.071011] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:47.520 pt2 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.521 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.779 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:47.779 "name": "raid_bdev1", 00:22:47.779 "uuid": "7650b3a7-8a54-4ab5-bc15-986e05a72b53", 00:22:47.779 "strip_size_kb": 64, 00:22:47.779 "state": "online", 00:22:47.779 "raid_level": "concat", 00:22:47.779 "superblock": true, 00:22:47.779 "num_base_bdevs": 2, 00:22:47.779 "num_base_bdevs_discovered": 2, 00:22:47.779 "num_base_bdevs_operational": 2, 00:22:47.779 "base_bdevs_list": [ 00:22:47.779 { 00:22:47.779 "name": "pt1", 00:22:47.779 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:47.779 "is_configured": true, 00:22:47.779 "data_offset": 2048, 00:22:47.779 "data_size": 63488 00:22:47.779 }, 00:22:47.779 { 00:22:47.779 "name": "pt2", 00:22:47.779 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:47.779 "is_configured": true, 00:22:47.779 "data_offset": 2048, 00:22:47.779 "data_size": 63488 00:22:47.779 } 00:22:47.779 ] 00:22:47.779 }' 00:22:47.779 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:47.779 12:25:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:48.347 12:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:48.620 [2024-06-07 12:25:12.100718] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:48.620 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:48.620 "name": "raid_bdev1", 00:22:48.620 "aliases": [ 00:22:48.620 "7650b3a7-8a54-4ab5-bc15-986e05a72b53" 00:22:48.620 ], 00:22:48.620 "product_name": "Raid Volume", 00:22:48.620 "block_size": 512, 00:22:48.620 "num_blocks": 126976, 00:22:48.620 "uuid": "7650b3a7-8a54-4ab5-bc15-986e05a72b53", 00:22:48.620 "assigned_rate_limits": { 00:22:48.620 "rw_ios_per_sec": 0, 00:22:48.620 "rw_mbytes_per_sec": 0, 00:22:48.620 "r_mbytes_per_sec": 0, 00:22:48.620 "w_mbytes_per_sec": 0 00:22:48.620 }, 00:22:48.620 "claimed": false, 00:22:48.620 "zoned": false, 00:22:48.620 "supported_io_types": { 00:22:48.620 "read": true, 00:22:48.620 "write": true, 00:22:48.620 "unmap": true, 00:22:48.620 "write_zeroes": true, 00:22:48.620 "flush": true, 00:22:48.620 "reset": true, 00:22:48.620 "compare": false, 00:22:48.620 "compare_and_write": false, 00:22:48.620 "abort": false, 00:22:48.620 "nvme_admin": false, 00:22:48.620 "nvme_io": false 00:22:48.620 }, 00:22:48.620 "memory_domains": [ 00:22:48.620 { 00:22:48.620 "dma_device_id": "system", 00:22:48.620 "dma_device_type": 1 00:22:48.620 }, 00:22:48.620 { 00:22:48.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.620 "dma_device_type": 2 00:22:48.620 }, 00:22:48.620 { 00:22:48.620 "dma_device_id": "system", 00:22:48.620 "dma_device_type": 1 00:22:48.620 }, 00:22:48.620 { 00:22:48.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.620 "dma_device_type": 2 00:22:48.620 } 00:22:48.620 ], 00:22:48.620 "driver_specific": { 00:22:48.620 "raid": { 00:22:48.620 "uuid": "7650b3a7-8a54-4ab5-bc15-986e05a72b53", 00:22:48.620 "strip_size_kb": 64, 00:22:48.620 "state": "online", 00:22:48.620 "raid_level": "concat", 00:22:48.620 "superblock": true, 00:22:48.620 "num_base_bdevs": 2, 00:22:48.620 "num_base_bdevs_discovered": 2, 00:22:48.620 "num_base_bdevs_operational": 2, 00:22:48.620 "base_bdevs_list": [ 00:22:48.620 { 00:22:48.620 "name": "pt1", 00:22:48.620 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:48.620 "is_configured": true, 00:22:48.620 "data_offset": 2048, 00:22:48.620 "data_size": 63488 00:22:48.620 }, 00:22:48.620 { 00:22:48.620 "name": "pt2", 00:22:48.620 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:48.620 "is_configured": true, 00:22:48.620 "data_offset": 2048, 00:22:48.620 "data_size": 63488 00:22:48.620 } 00:22:48.620 ] 00:22:48.620 } 00:22:48.620 } 00:22:48.620 }' 00:22:48.620 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:48.621 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:48.621 pt2' 00:22:48.621 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:48.621 12:25:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:48.621 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:48.897 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:48.897 "name": "pt1", 00:22:48.897 "aliases": [ 00:22:48.897 "00000000-0000-0000-0000-000000000001" 00:22:48.897 ], 00:22:48.897 "product_name": "passthru", 00:22:48.897 "block_size": 512, 00:22:48.897 "num_blocks": 65536, 00:22:48.897 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:48.897 "assigned_rate_limits": { 00:22:48.897 "rw_ios_per_sec": 0, 00:22:48.897 "rw_mbytes_per_sec": 0, 00:22:48.897 "r_mbytes_per_sec": 0, 00:22:48.897 "w_mbytes_per_sec": 0 00:22:48.897 }, 00:22:48.897 "claimed": true, 00:22:48.897 "claim_type": "exclusive_write", 00:22:48.897 "zoned": false, 00:22:48.897 "supported_io_types": { 00:22:48.897 "read": true, 00:22:48.897 "write": true, 00:22:48.897 "unmap": true, 00:22:48.897 "write_zeroes": true, 00:22:48.897 "flush": true, 00:22:48.897 "reset": true, 00:22:48.897 "compare": false, 00:22:48.897 "compare_and_write": false, 00:22:48.897 "abort": true, 00:22:48.897 "nvme_admin": false, 00:22:48.897 "nvme_io": false 00:22:48.897 }, 00:22:48.897 "memory_domains": [ 00:22:48.897 { 00:22:48.897 "dma_device_id": "system", 00:22:48.897 "dma_device_type": 1 00:22:48.897 }, 00:22:48.897 { 00:22:48.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.897 "dma_device_type": 2 00:22:48.897 } 00:22:48.897 ], 00:22:48.897 "driver_specific": { 00:22:48.897 "passthru": { 00:22:48.897 "name": "pt1", 00:22:48.897 "base_bdev_name": "malloc1" 00:22:48.897 } 00:22:48.897 } 00:22:48.897 }' 00:22:48.897 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.897 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.156 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.415 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:49.415 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:49.415 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:49.415 12:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:49.415 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:49.415 "name": "pt2", 00:22:49.415 "aliases": [ 00:22:49.415 "00000000-0000-0000-0000-000000000002" 
00:22:49.415 ], 00:22:49.415 "product_name": "passthru", 00:22:49.415 "block_size": 512, 00:22:49.415 "num_blocks": 65536, 00:22:49.415 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:49.415 "assigned_rate_limits": { 00:22:49.415 "rw_ios_per_sec": 0, 00:22:49.415 "rw_mbytes_per_sec": 0, 00:22:49.415 "r_mbytes_per_sec": 0, 00:22:49.415 "w_mbytes_per_sec": 0 00:22:49.415 }, 00:22:49.415 "claimed": true, 00:22:49.415 "claim_type": "exclusive_write", 00:22:49.415 "zoned": false, 00:22:49.415 "supported_io_types": { 00:22:49.415 "read": true, 00:22:49.415 "write": true, 00:22:49.415 "unmap": true, 00:22:49.415 "write_zeroes": true, 00:22:49.415 "flush": true, 00:22:49.415 "reset": true, 00:22:49.415 "compare": false, 00:22:49.415 "compare_and_write": false, 00:22:49.415 "abort": true, 00:22:49.415 "nvme_admin": false, 00:22:49.415 "nvme_io": false 00:22:49.415 }, 00:22:49.415 "memory_domains": [ 00:22:49.415 { 00:22:49.415 "dma_device_id": "system", 00:22:49.415 "dma_device_type": 1 00:22:49.415 }, 00:22:49.415 { 00:22:49.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:49.415 "dma_device_type": 2 00:22:49.415 } 00:22:49.415 ], 00:22:49.415 "driver_specific": { 00:22:49.415 "passthru": { 00:22:49.415 "name": "pt2", 00:22:49.415 "base_bdev_name": "malloc2" 00:22:49.415 } 00:22:49.415 } 00:22:49.415 }' 00:22:49.415 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.673 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.673 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:49.673 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.673 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.673 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:49.673 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.673 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.930 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.930 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.930 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.930 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:49.930 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:49.930 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:50.188 [2024-06-07 12:25:13.676915] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:50.188 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7650b3a7-8a54-4ab5-bc15-986e05a72b53 '!=' 7650b3a7-8a54-4ab5-bc15-986e05a72b53 ']' 00:22:50.188 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:22:50.188 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:50.188 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:50.188 12:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 200254 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@949 -- # '[' -z 200254 ']' 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 200254 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 200254 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:50.189 killing process with pid 200254 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 200254' 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 200254 00:22:50.189 [2024-06-07 12:25:13.727586] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:50.189 [2024-06-07 12:25:13.727674] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:50.189 12:25:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 200254 00:22:50.189 [2024-06-07 12:25:13.727725] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:50.189 [2024-06-07 12:25:13.727736] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:22:50.189 [2024-06-07 12:25:13.771441] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:50.754 12:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:50.754 00:22:50.754 real 0m10.540s 00:22:50.754 user 0m18.985s 00:22:50.754 sys 0m1.889s 00:22:50.754 12:25:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:50.754 12:25:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.754 ************************************ 00:22:50.754 END TEST raid_superblock_test 00:22:50.754 ************************************ 00:22:50.754 12:25:14 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:22:50.754 12:25:14 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:22:50.754 12:25:14 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:50.754 12:25:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:50.754 ************************************ 00:22:50.754 START TEST raid_read_error_test 00:22:50.754 ************************************ 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 read 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:22:50.754 
12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Couazhikin 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=200612 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 200612 /var/tmp/spdk-raid.sock 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 200612 ']' 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:50.754 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:50.754 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:50.755 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:50.755 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.755 [2024-06-07 12:25:14.251028] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
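Reference sketch (not part of the captured trace): raid_io_error_test stacks each base bdev as malloc -> error bdev -> passthru, so that failures can be injected beneath the raid while bdevperf runs against it. The chain for BaseBdev1, using only calls that appear verbatim in the trace below (BaseBdev2 repeats the same three steps), is:
  RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
  $RPC -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  # the error bdev is exposed under the EE_ prefix, as seen in the trace
  $RPC -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  $RPC -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # ... same three calls for BaseBdev2, then assemble the raid under test:
  $RPC -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
  # once bdevperf is running, read failures are injected on the first leg:
  $RPC -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure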
00:22:50.755 [2024-06-07 12:25:14.251324] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200612 ] 00:22:50.755 [2024-06-07 12:25:14.392680] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:51.012 [2024-06-07 12:25:14.484491] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:51.012 [2024-06-07 12:25:14.565448] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.012 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:51.012 12:25:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:22:51.012 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:51.012 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:51.577 BaseBdev1_malloc 00:22:51.577 12:25:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:51.835 true 00:22:51.835 12:25:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:52.093 [2024-06-07 12:25:15.491579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:52.093 [2024-06-07 12:25:15.492000] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.093 [2024-06-07 12:25:15.492092] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:22:52.093 [2024-06-07 12:25:15.492280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.093 [2024-06-07 12:25:15.494715] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.093 [2024-06-07 12:25:15.494921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:52.093 BaseBdev1 00:22:52.093 12:25:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:52.093 12:25:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:52.093 BaseBdev2_malloc 00:22:52.350 12:25:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:52.608 true 00:22:52.608 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:52.866 [2024-06-07 12:25:16.303515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:52.866 [2024-06-07 12:25:16.303871] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.866 [2024-06-07 12:25:16.304020] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:22:52.866 [2024-06-07 12:25:16.304156] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.866 [2024-06-07 12:25:16.306506] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.866 [2024-06-07 12:25:16.306675] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:52.866 BaseBdev2 00:22:52.866 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:22:53.124 [2024-06-07 12:25:16.515654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:53.124 [2024-06-07 12:25:16.517832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:53.124 [2024-06-07 12:25:16.518122] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007280 00:22:53.124 [2024-06-07 12:25:16.518224] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:53.124 [2024-06-07 12:25:16.518417] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:22:53.124 [2024-06-07 12:25:16.518838] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007280 00:22:53.124 [2024-06-07 12:25:16.518888] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007280 00:22:53.124 [2024-06-07 12:25:16.519118] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.124 "name": "raid_bdev1", 00:22:53.124 "uuid": "0f373443-022d-4548-98cb-adc003f37b6e", 00:22:53.124 "strip_size_kb": 64, 00:22:53.124 "state": "online", 00:22:53.124 "raid_level": "concat", 00:22:53.124 "superblock": true, 00:22:53.124 "num_base_bdevs": 2, 00:22:53.124 "num_base_bdevs_discovered": 2, 00:22:53.124 "num_base_bdevs_operational": 2, 00:22:53.124 "base_bdevs_list": [ 00:22:53.124 { 00:22:53.124 "name": "BaseBdev1", 00:22:53.124 "uuid": 
"1271c530-f8d2-5b3d-b929-c315d399a79f", 00:22:53.124 "is_configured": true, 00:22:53.124 "data_offset": 2048, 00:22:53.124 "data_size": 63488 00:22:53.124 }, 00:22:53.124 { 00:22:53.124 "name": "BaseBdev2", 00:22:53.124 "uuid": "8994397f-d92f-5f0b-9d60-98a0c52ce727", 00:22:53.124 "is_configured": true, 00:22:53.124 "data_offset": 2048, 00:22:53.124 "data_size": 63488 00:22:53.124 } 00:22:53.124 ] 00:22:53.124 }' 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.124 12:25:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:53.691 12:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:53.691 12:25:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:53.948 [2024-06-07 12:25:17.403982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:22:54.880 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.138 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.396 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.396 "name": "raid_bdev1", 00:22:55.396 "uuid": "0f373443-022d-4548-98cb-adc003f37b6e", 00:22:55.396 "strip_size_kb": 64, 00:22:55.396 "state": "online", 00:22:55.396 "raid_level": "concat", 00:22:55.396 "superblock": true, 00:22:55.396 "num_base_bdevs": 2, 00:22:55.396 "num_base_bdevs_discovered": 2, 00:22:55.396 "num_base_bdevs_operational": 2, 00:22:55.396 "base_bdevs_list": [ 00:22:55.396 { 00:22:55.396 "name": "BaseBdev1", 00:22:55.396 "uuid": 
"1271c530-f8d2-5b3d-b929-c315d399a79f", 00:22:55.396 "is_configured": true, 00:22:55.396 "data_offset": 2048, 00:22:55.396 "data_size": 63488 00:22:55.396 }, 00:22:55.396 { 00:22:55.396 "name": "BaseBdev2", 00:22:55.396 "uuid": "8994397f-d92f-5f0b-9d60-98a0c52ce727", 00:22:55.396 "is_configured": true, 00:22:55.396 "data_offset": 2048, 00:22:55.396 "data_size": 63488 00:22:55.396 } 00:22:55.396 ] 00:22:55.396 }' 00:22:55.396 12:25:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.396 12:25:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:55.963 12:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:56.222 [2024-06-07 12:25:19.803261] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:56.222 [2024-06-07 12:25:19.803570] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:56.222 [2024-06-07 12:25:19.804801] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:56.222 [2024-06-07 12:25:19.804956] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:56.222 [2024-06-07 12:25:19.805012] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:56.222 [2024-06-07 12:25:19.805206] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state offline 00:22:56.222 0 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 200612 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 200612 ']' 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 200612 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 200612 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:56.222 killing process with pid 200612 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 200612' 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 200612 00:22:56.222 [2024-06-07 12:25:19.862008] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:56.222 12:25:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 200612 00:22:56.480 [2024-06-07 12:25:19.890692] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Couazhikin 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:22:56.738 00:22:56.738 real 0m6.070s 00:22:56.738 user 0m9.613s 00:22:56.738 sys 0m1.007s 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:56.738 12:25:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:56.738 ************************************ 00:22:56.738 END TEST raid_read_error_test 00:22:56.738 ************************************ 00:22:56.738 12:25:20 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:22:56.738 12:25:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:22:56.738 12:25:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:56.738 12:25:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:56.738 ************************************ 00:22:56.738 START TEST raid_write_error_test 00:22:56.738 ************************************ 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 write 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:56.738 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:56.739 12:25:20 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.SopRvSfTtf 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=200788 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 200788 /var/tmp/spdk-raid.sock 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 200788 ']' 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:56.739 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:56.739 12:25:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:56.739 [2024-06-07 12:25:20.376625] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:22:56.739 [2024-06-07 12:25:20.376861] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200788 ] 00:22:56.995 [2024-06-07 12:25:20.514290] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:56.995 [2024-06-07 12:25:20.597228] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:57.251 [2024-06-07 12:25:20.675938] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:57.816 12:25:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:57.816 12:25:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:22:57.816 12:25:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:57.816 12:25:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:58.075 BaseBdev1_malloc 00:22:58.075 12:25:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:58.333 true 00:22:58.333 12:25:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:58.591 [2024-06-07 12:25:22.000280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:58.591 [2024-06-07 12:25:22.000387] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:22:58.591 [2024-06-07 12:25:22.000445] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:22:58.591 [2024-06-07 12:25:22.000515] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:58.591 [2024-06-07 12:25:22.003130] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:58.591 [2024-06-07 12:25:22.003321] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:58.591 BaseBdev1 00:22:58.591 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:58.591 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:58.850 BaseBdev2_malloc 00:22:58.850 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:58.850 true 00:22:58.850 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:59.108 [2024-06-07 12:25:22.707889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:59.108 [2024-06-07 12:25:22.707999] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:59.108 [2024-06-07 12:25:22.708049] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:22:59.108 [2024-06-07 12:25:22.708100] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:59.108 [2024-06-07 12:25:22.710272] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:59.108 [2024-06-07 12:25:22.710341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:59.108 BaseBdev2 00:22:59.108 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:22:59.366 [2024-06-07 12:25:22.927996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:59.366 [2024-06-07 12:25:22.929989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:59.366 [2024-06-07 12:25:22.930144] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007280 00:22:59.366 [2024-06-07 12:25:22.930157] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:22:59.366 [2024-06-07 12:25:22.930299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:22:59.366 [2024-06-07 12:25:22.930615] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007280 00:22:59.366 [2024-06-07 12:25:22.930632] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007280 00:22:59.366 [2024-06-07 12:25:22.930762] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- 
# local raid_bdev_name=raid_bdev1 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.366 12:25:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.624 12:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.624 "name": "raid_bdev1", 00:22:59.624 "uuid": "0cc0f7d8-4d60-4ee7-9086-195f26555e0c", 00:22:59.624 "strip_size_kb": 64, 00:22:59.624 "state": "online", 00:22:59.624 "raid_level": "concat", 00:22:59.624 "superblock": true, 00:22:59.624 "num_base_bdevs": 2, 00:22:59.624 "num_base_bdevs_discovered": 2, 00:22:59.624 "num_base_bdevs_operational": 2, 00:22:59.624 "base_bdevs_list": [ 00:22:59.624 { 00:22:59.624 "name": "BaseBdev1", 00:22:59.624 "uuid": "63b955ee-2ea2-574b-b878-64c3958fd575", 00:22:59.624 "is_configured": true, 00:22:59.624 "data_offset": 2048, 00:22:59.624 "data_size": 63488 00:22:59.624 }, 00:22:59.624 { 00:22:59.624 "name": "BaseBdev2", 00:22:59.624 "uuid": "f1bc447d-d120-5150-8d3c-883ecc792ddb", 00:22:59.624 "is_configured": true, 00:22:59.624 "data_offset": 2048, 00:22:59.624 "data_size": 63488 00:22:59.624 } 00:22:59.624 ] 00:22:59.624 }' 00:22:59.624 12:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.624 12:25:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:00.583 12:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:00.583 12:25:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:00.583 [2024-06-07 12:25:24.020392] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:23:01.516 12:25:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=raid_bdev1 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.774 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.032 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.032 "name": "raid_bdev1", 00:23:02.032 "uuid": "0cc0f7d8-4d60-4ee7-9086-195f26555e0c", 00:23:02.032 "strip_size_kb": 64, 00:23:02.032 "state": "online", 00:23:02.032 "raid_level": "concat", 00:23:02.032 "superblock": true, 00:23:02.032 "num_base_bdevs": 2, 00:23:02.032 "num_base_bdevs_discovered": 2, 00:23:02.032 "num_base_bdevs_operational": 2, 00:23:02.032 "base_bdevs_list": [ 00:23:02.032 { 00:23:02.032 "name": "BaseBdev1", 00:23:02.032 "uuid": "63b955ee-2ea2-574b-b878-64c3958fd575", 00:23:02.032 "is_configured": true, 00:23:02.032 "data_offset": 2048, 00:23:02.032 "data_size": 63488 00:23:02.032 }, 00:23:02.032 { 00:23:02.032 "name": "BaseBdev2", 00:23:02.032 "uuid": "f1bc447d-d120-5150-8d3c-883ecc792ddb", 00:23:02.032 "is_configured": true, 00:23:02.032 "data_offset": 2048, 00:23:02.032 "data_size": 63488 00:23:02.032 } 00:23:02.032 ] 00:23:02.032 }' 00:23:02.032 12:25:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.032 12:25:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:02.606 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:03.171 [2024-06-07 12:25:26.536838] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:03.171 [2024-06-07 12:25:26.536891] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:03.171 [2024-06-07 12:25:26.538008] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:03.172 [2024-06-07 12:25:26.538051] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:03.172 [2024-06-07 12:25:26.538076] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:03.172 [2024-06-07 12:25:26.538085] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state offline 00:23:03.172 0 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 200788 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- 
# '[' -z 200788 ']' 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 200788 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 200788 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:03.172 killing process with pid 200788 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 200788' 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 200788 00:23:03.172 [2024-06-07 12:25:26.594118] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:03.172 12:25:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 200788 00:23:03.172 [2024-06-07 12:25:26.623374] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.SopRvSfTtf 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:23:03.430 12:25:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:23:03.430 00:23:03.430 real 0m6.671s 00:23:03.430 user 0m10.413s 00:23:03.430 sys 0m1.073s 00:23:03.430 12:25:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:03.430 12:25:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:03.430 ************************************ 00:23:03.430 END TEST raid_write_error_test 00:23:03.430 ************************************ 00:23:03.430 12:25:27 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:23:03.430 12:25:27 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:23:03.430 12:25:27 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:03.430 12:25:27 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:03.430 12:25:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:03.430 ************************************ 00:23:03.430 START TEST raid_state_function_test 00:23:03.430 ************************************ 00:23:03.430 12:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 false 00:23:03.430 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:03.430 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:23:03.430 
12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:23:03.430 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=200969 00:23:03.689 Process raid pid: 200969 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 200969' 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 200969 /var/tmp/spdk-raid.sock 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 200969 ']' 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:03.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
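[Editorial sketch] The raid_state_function_test starting here exercises the raid bdev state machine entirely over JSON-RPC against the bdev_svc app launched above. A minimal sketch of the core sequence follows, reusing the socket path, repo path, RAID level, and bdev names shown in this log; it is an illustration under those assumptions, not the test script itself, and another environment would need its own paths:

    #!/usr/bin/env bash
    # Sketch only: assumes an SPDK app (here bdev_svc) is already listening on
    # /var/tmp/spdk-raid.sock, as set up by the surrounding test.
    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Creating the array before its base bdevs exist leaves it in "configuring";
    # the module logs "base bdev ... doesn't exist now" for each missing member.
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

    # Each named bdev is claimed by the waiting raid as it is registered; once
    # both exist, the raid state changes from "configuring" to "online".
    $RPC bdev_malloc_create 32 512 -b BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2

    # Inspect the array the same way verify_raid_bdev_state does below.
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

The log that follows shows exactly this progression: num_base_bdevs_discovered climbs from 0 to 2 in the dumped raid_bdev_info, and "state" flips to "online" once the second base bdev is claimed.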
00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:03.689 12:25:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:03.689 [2024-06-07 12:25:27.115607] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:23:03.689 [2024-06-07 12:25:27.116340] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:03.689 [2024-06-07 12:25:27.260673] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:03.948 [2024-06-07 12:25:27.358120] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:03.948 [2024-06-07 12:25:27.446941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:04.514 12:25:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:04.514 12:25:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:23:04.514 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:05.079 [2024-06-07 12:25:28.437840] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:05.080 [2024-06-07 12:25:28.437958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:05.080 [2024-06-07 12:25:28.437972] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:05.080 [2024-06-07 12:25:28.438000] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.080 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:05.337 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.337 "name": "Existed_Raid", 00:23:05.337 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:05.337 "strip_size_kb": 0, 00:23:05.337 "state": "configuring", 00:23:05.337 "raid_level": "raid1", 00:23:05.337 "superblock": false, 00:23:05.337 "num_base_bdevs": 2, 00:23:05.337 "num_base_bdevs_discovered": 0, 00:23:05.337 "num_base_bdevs_operational": 2, 00:23:05.337 "base_bdevs_list": [ 00:23:05.337 { 00:23:05.337 "name": "BaseBdev1", 00:23:05.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.337 "is_configured": false, 00:23:05.337 "data_offset": 0, 00:23:05.337 "data_size": 0 00:23:05.337 }, 00:23:05.337 { 00:23:05.337 "name": "BaseBdev2", 00:23:05.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.337 "is_configured": false, 00:23:05.337 "data_offset": 0, 00:23:05.337 "data_size": 0 00:23:05.337 } 00:23:05.337 ] 00:23:05.337 }' 00:23:05.337 12:25:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.337 12:25:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:05.908 12:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:05.908 [2024-06-07 12:25:29.537873] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:05.909 [2024-06-07 12:25:29.537933] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:23:06.166 12:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:06.166 [2024-06-07 12:25:29.769923] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:06.166 [2024-06-07 12:25:29.770050] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:06.166 [2024-06-07 12:25:29.770065] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:06.166 [2024-06-07 12:25:29.770111] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:06.166 12:25:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:06.424 [2024-06-07 12:25:30.014276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:06.424 BaseBdev1 00:23:06.424 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:06.424 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:23:06.424 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:06.424 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:23:06.424 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:06.424 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:06.424 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:06.681 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:06.982 [ 00:23:06.982 { 00:23:06.982 "name": "BaseBdev1", 00:23:06.982 "aliases": [ 00:23:06.982 "456b65b9-0528-4d70-9e9e-ef3b3ba9124d" 00:23:06.982 ], 00:23:06.982 "product_name": "Malloc disk", 00:23:06.982 "block_size": 512, 00:23:06.982 "num_blocks": 65536, 00:23:06.982 "uuid": "456b65b9-0528-4d70-9e9e-ef3b3ba9124d", 00:23:06.982 "assigned_rate_limits": { 00:23:06.982 "rw_ios_per_sec": 0, 00:23:06.982 "rw_mbytes_per_sec": 0, 00:23:06.982 "r_mbytes_per_sec": 0, 00:23:06.982 "w_mbytes_per_sec": 0 00:23:06.982 }, 00:23:06.982 "claimed": true, 00:23:06.982 "claim_type": "exclusive_write", 00:23:06.982 "zoned": false, 00:23:06.982 "supported_io_types": { 00:23:06.982 "read": true, 00:23:06.982 "write": true, 00:23:06.982 "unmap": true, 00:23:06.982 "write_zeroes": true, 00:23:06.982 "flush": true, 00:23:06.982 "reset": true, 00:23:06.982 "compare": false, 00:23:06.982 "compare_and_write": false, 00:23:06.982 "abort": true, 00:23:06.982 "nvme_admin": false, 00:23:06.982 "nvme_io": false 00:23:06.982 }, 00:23:06.982 "memory_domains": [ 00:23:06.982 { 00:23:06.982 "dma_device_id": "system", 00:23:06.982 "dma_device_type": 1 00:23:06.982 }, 00:23:06.982 { 00:23:06.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.982 "dma_device_type": 2 00:23:06.982 } 00:23:06.982 ], 00:23:06.982 "driver_specific": {} 00:23:06.982 } 00:23:06.982 ] 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:06.982 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.278 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:07.278 "name": "Existed_Raid", 00:23:07.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:07.278 "strip_size_kb": 0, 00:23:07.278 "state": "configuring", 00:23:07.278 "raid_level": "raid1", 00:23:07.278 "superblock": false, 00:23:07.278 "num_base_bdevs": 2, 00:23:07.278 "num_base_bdevs_discovered": 1, 00:23:07.278 "num_base_bdevs_operational": 2, 00:23:07.278 "base_bdevs_list": [ 
00:23:07.278 { 00:23:07.278 "name": "BaseBdev1", 00:23:07.278 "uuid": "456b65b9-0528-4d70-9e9e-ef3b3ba9124d", 00:23:07.278 "is_configured": true, 00:23:07.278 "data_offset": 0, 00:23:07.278 "data_size": 65536 00:23:07.278 }, 00:23:07.278 { 00:23:07.278 "name": "BaseBdev2", 00:23:07.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:07.278 "is_configured": false, 00:23:07.278 "data_offset": 0, 00:23:07.278 "data_size": 0 00:23:07.278 } 00:23:07.278 ] 00:23:07.278 }' 00:23:07.278 12:25:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:07.278 12:25:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:07.843 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:07.843 [2024-06-07 12:25:31.446517] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:07.843 [2024-06-07 12:25:31.446608] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:23:07.843 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:08.408 [2024-06-07 12:25:31.774647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:08.408 [2024-06-07 12:25:31.776820] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:08.408 [2024-06-07 12:25:31.777336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:08.408 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:08.408 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.409 12:25:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:08.667 12:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.667 
"name": "Existed_Raid", 00:23:08.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.667 "strip_size_kb": 0, 00:23:08.667 "state": "configuring", 00:23:08.667 "raid_level": "raid1", 00:23:08.667 "superblock": false, 00:23:08.667 "num_base_bdevs": 2, 00:23:08.667 "num_base_bdevs_discovered": 1, 00:23:08.667 "num_base_bdevs_operational": 2, 00:23:08.667 "base_bdevs_list": [ 00:23:08.667 { 00:23:08.667 "name": "BaseBdev1", 00:23:08.667 "uuid": "456b65b9-0528-4d70-9e9e-ef3b3ba9124d", 00:23:08.667 "is_configured": true, 00:23:08.667 "data_offset": 0, 00:23:08.667 "data_size": 65536 00:23:08.667 }, 00:23:08.667 { 00:23:08.667 "name": "BaseBdev2", 00:23:08.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.667 "is_configured": false, 00:23:08.667 "data_offset": 0, 00:23:08.667 "data_size": 0 00:23:08.667 } 00:23:08.667 ] 00:23:08.667 }' 00:23:08.667 12:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.667 12:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:09.273 12:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:09.531 [2024-06-07 12:25:32.930714] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:09.531 [2024-06-07 12:25:32.930789] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:23:09.531 [2024-06-07 12:25:32.930803] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:09.531 [2024-06-07 12:25:32.930967] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:23:09.531 [2024-06-07 12:25:32.931324] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:23:09.531 [2024-06-07 12:25:32.931347] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:23:09.531 [2024-06-07 12:25:32.931577] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:09.531 BaseBdev2 00:23:09.531 12:25:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:09.531 12:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:23:09.531 12:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:09.531 12:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:23:09.531 12:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:09.531 12:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:09.531 12:25:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:09.531 12:25:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:09.789 [ 00:23:09.789 { 00:23:09.789 "name": "BaseBdev2", 00:23:09.789 "aliases": [ 00:23:09.789 "85d6bd9d-f4db-42f9-9f62-f1861b55eacc" 00:23:09.789 ], 00:23:09.789 "product_name": "Malloc disk", 00:23:09.789 "block_size": 512, 00:23:09.789 "num_blocks": 65536, 00:23:09.789 "uuid": 
"85d6bd9d-f4db-42f9-9f62-f1861b55eacc", 00:23:09.789 "assigned_rate_limits": { 00:23:09.789 "rw_ios_per_sec": 0, 00:23:09.789 "rw_mbytes_per_sec": 0, 00:23:09.789 "r_mbytes_per_sec": 0, 00:23:09.789 "w_mbytes_per_sec": 0 00:23:09.789 }, 00:23:09.789 "claimed": true, 00:23:09.789 "claim_type": "exclusive_write", 00:23:09.789 "zoned": false, 00:23:09.789 "supported_io_types": { 00:23:09.789 "read": true, 00:23:09.789 "write": true, 00:23:09.789 "unmap": true, 00:23:09.789 "write_zeroes": true, 00:23:09.789 "flush": true, 00:23:09.789 "reset": true, 00:23:09.789 "compare": false, 00:23:09.789 "compare_and_write": false, 00:23:09.789 "abort": true, 00:23:09.789 "nvme_admin": false, 00:23:09.789 "nvme_io": false 00:23:09.789 }, 00:23:09.789 "memory_domains": [ 00:23:09.789 { 00:23:09.789 "dma_device_id": "system", 00:23:09.789 "dma_device_type": 1 00:23:09.789 }, 00:23:09.789 { 00:23:09.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:09.789 "dma_device_type": 2 00:23:09.789 } 00:23:09.789 ], 00:23:09.789 "driver_specific": {} 00:23:09.789 } 00:23:09.789 ] 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:09.789 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.354 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:10.354 "name": "Existed_Raid", 00:23:10.354 "uuid": "bf832532-9195-4d9f-8c19-1853e89596ed", 00:23:10.354 "strip_size_kb": 0, 00:23:10.354 "state": "online", 00:23:10.354 "raid_level": "raid1", 00:23:10.354 "superblock": false, 00:23:10.354 "num_base_bdevs": 2, 00:23:10.354 "num_base_bdevs_discovered": 2, 00:23:10.354 "num_base_bdevs_operational": 2, 00:23:10.354 "base_bdevs_list": [ 00:23:10.354 { 00:23:10.354 "name": "BaseBdev1", 00:23:10.354 "uuid": "456b65b9-0528-4d70-9e9e-ef3b3ba9124d", 00:23:10.354 "is_configured": true, 00:23:10.354 "data_offset": 0, 00:23:10.354 "data_size": 65536 
00:23:10.354 }, 00:23:10.354 { 00:23:10.354 "name": "BaseBdev2", 00:23:10.354 "uuid": "85d6bd9d-f4db-42f9-9f62-f1861b55eacc", 00:23:10.354 "is_configured": true, 00:23:10.354 "data_offset": 0, 00:23:10.354 "data_size": 65536 00:23:10.354 } 00:23:10.354 ] 00:23:10.354 }' 00:23:10.354 12:25:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:10.354 12:25:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:10.921 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:11.180 [2024-06-07 12:25:34.661842] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:11.180 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:11.180 "name": "Existed_Raid", 00:23:11.180 "aliases": [ 00:23:11.180 "bf832532-9195-4d9f-8c19-1853e89596ed" 00:23:11.180 ], 00:23:11.180 "product_name": "Raid Volume", 00:23:11.180 "block_size": 512, 00:23:11.180 "num_blocks": 65536, 00:23:11.180 "uuid": "bf832532-9195-4d9f-8c19-1853e89596ed", 00:23:11.180 "assigned_rate_limits": { 00:23:11.180 "rw_ios_per_sec": 0, 00:23:11.180 "rw_mbytes_per_sec": 0, 00:23:11.180 "r_mbytes_per_sec": 0, 00:23:11.180 "w_mbytes_per_sec": 0 00:23:11.180 }, 00:23:11.180 "claimed": false, 00:23:11.180 "zoned": false, 00:23:11.180 "supported_io_types": { 00:23:11.180 "read": true, 00:23:11.180 "write": true, 00:23:11.180 "unmap": false, 00:23:11.180 "write_zeroes": true, 00:23:11.180 "flush": false, 00:23:11.180 "reset": true, 00:23:11.180 "compare": false, 00:23:11.180 "compare_and_write": false, 00:23:11.180 "abort": false, 00:23:11.180 "nvme_admin": false, 00:23:11.180 "nvme_io": false 00:23:11.180 }, 00:23:11.180 "memory_domains": [ 00:23:11.180 { 00:23:11.180 "dma_device_id": "system", 00:23:11.180 "dma_device_type": 1 00:23:11.180 }, 00:23:11.180 { 00:23:11.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.180 "dma_device_type": 2 00:23:11.180 }, 00:23:11.180 { 00:23:11.180 "dma_device_id": "system", 00:23:11.180 "dma_device_type": 1 00:23:11.180 }, 00:23:11.180 { 00:23:11.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.180 "dma_device_type": 2 00:23:11.180 } 00:23:11.180 ], 00:23:11.180 "driver_specific": { 00:23:11.180 "raid": { 00:23:11.180 "uuid": "bf832532-9195-4d9f-8c19-1853e89596ed", 00:23:11.180 "strip_size_kb": 0, 00:23:11.180 "state": "online", 00:23:11.180 "raid_level": "raid1", 00:23:11.180 "superblock": false, 00:23:11.180 "num_base_bdevs": 2, 00:23:11.180 "num_base_bdevs_discovered": 2, 00:23:11.180 "num_base_bdevs_operational": 2, 00:23:11.180 "base_bdevs_list": [ 00:23:11.180 { 00:23:11.180 
"name": "BaseBdev1", 00:23:11.180 "uuid": "456b65b9-0528-4d70-9e9e-ef3b3ba9124d", 00:23:11.180 "is_configured": true, 00:23:11.180 "data_offset": 0, 00:23:11.180 "data_size": 65536 00:23:11.180 }, 00:23:11.180 { 00:23:11.180 "name": "BaseBdev2", 00:23:11.180 "uuid": "85d6bd9d-f4db-42f9-9f62-f1861b55eacc", 00:23:11.180 "is_configured": true, 00:23:11.180 "data_offset": 0, 00:23:11.180 "data_size": 65536 00:23:11.180 } 00:23:11.180 ] 00:23:11.180 } 00:23:11.180 } 00:23:11.180 }' 00:23:11.180 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:11.180 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:11.180 BaseBdev2' 00:23:11.180 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:11.180 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:11.180 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:11.479 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:11.479 "name": "BaseBdev1", 00:23:11.479 "aliases": [ 00:23:11.479 "456b65b9-0528-4d70-9e9e-ef3b3ba9124d" 00:23:11.479 ], 00:23:11.479 "product_name": "Malloc disk", 00:23:11.479 "block_size": 512, 00:23:11.479 "num_blocks": 65536, 00:23:11.479 "uuid": "456b65b9-0528-4d70-9e9e-ef3b3ba9124d", 00:23:11.479 "assigned_rate_limits": { 00:23:11.479 "rw_ios_per_sec": 0, 00:23:11.479 "rw_mbytes_per_sec": 0, 00:23:11.479 "r_mbytes_per_sec": 0, 00:23:11.479 "w_mbytes_per_sec": 0 00:23:11.479 }, 00:23:11.479 "claimed": true, 00:23:11.479 "claim_type": "exclusive_write", 00:23:11.479 "zoned": false, 00:23:11.479 "supported_io_types": { 00:23:11.479 "read": true, 00:23:11.479 "write": true, 00:23:11.479 "unmap": true, 00:23:11.479 "write_zeroes": true, 00:23:11.479 "flush": true, 00:23:11.479 "reset": true, 00:23:11.479 "compare": false, 00:23:11.479 "compare_and_write": false, 00:23:11.479 "abort": true, 00:23:11.479 "nvme_admin": false, 00:23:11.479 "nvme_io": false 00:23:11.479 }, 00:23:11.479 "memory_domains": [ 00:23:11.479 { 00:23:11.479 "dma_device_id": "system", 00:23:11.479 "dma_device_type": 1 00:23:11.479 }, 00:23:11.479 { 00:23:11.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.479 "dma_device_type": 2 00:23:11.479 } 00:23:11.479 ], 00:23:11.479 "driver_specific": {} 00:23:11.479 }' 00:23:11.479 12:25:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.479 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.479 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:11.479 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.479 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:11.737 
12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:11.737 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:11.996 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:11.996 "name": "BaseBdev2", 00:23:11.996 "aliases": [ 00:23:11.996 "85d6bd9d-f4db-42f9-9f62-f1861b55eacc" 00:23:11.996 ], 00:23:11.996 "product_name": "Malloc disk", 00:23:11.996 "block_size": 512, 00:23:11.996 "num_blocks": 65536, 00:23:11.996 "uuid": "85d6bd9d-f4db-42f9-9f62-f1861b55eacc", 00:23:11.996 "assigned_rate_limits": { 00:23:11.996 "rw_ios_per_sec": 0, 00:23:11.996 "rw_mbytes_per_sec": 0, 00:23:11.996 "r_mbytes_per_sec": 0, 00:23:11.996 "w_mbytes_per_sec": 0 00:23:11.996 }, 00:23:11.996 "claimed": true, 00:23:11.996 "claim_type": "exclusive_write", 00:23:11.996 "zoned": false, 00:23:11.996 "supported_io_types": { 00:23:11.996 "read": true, 00:23:11.996 "write": true, 00:23:11.996 "unmap": true, 00:23:11.996 "write_zeroes": true, 00:23:11.996 "flush": true, 00:23:11.996 "reset": true, 00:23:11.996 "compare": false, 00:23:11.996 "compare_and_write": false, 00:23:11.996 "abort": true, 00:23:11.996 "nvme_admin": false, 00:23:11.996 "nvme_io": false 00:23:11.996 }, 00:23:11.996 "memory_domains": [ 00:23:11.996 { 00:23:11.996 "dma_device_id": "system", 00:23:11.996 "dma_device_type": 1 00:23:11.996 }, 00:23:11.996 { 00:23:11.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.996 "dma_device_type": 2 00:23:11.996 } 00:23:11.996 ], 00:23:11.996 "driver_specific": {} 00:23:11.996 }' 00:23:11.996 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:12.255 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:12.255 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:12.255 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.255 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.255 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:12.255 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.514 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.514 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:12.514 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.514 12:25:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.514 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:12.514 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:12.772 
[2024-06-07 12:25:36.326609] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.772 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:13.029 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.029 "name": "Existed_Raid", 00:23:13.029 "uuid": "bf832532-9195-4d9f-8c19-1853e89596ed", 00:23:13.029 "strip_size_kb": 0, 00:23:13.030 "state": "online", 00:23:13.030 "raid_level": "raid1", 00:23:13.030 "superblock": false, 00:23:13.030 "num_base_bdevs": 2, 00:23:13.030 "num_base_bdevs_discovered": 1, 00:23:13.030 "num_base_bdevs_operational": 1, 00:23:13.030 "base_bdevs_list": [ 00:23:13.030 { 00:23:13.030 "name": null, 00:23:13.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.030 "is_configured": false, 00:23:13.030 "data_offset": 0, 00:23:13.030 "data_size": 65536 00:23:13.030 }, 00:23:13.030 { 00:23:13.030 "name": "BaseBdev2", 00:23:13.030 "uuid": "85d6bd9d-f4db-42f9-9f62-f1861b55eacc", 00:23:13.030 "is_configured": true, 00:23:13.030 "data_offset": 0, 00:23:13.030 "data_size": 65536 00:23:13.030 } 00:23:13.030 ] 00:23:13.030 }' 00:23:13.030 12:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.030 12:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:13.963 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:13.963 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:13.963 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.963 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:13.963 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:13.963 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:13.963 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:14.221 [2024-06-07 12:25:37.806675] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:14.221 [2024-06-07 12:25:37.807078] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:14.221 [2024-06-07 12:25:37.830052] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:14.221 [2024-06-07 12:25:37.840648] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:14.221 [2024-06-07 12:25:37.841181] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:23:14.221 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:14.221 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:14.221 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:14.221 12:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 200969 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 200969 ']' 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 200969 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 200969 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 200969' 00:23:14.788 killing process with pid 200969 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 200969 00:23:14.788 [2024-06-07 12:25:38.204040] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:14.788 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 200969 00:23:14.788 [2024-06-07 
12:25:38.204329] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:23:15.047 00:23:15.047 real 0m11.481s 00:23:15.047 user 0m20.226s 00:23:15.047 sys 0m1.905s 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:15.047 ************************************ 00:23:15.047 END TEST raid_state_function_test 00:23:15.047 ************************************ 00:23:15.047 12:25:38 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:23:15.047 12:25:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:15.047 12:25:38 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:15.047 12:25:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:15.047 ************************************ 00:23:15.047 START TEST raid_state_function_test_sb 00:23:15.047 ************************************ 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:15.047 12:25:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=201350 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 201350' 00:23:15.047 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:15.047 Process raid pid: 201350 00:23:15.048 12:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 201350 /var/tmp/spdk-raid.sock 00:23:15.048 12:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 201350 ']' 00:23:15.048 12:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:15.048 12:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:15.048 12:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:15.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:15.048 12:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:15.048 12:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:15.048 [2024-06-07 12:25:38.673276] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:23:15.048 [2024-06-07 12:25:38.673860] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:15.306 [2024-06-07 12:25:38.826096] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:15.306 [2024-06-07 12:25:38.926428] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:15.564 [2024-06-07 12:25:39.014262] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:16.130 12:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:16.130 12:25:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:23:16.130 12:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:16.696 [2024-06-07 12:25:40.044371] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:16.696 [2024-06-07 12:25:40.045150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:16.696 [2024-06-07 12:25:40.045339] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:16.696 [2024-06-07 12:25:40.045507] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.696 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:16.954 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.954 "name": "Existed_Raid", 00:23:16.954 "uuid": "dae27b18-b508-422d-97fc-b213936d16d7", 00:23:16.954 "strip_size_kb": 0, 00:23:16.954 "state": "configuring", 00:23:16.954 "raid_level": "raid1", 00:23:16.954 "superblock": true, 00:23:16.954 "num_base_bdevs": 2, 00:23:16.954 "num_base_bdevs_discovered": 0, 00:23:16.954 "num_base_bdevs_operational": 2, 
00:23:16.954 "base_bdevs_list": [ 00:23:16.954 { 00:23:16.954 "name": "BaseBdev1", 00:23:16.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.954 "is_configured": false, 00:23:16.954 "data_offset": 0, 00:23:16.954 "data_size": 0 00:23:16.954 }, 00:23:16.954 { 00:23:16.954 "name": "BaseBdev2", 00:23:16.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.954 "is_configured": false, 00:23:16.954 "data_offset": 0, 00:23:16.954 "data_size": 0 00:23:16.954 } 00:23:16.954 ] 00:23:16.954 }' 00:23:16.954 12:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.954 12:25:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:17.520 12:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:17.778 [2024-06-07 12:25:41.320384] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:17.778 [2024-06-07 12:25:41.320604] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:23:17.778 12:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:18.036 [2024-06-07 12:25:41.628517] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:18.036 [2024-06-07 12:25:41.629287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:18.036 [2024-06-07 12:25:41.629428] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:18.036 [2024-06-07 12:25:41.629570] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:18.036 12:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:18.295 [2024-06-07 12:25:41.936677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:18.295 BaseBdev1 00:23:18.610 12:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:18.610 12:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:23:18.610 12:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:18.610 12:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:23:18.610 12:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:18.610 12:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:18.610 12:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:18.884 12:25:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:19.143 [ 00:23:19.143 { 00:23:19.143 "name": "BaseBdev1", 00:23:19.143 "aliases": [ 00:23:19.143 "28c77925-d622-4d2a-9644-6522d80716d6" 00:23:19.143 ], 00:23:19.143 "product_name": 
"Malloc disk", 00:23:19.143 "block_size": 512, 00:23:19.143 "num_blocks": 65536, 00:23:19.143 "uuid": "28c77925-d622-4d2a-9644-6522d80716d6", 00:23:19.143 "assigned_rate_limits": { 00:23:19.143 "rw_ios_per_sec": 0, 00:23:19.143 "rw_mbytes_per_sec": 0, 00:23:19.143 "r_mbytes_per_sec": 0, 00:23:19.143 "w_mbytes_per_sec": 0 00:23:19.143 }, 00:23:19.143 "claimed": true, 00:23:19.143 "claim_type": "exclusive_write", 00:23:19.143 "zoned": false, 00:23:19.143 "supported_io_types": { 00:23:19.143 "read": true, 00:23:19.143 "write": true, 00:23:19.143 "unmap": true, 00:23:19.143 "write_zeroes": true, 00:23:19.143 "flush": true, 00:23:19.143 "reset": true, 00:23:19.143 "compare": false, 00:23:19.143 "compare_and_write": false, 00:23:19.143 "abort": true, 00:23:19.143 "nvme_admin": false, 00:23:19.143 "nvme_io": false 00:23:19.143 }, 00:23:19.143 "memory_domains": [ 00:23:19.143 { 00:23:19.143 "dma_device_id": "system", 00:23:19.143 "dma_device_type": 1 00:23:19.143 }, 00:23:19.143 { 00:23:19.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.143 "dma_device_type": 2 00:23:19.143 } 00:23:19.143 ], 00:23:19.143 "driver_specific": {} 00:23:19.143 } 00:23:19.143 ] 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.143 12:25:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:19.708 12:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.708 "name": "Existed_Raid", 00:23:19.708 "uuid": "5f2fe37c-cb3e-4bcb-8264-a979cb5434aa", 00:23:19.708 "strip_size_kb": 0, 00:23:19.708 "state": "configuring", 00:23:19.708 "raid_level": "raid1", 00:23:19.708 "superblock": true, 00:23:19.708 "num_base_bdevs": 2, 00:23:19.708 "num_base_bdevs_discovered": 1, 00:23:19.708 "num_base_bdevs_operational": 2, 00:23:19.708 "base_bdevs_list": [ 00:23:19.708 { 00:23:19.708 "name": "BaseBdev1", 00:23:19.708 "uuid": "28c77925-d622-4d2a-9644-6522d80716d6", 00:23:19.708 "is_configured": true, 00:23:19.708 "data_offset": 2048, 00:23:19.708 "data_size": 63488 00:23:19.708 }, 00:23:19.708 { 00:23:19.708 "name": 
"BaseBdev2", 00:23:19.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.708 "is_configured": false, 00:23:19.708 "data_offset": 0, 00:23:19.708 "data_size": 0 00:23:19.708 } 00:23:19.708 ] 00:23:19.708 }' 00:23:19.708 12:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.708 12:25:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:20.273 12:25:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:20.531 [2024-06-07 12:25:44.021026] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:20.531 [2024-06-07 12:25:44.021395] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:23:20.531 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:20.790 [2024-06-07 12:25:44.349179] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:20.790 [2024-06-07 12:25:44.351608] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:20.790 [2024-06-07 12:25:44.352325] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.790 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:21.356 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.356 "name": "Existed_Raid", 00:23:21.356 "uuid": "5f4a0cca-b796-4049-afed-b0a87f72e448", 00:23:21.356 "strip_size_kb": 0, 00:23:21.356 "state": "configuring", 00:23:21.356 "raid_level": "raid1", 
00:23:21.356 "superblock": true, 00:23:21.356 "num_base_bdevs": 2, 00:23:21.356 "num_base_bdevs_discovered": 1, 00:23:21.356 "num_base_bdevs_operational": 2, 00:23:21.356 "base_bdevs_list": [ 00:23:21.356 { 00:23:21.356 "name": "BaseBdev1", 00:23:21.356 "uuid": "28c77925-d622-4d2a-9644-6522d80716d6", 00:23:21.356 "is_configured": true, 00:23:21.356 "data_offset": 2048, 00:23:21.356 "data_size": 63488 00:23:21.356 }, 00:23:21.356 { 00:23:21.356 "name": "BaseBdev2", 00:23:21.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.356 "is_configured": false, 00:23:21.356 "data_offset": 0, 00:23:21.356 "data_size": 0 00:23:21.356 } 00:23:21.356 ] 00:23:21.356 }' 00:23:21.356 12:25:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.356 12:25:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:21.923 12:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:22.181 [2024-06-07 12:25:45.621389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:22.181 [2024-06-07 12:25:45.621683] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:23:22.181 [2024-06-07 12:25:45.621709] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:22.181 [2024-06-07 12:25:45.621871] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:23:22.181 [2024-06-07 12:25:45.622356] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:23:22.181 [2024-06-07 12:25:45.622390] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:23:22.181 [2024-06-07 12:25:45.622597] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:22.181 BaseBdev2 00:23:22.181 12:25:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:22.181 12:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:23:22.181 12:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:22.181 12:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:23:22.181 12:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:22.181 12:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:22.181 12:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:22.438 12:25:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:22.702 [ 00:23:22.702 { 00:23:22.702 "name": "BaseBdev2", 00:23:22.702 "aliases": [ 00:23:22.702 "5a056e40-5070-48d4-b253-aae25b9b5d31" 00:23:22.702 ], 00:23:22.702 "product_name": "Malloc disk", 00:23:22.702 "block_size": 512, 00:23:22.702 "num_blocks": 65536, 00:23:22.702 "uuid": "5a056e40-5070-48d4-b253-aae25b9b5d31", 00:23:22.702 "assigned_rate_limits": { 00:23:22.702 "rw_ios_per_sec": 0, 00:23:22.702 "rw_mbytes_per_sec": 0, 00:23:22.702 
"r_mbytes_per_sec": 0, 00:23:22.702 "w_mbytes_per_sec": 0 00:23:22.702 }, 00:23:22.702 "claimed": true, 00:23:22.702 "claim_type": "exclusive_write", 00:23:22.702 "zoned": false, 00:23:22.703 "supported_io_types": { 00:23:22.703 "read": true, 00:23:22.703 "write": true, 00:23:22.703 "unmap": true, 00:23:22.703 "write_zeroes": true, 00:23:22.703 "flush": true, 00:23:22.703 "reset": true, 00:23:22.703 "compare": false, 00:23:22.703 "compare_and_write": false, 00:23:22.703 "abort": true, 00:23:22.703 "nvme_admin": false, 00:23:22.703 "nvme_io": false 00:23:22.703 }, 00:23:22.703 "memory_domains": [ 00:23:22.703 { 00:23:22.703 "dma_device_id": "system", 00:23:22.703 "dma_device_type": 1 00:23:22.703 }, 00:23:22.703 { 00:23:22.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.703 "dma_device_type": 2 00:23:22.703 } 00:23:22.703 ], 00:23:22.703 "driver_specific": {} 00:23:22.703 } 00:23:22.703 ] 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.703 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:22.961 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:22.961 "name": "Existed_Raid", 00:23:22.961 "uuid": "5f4a0cca-b796-4049-afed-b0a87f72e448", 00:23:22.961 "strip_size_kb": 0, 00:23:22.961 "state": "online", 00:23:22.961 "raid_level": "raid1", 00:23:22.961 "superblock": true, 00:23:22.961 "num_base_bdevs": 2, 00:23:22.961 "num_base_bdevs_discovered": 2, 00:23:22.961 "num_base_bdevs_operational": 2, 00:23:22.961 "base_bdevs_list": [ 00:23:22.961 { 00:23:22.961 "name": "BaseBdev1", 00:23:22.961 "uuid": "28c77925-d622-4d2a-9644-6522d80716d6", 00:23:22.961 "is_configured": true, 00:23:22.961 "data_offset": 2048, 00:23:22.961 "data_size": 63488 00:23:22.961 }, 00:23:22.961 { 00:23:22.961 "name": "BaseBdev2", 00:23:22.961 "uuid": 
"5a056e40-5070-48d4-b253-aae25b9b5d31", 00:23:22.961 "is_configured": true, 00:23:22.961 "data_offset": 2048, 00:23:22.961 "data_size": 63488 00:23:22.961 } 00:23:22.961 ] 00:23:22.961 }' 00:23:22.961 12:25:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:22.961 12:25:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:23.527 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:23.786 [2024-06-07 12:25:47.349861] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:23.786 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:23.786 "name": "Existed_Raid", 00:23:23.786 "aliases": [ 00:23:23.786 "5f4a0cca-b796-4049-afed-b0a87f72e448" 00:23:23.786 ], 00:23:23.786 "product_name": "Raid Volume", 00:23:23.786 "block_size": 512, 00:23:23.786 "num_blocks": 63488, 00:23:23.786 "uuid": "5f4a0cca-b796-4049-afed-b0a87f72e448", 00:23:23.786 "assigned_rate_limits": { 00:23:23.786 "rw_ios_per_sec": 0, 00:23:23.786 "rw_mbytes_per_sec": 0, 00:23:23.786 "r_mbytes_per_sec": 0, 00:23:23.786 "w_mbytes_per_sec": 0 00:23:23.786 }, 00:23:23.786 "claimed": false, 00:23:23.786 "zoned": false, 00:23:23.786 "supported_io_types": { 00:23:23.786 "read": true, 00:23:23.786 "write": true, 00:23:23.786 "unmap": false, 00:23:23.786 "write_zeroes": true, 00:23:23.786 "flush": false, 00:23:23.786 "reset": true, 00:23:23.786 "compare": false, 00:23:23.786 "compare_and_write": false, 00:23:23.786 "abort": false, 00:23:23.786 "nvme_admin": false, 00:23:23.786 "nvme_io": false 00:23:23.786 }, 00:23:23.786 "memory_domains": [ 00:23:23.786 { 00:23:23.786 "dma_device_id": "system", 00:23:23.786 "dma_device_type": 1 00:23:23.786 }, 00:23:23.786 { 00:23:23.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:23.786 "dma_device_type": 2 00:23:23.786 }, 00:23:23.786 { 00:23:23.786 "dma_device_id": "system", 00:23:23.786 "dma_device_type": 1 00:23:23.786 }, 00:23:23.786 { 00:23:23.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:23.786 "dma_device_type": 2 00:23:23.786 } 00:23:23.786 ], 00:23:23.786 "driver_specific": { 00:23:23.786 "raid": { 00:23:23.786 "uuid": "5f4a0cca-b796-4049-afed-b0a87f72e448", 00:23:23.786 "strip_size_kb": 0, 00:23:23.786 "state": "online", 00:23:23.786 "raid_level": "raid1", 00:23:23.786 "superblock": true, 00:23:23.786 "num_base_bdevs": 2, 00:23:23.786 "num_base_bdevs_discovered": 2, 00:23:23.786 "num_base_bdevs_operational": 2, 00:23:23.786 "base_bdevs_list": [ 00:23:23.786 { 00:23:23.786 "name": "BaseBdev1", 00:23:23.786 "uuid": 
"28c77925-d622-4d2a-9644-6522d80716d6", 00:23:23.786 "is_configured": true, 00:23:23.786 "data_offset": 2048, 00:23:23.786 "data_size": 63488 00:23:23.786 }, 00:23:23.786 { 00:23:23.786 "name": "BaseBdev2", 00:23:23.786 "uuid": "5a056e40-5070-48d4-b253-aae25b9b5d31", 00:23:23.786 "is_configured": true, 00:23:23.786 "data_offset": 2048, 00:23:23.786 "data_size": 63488 00:23:23.786 } 00:23:23.786 ] 00:23:23.786 } 00:23:23.786 } 00:23:23.786 }' 00:23:23.786 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:23.786 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:23.786 BaseBdev2' 00:23:23.786 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:23.786 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:23.786 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:24.045 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:24.045 "name": "BaseBdev1", 00:23:24.045 "aliases": [ 00:23:24.045 "28c77925-d622-4d2a-9644-6522d80716d6" 00:23:24.045 ], 00:23:24.045 "product_name": "Malloc disk", 00:23:24.045 "block_size": 512, 00:23:24.045 "num_blocks": 65536, 00:23:24.045 "uuid": "28c77925-d622-4d2a-9644-6522d80716d6", 00:23:24.045 "assigned_rate_limits": { 00:23:24.045 "rw_ios_per_sec": 0, 00:23:24.045 "rw_mbytes_per_sec": 0, 00:23:24.045 "r_mbytes_per_sec": 0, 00:23:24.045 "w_mbytes_per_sec": 0 00:23:24.045 }, 00:23:24.045 "claimed": true, 00:23:24.045 "claim_type": "exclusive_write", 00:23:24.045 "zoned": false, 00:23:24.045 "supported_io_types": { 00:23:24.045 "read": true, 00:23:24.045 "write": true, 00:23:24.045 "unmap": true, 00:23:24.045 "write_zeroes": true, 00:23:24.045 "flush": true, 00:23:24.045 "reset": true, 00:23:24.045 "compare": false, 00:23:24.045 "compare_and_write": false, 00:23:24.045 "abort": true, 00:23:24.045 "nvme_admin": false, 00:23:24.045 "nvme_io": false 00:23:24.045 }, 00:23:24.045 "memory_domains": [ 00:23:24.045 { 00:23:24.045 "dma_device_id": "system", 00:23:24.045 "dma_device_type": 1 00:23:24.045 }, 00:23:24.045 { 00:23:24.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.045 "dma_device_type": 2 00:23:24.045 } 00:23:24.045 ], 00:23:24.045 "driver_specific": {} 00:23:24.045 }' 00:23:24.045 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.303 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.303 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:24.303 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.303 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.303 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:24.303 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:24.303 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:24.560 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:23:24.560 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:24.560 12:25:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:24.560 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:24.560 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:24.560 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:24.560 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:24.818 "name": "BaseBdev2", 00:23:24.818 "aliases": [ 00:23:24.818 "5a056e40-5070-48d4-b253-aae25b9b5d31" 00:23:24.818 ], 00:23:24.818 "product_name": "Malloc disk", 00:23:24.818 "block_size": 512, 00:23:24.818 "num_blocks": 65536, 00:23:24.818 "uuid": "5a056e40-5070-48d4-b253-aae25b9b5d31", 00:23:24.818 "assigned_rate_limits": { 00:23:24.818 "rw_ios_per_sec": 0, 00:23:24.818 "rw_mbytes_per_sec": 0, 00:23:24.818 "r_mbytes_per_sec": 0, 00:23:24.818 "w_mbytes_per_sec": 0 00:23:24.818 }, 00:23:24.818 "claimed": true, 00:23:24.818 "claim_type": "exclusive_write", 00:23:24.818 "zoned": false, 00:23:24.818 "supported_io_types": { 00:23:24.818 "read": true, 00:23:24.818 "write": true, 00:23:24.818 "unmap": true, 00:23:24.818 "write_zeroes": true, 00:23:24.818 "flush": true, 00:23:24.818 "reset": true, 00:23:24.818 "compare": false, 00:23:24.818 "compare_and_write": false, 00:23:24.818 "abort": true, 00:23:24.818 "nvme_admin": false, 00:23:24.818 "nvme_io": false 00:23:24.818 }, 00:23:24.818 "memory_domains": [ 00:23:24.818 { 00:23:24.818 "dma_device_id": "system", 00:23:24.818 "dma_device_type": 1 00:23:24.818 }, 00:23:24.818 { 00:23:24.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.818 "dma_device_type": 2 00:23:24.818 } 00:23:24.818 ], 00:23:24.818 "driver_specific": {} 00:23:24.818 }' 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:24.818 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.076 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.076 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:25.076 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:25.076 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:25.076 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:25.076 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:25.334 [2024-06-07 12:25:48.867670] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.334 12:25:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:25.899 12:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:25.899 "name": "Existed_Raid", 00:23:25.899 "uuid": "5f4a0cca-b796-4049-afed-b0a87f72e448", 00:23:25.899 "strip_size_kb": 0, 00:23:25.899 "state": "online", 00:23:25.899 "raid_level": "raid1", 00:23:25.899 "superblock": true, 00:23:25.899 "num_base_bdevs": 2, 00:23:25.899 "num_base_bdevs_discovered": 1, 00:23:25.899 "num_base_bdevs_operational": 1, 00:23:25.899 "base_bdevs_list": [ 00:23:25.899 { 00:23:25.899 "name": null, 00:23:25.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.900 "is_configured": false, 00:23:25.900 "data_offset": 2048, 00:23:25.900 "data_size": 63488 00:23:25.900 }, 00:23:25.900 { 00:23:25.900 "name": "BaseBdev2", 00:23:25.900 "uuid": "5a056e40-5070-48d4-b253-aae25b9b5d31", 00:23:25.900 "is_configured": true, 00:23:25.900 "data_offset": 2048, 00:23:25.900 "data_size": 63488 00:23:25.900 } 00:23:25.900 ] 00:23:25.900 }' 00:23:25.900 12:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:25.900 12:25:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:26.465 12:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:26.465 12:25:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:26.465 12:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.465 12:25:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:26.723 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:26.723 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:26.723 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:26.981 [2024-06-07 12:25:50.535008] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:26.981 [2024-06-07 12:25:50.535371] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:26.981 [2024-06-07 12:25:50.559200] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:26.981 [2024-06-07 12:25:50.559571] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:26.981 [2024-06-07 12:25:50.559678] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:23:26.981 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:26.981 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:26.981 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:26.981 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 201350 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 201350 ']' 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 201350 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 201350 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 201350' 00:23:27.239 killing process with pid 201350 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 201350 00:23:27.239 [2024-06-07 12:25:50.866271] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:27.239 12:25:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 201350 00:23:27.239 [2024-06-07 12:25:50.866526] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:27.806 12:25:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:23:27.806 00:23:27.806 real 0m12.602s 00:23:27.806 user 0m22.395s 00:23:27.806 sys 0m1.992s 00:23:27.806 12:25:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:27.806 12:25:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:27.806 ************************************ 00:23:27.806 END TEST raid_state_function_test_sb 00:23:27.806 ************************************ 00:23:27.806 12:25:51 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:23:27.806 12:25:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:23:27.806 12:25:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:27.806 12:25:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:27.806 ************************************ 00:23:27.806 START TEST raid_superblock_test 00:23:27.806 ************************************ 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=201735 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 201735 /var/tmp/spdk-raid.sock 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 
201735 ']' 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:27.806 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:27.806 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:27.806 [2024-06-07 12:25:51.343582] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:23:27.806 [2024-06-07 12:25:51.344659] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201735 ] 00:23:28.064 [2024-06-07 12:25:51.490246] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:28.064 [2024-06-07 12:25:51.592415] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:28.064 [2024-06-07 12:25:51.676233] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:28.322 12:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:23:28.580 malloc1 00:23:28.580 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:28.840 [2024-06-07 12:25:52.295269] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:28.840 [2024-06-07 12:25:52.295693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:28.840 [2024-06-07 12:25:52.295859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:23:28.840 [2024-06-07 12:25:52.296018] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:28.840 [2024-06-07 
12:25:52.298641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:28.840 [2024-06-07 12:25:52.298861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:28.840 pt1 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:28.840 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:23:29.135 malloc2 00:23:29.135 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:29.135 [2024-06-07 12:25:52.779087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:29.135 [2024-06-07 12:25:52.779447] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.135 [2024-06-07 12:25:52.779632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:23:29.135 [2024-06-07 12:25:52.779792] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.394 [2024-06-07 12:25:52.782524] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.394 [2024-06-07 12:25:52.782737] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:29.394 pt2 00:23:29.394 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:29.394 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:29.394 12:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:29.394 [2024-06-07 12:25:53.035201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:29.394 [2024-06-07 12:25:53.037633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:29.394 [2024-06-07 12:25:53.037997] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006c80 00:23:29.394 [2024-06-07 12:25:53.038114] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:29.394 [2024-06-07 12:25:53.038312] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:23:29.394 [2024-06-07 12:25:53.038809] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006c80 00:23:29.394 [2024-06-07 12:25:53.038911] bdev_raid.c:1725:raid_bdev_configure_cont: 
*DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000006c80 00:23:29.394 [2024-06-07 12:25:53.039257] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.655 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.913 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:29.913 "name": "raid_bdev1", 00:23:29.913 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:29.913 "strip_size_kb": 0, 00:23:29.913 "state": "online", 00:23:29.913 "raid_level": "raid1", 00:23:29.913 "superblock": true, 00:23:29.913 "num_base_bdevs": 2, 00:23:29.913 "num_base_bdevs_discovered": 2, 00:23:29.913 "num_base_bdevs_operational": 2, 00:23:29.913 "base_bdevs_list": [ 00:23:29.913 { 00:23:29.913 "name": "pt1", 00:23:29.913 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:29.913 "is_configured": true, 00:23:29.913 "data_offset": 2048, 00:23:29.913 "data_size": 63488 00:23:29.913 }, 00:23:29.913 { 00:23:29.913 "name": "pt2", 00:23:29.913 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:29.913 "is_configured": true, 00:23:29.913 "data_offset": 2048, 00:23:29.913 "data_size": 63488 00:23:29.913 } 00:23:29.913 ] 00:23:29.913 }' 00:23:29.913 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:29.913 12:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
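
The setup traced so far reduces to a short RPC sequence (commands verbatim from the xtrace; the rpc wrapper function is a convenience added here). The -s flag on bdev_raid_create writes an on-disk superblock to each base bdev, which everything later in this test depends on:

    rpc() { /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    rpc bdev_malloc_create 32 512 -b malloc1
    rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    rpc bdev_malloc_create 32 512 -b malloc2
    rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
    # expect an online raid1 volume with 2 of 2 base bdevs discovered
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
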
bdev_get_bdevs -b raid_bdev1 00:23:30.480 12:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:30.738 [2024-06-07 12:25:54.239603] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:30.738 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:30.738 "name": "raid_bdev1", 00:23:30.738 "aliases": [ 00:23:30.738 "3a363170-c05f-4ad9-b961-fde6f270e79a" 00:23:30.738 ], 00:23:30.738 "product_name": "Raid Volume", 00:23:30.738 "block_size": 512, 00:23:30.738 "num_blocks": 63488, 00:23:30.738 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:30.738 "assigned_rate_limits": { 00:23:30.738 "rw_ios_per_sec": 0, 00:23:30.738 "rw_mbytes_per_sec": 0, 00:23:30.738 "r_mbytes_per_sec": 0, 00:23:30.738 "w_mbytes_per_sec": 0 00:23:30.738 }, 00:23:30.738 "claimed": false, 00:23:30.738 "zoned": false, 00:23:30.738 "supported_io_types": { 00:23:30.738 "read": true, 00:23:30.738 "write": true, 00:23:30.738 "unmap": false, 00:23:30.738 "write_zeroes": true, 00:23:30.738 "flush": false, 00:23:30.738 "reset": true, 00:23:30.738 "compare": false, 00:23:30.738 "compare_and_write": false, 00:23:30.738 "abort": false, 00:23:30.738 "nvme_admin": false, 00:23:30.738 "nvme_io": false 00:23:30.738 }, 00:23:30.738 "memory_domains": [ 00:23:30.738 { 00:23:30.738 "dma_device_id": "system", 00:23:30.738 "dma_device_type": 1 00:23:30.738 }, 00:23:30.738 { 00:23:30.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:30.738 "dma_device_type": 2 00:23:30.738 }, 00:23:30.738 { 00:23:30.738 "dma_device_id": "system", 00:23:30.738 "dma_device_type": 1 00:23:30.738 }, 00:23:30.738 { 00:23:30.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:30.738 "dma_device_type": 2 00:23:30.738 } 00:23:30.738 ], 00:23:30.738 "driver_specific": { 00:23:30.738 "raid": { 00:23:30.738 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:30.738 "strip_size_kb": 0, 00:23:30.738 "state": "online", 00:23:30.738 "raid_level": "raid1", 00:23:30.738 "superblock": true, 00:23:30.738 "num_base_bdevs": 2, 00:23:30.738 "num_base_bdevs_discovered": 2, 00:23:30.738 "num_base_bdevs_operational": 2, 00:23:30.738 "base_bdevs_list": [ 00:23:30.738 { 00:23:30.738 "name": "pt1", 00:23:30.738 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:30.738 "is_configured": true, 00:23:30.738 "data_offset": 2048, 00:23:30.738 "data_size": 63488 00:23:30.738 }, 00:23:30.738 { 00:23:30.738 "name": "pt2", 00:23:30.738 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:30.738 "is_configured": true, 00:23:30.738 "data_offset": 2048, 00:23:30.738 "data_size": 63488 00:23:30.738 } 00:23:30.738 ] 00:23:30.738 } 00:23:30.738 } 00:23:30.738 }' 00:23:30.738 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:30.738 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:30.738 pt2' 00:23:30.738 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:30.739 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:30.739 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:30.997 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:30.997 "name": "pt1", 00:23:30.997 "aliases": [ 00:23:30.997 
"00000000-0000-0000-0000-000000000001" 00:23:30.997 ], 00:23:30.997 "product_name": "passthru", 00:23:30.997 "block_size": 512, 00:23:30.997 "num_blocks": 65536, 00:23:30.997 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:30.997 "assigned_rate_limits": { 00:23:30.997 "rw_ios_per_sec": 0, 00:23:30.997 "rw_mbytes_per_sec": 0, 00:23:30.997 "r_mbytes_per_sec": 0, 00:23:30.997 "w_mbytes_per_sec": 0 00:23:30.997 }, 00:23:30.997 "claimed": true, 00:23:30.997 "claim_type": "exclusive_write", 00:23:30.997 "zoned": false, 00:23:30.997 "supported_io_types": { 00:23:30.997 "read": true, 00:23:30.997 "write": true, 00:23:30.997 "unmap": true, 00:23:30.997 "write_zeroes": true, 00:23:30.997 "flush": true, 00:23:30.997 "reset": true, 00:23:30.997 "compare": false, 00:23:30.997 "compare_and_write": false, 00:23:30.997 "abort": true, 00:23:30.997 "nvme_admin": false, 00:23:30.997 "nvme_io": false 00:23:30.997 }, 00:23:30.997 "memory_domains": [ 00:23:30.997 { 00:23:30.997 "dma_device_id": "system", 00:23:30.997 "dma_device_type": 1 00:23:30.997 }, 00:23:30.997 { 00:23:30.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:30.997 "dma_device_type": 2 00:23:30.997 } 00:23:30.997 ], 00:23:30.997 "driver_specific": { 00:23:30.997 "passthru": { 00:23:30.997 "name": "pt1", 00:23:30.997 "base_bdev_name": "malloc1" 00:23:30.997 } 00:23:30.997 } 00:23:30.997 }' 00:23:30.997 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:31.255 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.575 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.576 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:31.576 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:31.576 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:31.576 12:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:31.833 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:31.833 "name": "pt2", 00:23:31.833 "aliases": [ 00:23:31.833 "00000000-0000-0000-0000-000000000002" 00:23:31.833 ], 00:23:31.833 "product_name": "passthru", 00:23:31.833 "block_size": 512, 00:23:31.833 "num_blocks": 65536, 00:23:31.833 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:31.833 "assigned_rate_limits": { 00:23:31.833 "rw_ios_per_sec": 0, 00:23:31.833 "rw_mbytes_per_sec": 0, 00:23:31.833 "r_mbytes_per_sec": 0, 00:23:31.833 "w_mbytes_per_sec": 0 00:23:31.833 }, 00:23:31.833 "claimed": true, 
00:23:31.833 "claim_type": "exclusive_write", 00:23:31.833 "zoned": false, 00:23:31.833 "supported_io_types": { 00:23:31.833 "read": true, 00:23:31.833 "write": true, 00:23:31.833 "unmap": true, 00:23:31.833 "write_zeroes": true, 00:23:31.833 "flush": true, 00:23:31.833 "reset": true, 00:23:31.833 "compare": false, 00:23:31.833 "compare_and_write": false, 00:23:31.833 "abort": true, 00:23:31.833 "nvme_admin": false, 00:23:31.833 "nvme_io": false 00:23:31.833 }, 00:23:31.833 "memory_domains": [ 00:23:31.833 { 00:23:31.833 "dma_device_id": "system", 00:23:31.833 "dma_device_type": 1 00:23:31.833 }, 00:23:31.833 { 00:23:31.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.833 "dma_device_type": 2 00:23:31.833 } 00:23:31.833 ], 00:23:31.833 "driver_specific": { 00:23:31.833 "passthru": { 00:23:31.833 "name": "pt2", 00:23:31.833 "base_bdev_name": "malloc2" 00:23:31.833 } 00:23:31.833 } 00:23:31.833 }' 00:23:31.833 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.833 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.833 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:31.833 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.833 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.834 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:31.834 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.091 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.091 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:32.091 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.091 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.091 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:32.091 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:32.091 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:32.349 [2024-06-07 12:25:55.879732] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:32.349 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=3a363170-c05f-4ad9-b961-fde6f270e79a 00:23:32.349 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 3a363170-c05f-4ad9-b961-fde6f270e79a ']' 00:23:32.349 12:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:32.606 [2024-06-07 12:25:56.183673] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:32.606 [2024-06-07 12:25:56.183974] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:32.606 [2024-06-07 12:25:56.184243] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:32.606 [2024-06-07 12:25:56.184416] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:32.606 [2024-06-07 12:25:56.184525] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006c80 name raid_bdev1, state offline 00:23:32.606 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.606 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:32.865 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:32.865 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:32.865 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:32.865 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:33.122 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:33.122 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:33.380 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:33.380 12:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:23:33.946 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:34.204 [2024-06-07 12:25:57.591903] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:34.204 
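
At this point the first life cycle of the array is complete: the test has captured the volume UUID, deleted raid_bdev1, and removed both passthru bdevs, while malloc1 and malloc2 still hold the superblock that was written for the passthru-based array. A condensed equivalent of that teardown (commands taken from the xtrace above; the rpc wrapper is a convenience added here):

    rpc() { /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    uuid=$(rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')   # capture before deleting
    rpc bdev_raid_delete raid_bdev1
    rpc bdev_passthru_delete pt1
    rpc bdev_passthru_delete pt2
    # malloc1/malloc2 keep the on-disk superblock; the failure check below depends on it
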
[2024-06-07 12:25:57.594508] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:34.204 [2024-06-07 12:25:57.594772] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:34.204 [2024-06-07 12:25:57.595041] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:34.204 [2024-06-07 12:25:57.595210] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:34.204 [2024-06-07 12:25:57.595350] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state configuring 00:23:34.204 request: 00:23:34.204 { 00:23:34.204 "name": "raid_bdev1", 00:23:34.204 "raid_level": "raid1", 00:23:34.204 "base_bdevs": [ 00:23:34.204 "malloc1", 00:23:34.204 "malloc2" 00:23:34.204 ], 00:23:34.204 "superblock": false, 00:23:34.204 "method": "bdev_raid_create", 00:23:34.204 "req_id": 1 00:23:34.204 } 00:23:34.204 Got JSON-RPC error response 00:23:34.204 response: 00:23:34.204 { 00:23:34.204 "code": -17, 00:23:34.204 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:34.204 } 00:23:34.204 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:23:34.204 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:34.204 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:34.204 12:25:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:34.204 12:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:34.204 12:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.462 12:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:34.462 12:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:34.462 12:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:34.720 [2024-06-07 12:25:58.163966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:34.720 [2024-06-07 12:25:58.164400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.720 [2024-06-07 12:25:58.164566] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:23:34.720 [2024-06-07 12:25:58.164709] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.720 [2024-06-07 12:25:58.167266] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.720 [2024-06-07 12:25:58.167498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:34.720 [2024-06-07 12:25:58.167737] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:34.720 [2024-06-07 12:25:58.167941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:34.720 pt1 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local 
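
The NOT-wrapped bdev_raid_create above is the negative case: malloc1 and malloc2 still carry the superblock of the passthru-based array, so creating a different array directly on top of them is rejected with the JSON-RPC -17 "File exists" error shown in the trace. A sketch of the same check (wrapper function added for brevity):

    rpc() { /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    if rpc bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1; then
        echo 'expected failure: foreign superblock present on base bdevs' >&2
        exit 1
    fi
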
raid_bdev_name=raid_bdev1 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.720 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.979 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.979 "name": "raid_bdev1", 00:23:34.979 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:34.979 "strip_size_kb": 0, 00:23:34.979 "state": "configuring", 00:23:34.979 "raid_level": "raid1", 00:23:34.979 "superblock": true, 00:23:34.979 "num_base_bdevs": 2, 00:23:34.979 "num_base_bdevs_discovered": 1, 00:23:34.979 "num_base_bdevs_operational": 2, 00:23:34.979 "base_bdevs_list": [ 00:23:34.979 { 00:23:34.979 "name": "pt1", 00:23:34.979 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:34.979 "is_configured": true, 00:23:34.979 "data_offset": 2048, 00:23:34.979 "data_size": 63488 00:23:34.979 }, 00:23:34.979 { 00:23:34.979 "name": null, 00:23:34.979 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:34.979 "is_configured": false, 00:23:34.979 "data_offset": 2048, 00:23:34.979 "data_size": 63488 00:23:34.979 } 00:23:34.979 ] 00:23:34.979 }' 00:23:34.979 12:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.979 12:25:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:35.912 [2024-06-07 12:25:59.468585] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:35.912 [2024-06-07 12:25:59.469119] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:35.912 [2024-06-07 12:25:59.469356] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008180 00:23:35.912 [2024-06-07 12:25:59.469533] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:35.912 [2024-06-07 12:25:59.470145] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:35.912 [2024-06-07 12:25:59.470366] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:35.912 [2024-06-07 12:25:59.470627] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:35.912 [2024-06-07 12:25:59.470793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:35.912 [2024-06-07 12:25:59.471113] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:23:35.912 [2024-06-07 12:25:59.471249] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:35.912 [2024-06-07 12:25:59.471368] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002530 00:23:35.912 [2024-06-07 12:25:59.471799] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:23:35.912 [2024-06-07 12:25:59.471923] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:23:35.912 [2024-06-07 12:25:59.472092] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:35.912 pt2 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.912 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.170 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.170 "name": "raid_bdev1", 00:23:36.170 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:36.170 "strip_size_kb": 0, 00:23:36.170 "state": "online", 00:23:36.170 "raid_level": "raid1", 00:23:36.170 "superblock": true, 00:23:36.170 "num_base_bdevs": 2, 00:23:36.170 "num_base_bdevs_discovered": 2, 00:23:36.170 "num_base_bdevs_operational": 2, 00:23:36.170 "base_bdevs_list": [ 00:23:36.170 { 00:23:36.170 "name": "pt1", 00:23:36.170 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:36.170 "is_configured": true, 00:23:36.170 "data_offset": 2048, 00:23:36.170 "data_size": 63488 00:23:36.170 }, 00:23:36.170 { 00:23:36.170 "name": "pt2", 00:23:36.170 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:36.170 
"is_configured": true, 00:23:36.170 "data_offset": 2048, 00:23:36.170 "data_size": 63488 00:23:36.170 } 00:23:36.170 ] 00:23:36.170 }' 00:23:36.170 12:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.170 12:25:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:36.736 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:36.736 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:36.736 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:36.736 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:36.736 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:36.736 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:36.736 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:36.994 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:37.252 [2024-06-07 12:26:00.656823] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:37.252 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:37.252 "name": "raid_bdev1", 00:23:37.252 "aliases": [ 00:23:37.252 "3a363170-c05f-4ad9-b961-fde6f270e79a" 00:23:37.252 ], 00:23:37.252 "product_name": "Raid Volume", 00:23:37.252 "block_size": 512, 00:23:37.252 "num_blocks": 63488, 00:23:37.252 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:37.252 "assigned_rate_limits": { 00:23:37.252 "rw_ios_per_sec": 0, 00:23:37.252 "rw_mbytes_per_sec": 0, 00:23:37.252 "r_mbytes_per_sec": 0, 00:23:37.252 "w_mbytes_per_sec": 0 00:23:37.252 }, 00:23:37.252 "claimed": false, 00:23:37.252 "zoned": false, 00:23:37.252 "supported_io_types": { 00:23:37.252 "read": true, 00:23:37.252 "write": true, 00:23:37.252 "unmap": false, 00:23:37.252 "write_zeroes": true, 00:23:37.252 "flush": false, 00:23:37.252 "reset": true, 00:23:37.252 "compare": false, 00:23:37.252 "compare_and_write": false, 00:23:37.252 "abort": false, 00:23:37.252 "nvme_admin": false, 00:23:37.252 "nvme_io": false 00:23:37.252 }, 00:23:37.252 "memory_domains": [ 00:23:37.252 { 00:23:37.252 "dma_device_id": "system", 00:23:37.252 "dma_device_type": 1 00:23:37.252 }, 00:23:37.252 { 00:23:37.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:37.252 "dma_device_type": 2 00:23:37.252 }, 00:23:37.252 { 00:23:37.252 "dma_device_id": "system", 00:23:37.252 "dma_device_type": 1 00:23:37.252 }, 00:23:37.252 { 00:23:37.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:37.252 "dma_device_type": 2 00:23:37.252 } 00:23:37.252 ], 00:23:37.252 "driver_specific": { 00:23:37.252 "raid": { 00:23:37.252 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:37.252 "strip_size_kb": 0, 00:23:37.252 "state": "online", 00:23:37.252 "raid_level": "raid1", 00:23:37.252 "superblock": true, 00:23:37.252 "num_base_bdevs": 2, 00:23:37.252 "num_base_bdevs_discovered": 2, 00:23:37.252 "num_base_bdevs_operational": 2, 00:23:37.252 "base_bdevs_list": [ 00:23:37.252 { 00:23:37.252 "name": "pt1", 00:23:37.252 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:37.252 "is_configured": true, 00:23:37.252 "data_offset": 2048, 00:23:37.252 "data_size": 63488 00:23:37.252 }, 
00:23:37.252 { 00:23:37.252 "name": "pt2", 00:23:37.252 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:37.252 "is_configured": true, 00:23:37.252 "data_offset": 2048, 00:23:37.252 "data_size": 63488 00:23:37.252 } 00:23:37.252 ] 00:23:37.252 } 00:23:37.252 } 00:23:37.252 }' 00:23:37.252 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:37.252 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:37.252 pt2' 00:23:37.252 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:37.252 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:37.252 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:37.510 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:37.510 "name": "pt1", 00:23:37.510 "aliases": [ 00:23:37.510 "00000000-0000-0000-0000-000000000001" 00:23:37.510 ], 00:23:37.510 "product_name": "passthru", 00:23:37.510 "block_size": 512, 00:23:37.510 "num_blocks": 65536, 00:23:37.510 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:37.510 "assigned_rate_limits": { 00:23:37.510 "rw_ios_per_sec": 0, 00:23:37.510 "rw_mbytes_per_sec": 0, 00:23:37.510 "r_mbytes_per_sec": 0, 00:23:37.510 "w_mbytes_per_sec": 0 00:23:37.510 }, 00:23:37.510 "claimed": true, 00:23:37.510 "claim_type": "exclusive_write", 00:23:37.510 "zoned": false, 00:23:37.510 "supported_io_types": { 00:23:37.510 "read": true, 00:23:37.510 "write": true, 00:23:37.510 "unmap": true, 00:23:37.510 "write_zeroes": true, 00:23:37.510 "flush": true, 00:23:37.510 "reset": true, 00:23:37.510 "compare": false, 00:23:37.510 "compare_and_write": false, 00:23:37.510 "abort": true, 00:23:37.510 "nvme_admin": false, 00:23:37.510 "nvme_io": false 00:23:37.510 }, 00:23:37.510 "memory_domains": [ 00:23:37.510 { 00:23:37.510 "dma_device_id": "system", 00:23:37.510 "dma_device_type": 1 00:23:37.510 }, 00:23:37.510 { 00:23:37.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:37.510 "dma_device_type": 2 00:23:37.510 } 00:23:37.510 ], 00:23:37.510 "driver_specific": { 00:23:37.510 "passthru": { 00:23:37.510 "name": "pt1", 00:23:37.510 "base_bdev_name": "malloc1" 00:23:37.510 } 00:23:37.510 } 00:23:37.510 }' 00:23:37.510 12:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:37.510 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:37.510 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:37.510 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:37.510 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:37.510 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:37.510 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- 
# jq .dif_type 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:37.768 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:38.025 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:38.025 "name": "pt2", 00:23:38.025 "aliases": [ 00:23:38.025 "00000000-0000-0000-0000-000000000002" 00:23:38.025 ], 00:23:38.025 "product_name": "passthru", 00:23:38.025 "block_size": 512, 00:23:38.025 "num_blocks": 65536, 00:23:38.025 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:38.025 "assigned_rate_limits": { 00:23:38.026 "rw_ios_per_sec": 0, 00:23:38.026 "rw_mbytes_per_sec": 0, 00:23:38.026 "r_mbytes_per_sec": 0, 00:23:38.026 "w_mbytes_per_sec": 0 00:23:38.026 }, 00:23:38.026 "claimed": true, 00:23:38.026 "claim_type": "exclusive_write", 00:23:38.026 "zoned": false, 00:23:38.026 "supported_io_types": { 00:23:38.026 "read": true, 00:23:38.026 "write": true, 00:23:38.026 "unmap": true, 00:23:38.026 "write_zeroes": true, 00:23:38.026 "flush": true, 00:23:38.026 "reset": true, 00:23:38.026 "compare": false, 00:23:38.026 "compare_and_write": false, 00:23:38.026 "abort": true, 00:23:38.026 "nvme_admin": false, 00:23:38.026 "nvme_io": false 00:23:38.026 }, 00:23:38.026 "memory_domains": [ 00:23:38.026 { 00:23:38.026 "dma_device_id": "system", 00:23:38.026 "dma_device_type": 1 00:23:38.026 }, 00:23:38.026 { 00:23:38.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:38.026 "dma_device_type": 2 00:23:38.026 } 00:23:38.026 ], 00:23:38.026 "driver_specific": { 00:23:38.026 "passthru": { 00:23:38.026 "name": "pt2", 00:23:38.026 "base_bdev_name": "malloc2" 00:23:38.026 } 00:23:38.026 } 00:23:38.026 }' 00:23:38.026 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:38.026 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:38.026 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:38.026 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:38.283 12:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:38.549 [2024-06-07 12:26:02.117043] 
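
verify_raid_bdev_properties drives the repeated jq probes above: for each base bdev it appears to compare layout fields of the raid volume against the passthru bdev underneath (block_size 512 on both sides; md_size, md_interleave and dif_type all null, hence the repeated [[ null == null ]] checks). A sketch of that comparison pattern, under that reading of the trace:

    rpc() { /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    raid_info=$(rpc bdev_get_bdevs -b raid_bdev1 | jq '.[]')
    base_info=$(rpc bdev_get_bdevs -b pt1 | jq '.[]')
    for field in .block_size .md_size .md_interleave .dif_type; do
        [[ $(jq "$field" <<< "$raid_info") == $(jq "$field" <<< "$base_info") ]] || exit 1
    done
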
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:38.549 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 3a363170-c05f-4ad9-b961-fde6f270e79a '!=' 3a363170-c05f-4ad9-b961-fde6f270e79a ']' 00:23:38.549 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:38.549 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:38.549 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:38.549 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:38.831 [2024-06-07 12:26:02.444931] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.831 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.089 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.089 "name": "raid_bdev1", 00:23:39.089 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:39.089 "strip_size_kb": 0, 00:23:39.089 "state": "online", 00:23:39.089 "raid_level": "raid1", 00:23:39.089 "superblock": true, 00:23:39.089 "num_base_bdevs": 2, 00:23:39.089 "num_base_bdevs_discovered": 1, 00:23:39.089 "num_base_bdevs_operational": 1, 00:23:39.089 "base_bdevs_list": [ 00:23:39.089 { 00:23:39.089 "name": null, 00:23:39.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:39.089 "is_configured": false, 00:23:39.089 "data_offset": 2048, 00:23:39.089 "data_size": 63488 00:23:39.089 }, 00:23:39.089 { 00:23:39.089 "name": "pt2", 00:23:39.089 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:39.089 "is_configured": true, 00:23:39.089 "data_offset": 2048, 00:23:39.089 "data_size": 63488 00:23:39.089 } 00:23:39.089 ] 00:23:39.089 }' 00:23:39.089 12:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.089 12:26:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:40.020 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
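
Because raid1 is a redundant level (the has_redundancy check above returns 0), deleting one base bdev only degrades the array: after bdev_passthru_delete pt1, raid_bdev1 stays online with a single discovered base bdev and a null entry in pt1's slot. A condensed check:

    rpc() { /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    rpc bdev_passthru_delete pt1
    rpc bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)"'
    # expected output: online 1
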
bdev_raid_delete raid_bdev1 00:23:40.020 [2024-06-07 12:26:03.592993] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:40.020 [2024-06-07 12:26:03.593352] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:40.020 [2024-06-07 12:26:03.593553] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:40.020 [2024-06-07 12:26:03.593693] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:40.020 [2024-06-07 12:26:03.593788] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:23:40.020 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.020 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:40.277 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:40.277 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:40.277 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:40.277 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:40.277 12:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:40.844 [2024-06-07 12:26:04.429204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:40.844 [2024-06-07 12:26:04.429672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:40.844 [2024-06-07 12:26:04.429849] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:23:40.844 [2024-06-07 12:26:04.429990] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:40.844 [2024-06-07 12:26:04.432426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:40.844 [2024-06-07 12:26:04.432668] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:40.844 [2024-06-07 12:26:04.432893] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:40.844 [2024-06-07 12:26:04.433021] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:40.844 [2024-06-07 12:26:04.433221] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008a80 00:23:40.844 [2024-06-07 12:26:04.433340] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:40.844 [2024-06-07 12:26:04.433464] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000026d0 00:23:40.844 [2024-06-07 12:26:04.433779] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008a80 00:23:40.844 [2024-06-07 12:26:04.433890] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008a80 00:23:40.844 [2024-06-07 12:26:04.434116] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:40.844 pt2 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.844 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:41.410 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:41.410 "name": "raid_bdev1", 00:23:41.410 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:41.411 "strip_size_kb": 0, 00:23:41.411 "state": "online", 00:23:41.411 "raid_level": "raid1", 00:23:41.411 "superblock": true, 00:23:41.411 "num_base_bdevs": 2, 00:23:41.411 "num_base_bdevs_discovered": 1, 00:23:41.411 "num_base_bdevs_operational": 1, 00:23:41.411 "base_bdevs_list": [ 00:23:41.411 { 00:23:41.411 "name": null, 00:23:41.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:41.411 "is_configured": false, 00:23:41.411 "data_offset": 2048, 00:23:41.411 "data_size": 63488 00:23:41.411 }, 00:23:41.411 { 00:23:41.411 "name": "pt2", 00:23:41.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:41.411 "is_configured": true, 00:23:41.411 "data_offset": 2048, 00:23:41.411 "data_size": 63488 00:23:41.411 } 00:23:41.411 ] 00:23:41.411 }' 00:23:41.411 12:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:41.411 12:26:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:42.010 12:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:42.267 [2024-06-07 12:26:05.666640] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:42.267 [2024-06-07 12:26:05.666976] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:42.267 [2024-06-07 12:26:05.667145] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:42.267 [2024-06-07 12:26:05.667282] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:42.267 [2024-06-07 12:26:05.667384] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state offline 00:23:42.267 12:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.267 12:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:42.525 12:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:42.525 12:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:42.525 12:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:23:42.525 12:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:42.784 [2024-06-07 12:26:06.206684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:42.784 [2024-06-07 12:26:06.207599] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.784 [2024-06-07 12:26:06.207869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:23:42.784 [2024-06-07 12:26:06.208078] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.784 [2024-06-07 12:26:06.210617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.784 [2024-06-07 12:26:06.210869] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:42.784 [2024-06-07 12:26:06.211171] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:42.784 [2024-06-07 12:26:06.211320] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:42.784 [2024-06-07 12:26:06.211634] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:42.784 pt1 00:23:42.784 [2024-06-07 12:26:06.213391] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:42.784 [2024-06-07 12:26:06.213514] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009380 name raid_bdev1, state configuring 00:23:42.784 [2024-06-07 12:26:06.213668] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:42.784 [2024-06-07 12:26:06.213839] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:23:42.784 [2024-06-07 12:26:06.213941] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:42.784 [2024-06-07 12:26:06.214073] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002870 00:23:42.784 [2024-06-07 12:26:06.214443] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:23:42.784 [2024-06-07 12:26:06.214584] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:23:42.784 [2024-06-07 12:26:06.214789] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.784 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.043 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.043 "name": "raid_bdev1", 00:23:43.043 "uuid": "3a363170-c05f-4ad9-b961-fde6f270e79a", 00:23:43.043 "strip_size_kb": 0, 00:23:43.043 "state": "online", 00:23:43.043 "raid_level": "raid1", 00:23:43.043 "superblock": true, 00:23:43.043 "num_base_bdevs": 2, 00:23:43.043 "num_base_bdevs_discovered": 1, 00:23:43.043 "num_base_bdevs_operational": 1, 00:23:43.043 "base_bdevs_list": [ 00:23:43.043 { 00:23:43.043 "name": null, 00:23:43.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.043 "is_configured": false, 00:23:43.043 "data_offset": 2048, 00:23:43.043 "data_size": 63488 00:23:43.043 }, 00:23:43.043 { 00:23:43.043 "name": "pt2", 00:23:43.043 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:43.043 "is_configured": true, 00:23:43.043 "data_offset": 2048, 00:23:43.043 "data_size": 63488 00:23:43.043 } 00:23:43.043 ] 00:23:43.043 }' 00:23:43.043 12:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.043 12:26:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:43.607 12:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:43.607 12:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:43.864 12:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:43.864 12:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:43.864 12:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:44.122 [2024-06-07 12:26:07.621841] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 
3a363170-c05f-4ad9-b961-fde6f270e79a '!=' 3a363170-c05f-4ad9-b961-fde6f270e79a ']' 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 201735 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 201735 ']' 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 201735 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 201735 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 201735' 00:23:44.122 killing process with pid 201735 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 201735 00:23:44.122 12:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 201735 00:23:44.122 [2024-06-07 12:26:07.683685] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:44.122 [2024-06-07 12:26:07.683800] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:44.122 [2024-06-07 12:26:07.684006] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:44.122 [2024-06-07 12:26:07.684119] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:23:44.122 [2024-06-07 12:26:07.729845] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:44.688 12:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:23:44.688 00:23:44.688 real 0m16.803s 00:23:44.688 user 0m30.696s 00:23:44.688 sys 0m2.788s 00:23:44.688 12:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:44.688 12:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:44.688 ************************************ 00:23:44.688 END TEST raid_superblock_test 00:23:44.688 ************************************ 00:23:44.688 12:26:08 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:23:44.688 12:26:08 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:44.688 12:26:08 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:44.688 12:26:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:44.688 ************************************ 00:23:44.688 START TEST raid_read_error_test 00:23:44.688 ************************************ 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 read 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
(( i = 1 )) 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.22ylc5rBKy 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=202275 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 202275 /var/tmp/spdk-raid.sock 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 202275 ']' 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:44.688 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:44.688 12:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:44.688 [2024-06-07 12:26:08.221923] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:23:44.688 [2024-06-07 12:26:08.222623] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202275 ] 00:23:44.946 [2024-06-07 12:26:08.367215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:44.946 [2024-06-07 12:26:08.469636] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:44.946 [2024-06-07 12:26:08.555563] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:45.879 12:26:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:45.879 12:26:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:23:45.879 12:26:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:45.879 12:26:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:45.879 BaseBdev1_malloc 00:23:46.137 12:26:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:46.395 true 00:23:46.395 12:26:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:46.653 [2024-06-07 12:26:10.106416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:46.653 [2024-06-07 12:26:10.106802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.653 [2024-06-07 12:26:10.107045] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:23:46.653 [2024-06-07 12:26:10.107217] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.653 [2024-06-07 12:26:10.109977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.653 [2024-06-07 12:26:10.110217] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:46.653 BaseBdev1 00:23:46.653 12:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:46.653 12:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:46.911 BaseBdev2_malloc 00:23:46.911 12:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:47.196 true 00:23:47.196 12:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:47.456 [2024-06-07 12:26:10.966787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:47.456 [2024-06-07 12:26:10.967248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:47.456 [2024-06-07 12:26:10.967351] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:23:47.456 [2024-06-07 12:26:10.967646] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:47.456 [2024-06-07 12:26:10.970327] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:47.456 [2024-06-07 12:26:10.970561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:47.456 BaseBdev2 00:23:47.456 12:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:23:47.714 [2024-06-07 12:26:11.275050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:47.714 [2024-06-07 12:26:11.277604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:47.714 [2024-06-07 12:26:11.278023] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007280 00:23:47.714 [2024-06-07 12:26:11.278144] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:47.714 [2024-06-07 12:26:11.278358] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:23:47.714 [2024-06-07 12:26:11.278833] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007280 00:23:47.714 [2024-06-07 12:26:11.278950] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007280 00:23:47.714 [2024-06-07 12:26:11.279304] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.714 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.715 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.715 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.715 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.973 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:47.973 "name": "raid_bdev1", 00:23:47.973 "uuid": "2618640d-5149-42bd-9589-3563439423c7", 00:23:47.973 "strip_size_kb": 0, 00:23:47.973 "state": "online", 00:23:47.973 "raid_level": "raid1", 00:23:47.973 "superblock": true, 00:23:47.973 "num_base_bdevs": 2, 00:23:47.973 "num_base_bdevs_discovered": 2, 00:23:47.973 "num_base_bdevs_operational": 2, 00:23:47.973 "base_bdevs_list": [ 00:23:47.973 { 00:23:47.973 "name": "BaseBdev1", 00:23:47.973 "uuid": 
"ead2d0a1-fa84-55b0-aae0-768576abc345", 00:23:47.973 "is_configured": true, 00:23:47.973 "data_offset": 2048, 00:23:47.973 "data_size": 63488 00:23:47.973 }, 00:23:47.973 { 00:23:47.973 "name": "BaseBdev2", 00:23:47.973 "uuid": "89cb9a00-044c-538a-b648-7af571a4dde5", 00:23:47.973 "is_configured": true, 00:23:47.973 "data_offset": 2048, 00:23:47.973 "data_size": 63488 00:23:47.973 } 00:23:47.973 ] 00:23:47.973 }' 00:23:47.973 12:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:47.973 12:26:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:48.537 12:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:48.537 12:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:48.795 [2024-06-07 12:26:12.279770] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:23:49.731 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.989 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.246 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.246 "name": "raid_bdev1", 00:23:50.246 "uuid": "2618640d-5149-42bd-9589-3563439423c7", 00:23:50.246 "strip_size_kb": 0, 00:23:50.246 "state": "online", 00:23:50.246 "raid_level": "raid1", 00:23:50.246 "superblock": true, 00:23:50.246 "num_base_bdevs": 2, 00:23:50.246 "num_base_bdevs_discovered": 2, 00:23:50.246 "num_base_bdevs_operational": 2, 00:23:50.246 
"base_bdevs_list": [ 00:23:50.246 { 00:23:50.246 "name": "BaseBdev1", 00:23:50.246 "uuid": "ead2d0a1-fa84-55b0-aae0-768576abc345", 00:23:50.246 "is_configured": true, 00:23:50.246 "data_offset": 2048, 00:23:50.246 "data_size": 63488 00:23:50.246 }, 00:23:50.246 { 00:23:50.246 "name": "BaseBdev2", 00:23:50.246 "uuid": "89cb9a00-044c-538a-b648-7af571a4dde5", 00:23:50.246 "is_configured": true, 00:23:50.246 "data_offset": 2048, 00:23:50.246 "data_size": 63488 00:23:50.246 } 00:23:50.246 ] 00:23:50.246 }' 00:23:50.246 12:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.246 12:26:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:50.811 12:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:51.067 [2024-06-07 12:26:14.659208] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:51.067 [2024-06-07 12:26:14.659579] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:51.067 [2024-06-07 12:26:14.661159] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:51.067 [2024-06-07 12:26:14.661409] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:51.067 [2024-06-07 12:26:14.661571] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:51.067 [2024-06-07 12:26:14.661667] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state offline 00:23:51.067 0 00:23:51.067 12:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 202275 00:23:51.067 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 202275 ']' 00:23:51.067 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 202275 00:23:51.067 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:23:51.067 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:51.067 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 202275 00:23:51.325 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:51.325 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:51.325 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 202275' 00:23:51.325 killing process with pid 202275 00:23:51.325 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 202275 00:23:51.325 [2024-06-07 12:26:14.719570] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:51.325 12:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 202275 00:23:51.325 [2024-06-07 12:26:14.749646] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.22ylc5rBKy 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- 
# fail_per_s=0.00 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:51.584 00:23:51.584 real 0m6.976s 00:23:51.584 user 0m10.941s 00:23:51.584 sys 0m1.126s 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:51.584 12:26:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:51.584 ************************************ 00:23:51.584 END TEST raid_read_error_test 00:23:51.584 ************************************ 00:23:51.584 12:26:15 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:23:51.584 12:26:15 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:51.584 12:26:15 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:51.584 12:26:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:51.584 ************************************ 00:23:51.584 START TEST raid_write_error_test 00:23:51.584 ************************************ 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 write 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:51.584 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.rE4VmjdQGe 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=202458 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 202458 /var/tmp/spdk-raid.sock 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 202458 ']' 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:51.841 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:51.841 12:26:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:51.841 [2024-06-07 12:26:15.269805] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:23:51.841 [2024-06-07 12:26:15.270405] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202458 ] 00:23:51.841 [2024-06-07 12:26:15.415638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:52.099 [2024-06-07 12:26:15.514304] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:52.099 [2024-06-07 12:26:15.595618] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:52.665 12:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:52.665 12:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:23:52.665 12:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:52.666 12:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:53.002 BaseBdev1_malloc 00:23:53.002 12:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:53.263 true 00:23:53.263 12:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:53.524 [2024-06-07 12:26:16.994484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:53.524 [2024-06-07 12:26:16.995382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.524 
[2024-06-07 12:26:16.995700] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:23:53.524 [2024-06-07 12:26:16.995983] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.524 [2024-06-07 12:26:16.999092] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.524 [2024-06-07 12:26:16.999441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:53.524 BaseBdev1 00:23:53.524 12:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:53.524 12:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:53.784 BaseBdev2_malloc 00:23:53.784 12:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:54.041 true 00:23:54.041 12:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:54.299 [2024-06-07 12:26:17.863960] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:54.299 [2024-06-07 12:26:17.864569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:54.299 [2024-06-07 12:26:17.864835] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:23:54.299 [2024-06-07 12:26:17.865071] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:54.299 [2024-06-07 12:26:17.867696] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:54.299 [2024-06-07 12:26:17.867973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:54.299 BaseBdev2 00:23:54.299 12:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:23:54.866 [2024-06-07 12:26:18.204545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:54.866 [2024-06-07 12:26:18.207077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:54.866 [2024-06-07 12:26:18.207504] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007280 00:23:54.866 [2024-06-07 12:26:18.207628] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:54.866 [2024-06-07 12:26:18.207827] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:23:54.866 [2024-06-07 12:26:18.208328] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007280 00:23:54.866 [2024-06-07 12:26:18.208447] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007280 00:23:54.866 [2024-06-07 12:26:18.208739] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:54.867 
12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.867 "name": "raid_bdev1", 00:23:54.867 "uuid": "ed3df163-cd34-40e7-87a2-4f9227745642", 00:23:54.867 "strip_size_kb": 0, 00:23:54.867 "state": "online", 00:23:54.867 "raid_level": "raid1", 00:23:54.867 "superblock": true, 00:23:54.867 "num_base_bdevs": 2, 00:23:54.867 "num_base_bdevs_discovered": 2, 00:23:54.867 "num_base_bdevs_operational": 2, 00:23:54.867 "base_bdevs_list": [ 00:23:54.867 { 00:23:54.867 "name": "BaseBdev1", 00:23:54.867 "uuid": "21ab782a-a899-558a-acad-45c778780f13", 00:23:54.867 "is_configured": true, 00:23:54.867 "data_offset": 2048, 00:23:54.867 "data_size": 63488 00:23:54.867 }, 00:23:54.867 { 00:23:54.867 "name": "BaseBdev2", 00:23:54.867 "uuid": "6f4fdc58-dba1-5380-b04e-804bf96c8898", 00:23:54.867 "is_configured": true, 00:23:54.867 "data_offset": 2048, 00:23:54.867 "data_size": 63488 00:23:54.867 } 00:23:54.867 ] 00:23:54.867 }' 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.867 12:26:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:55.801 12:26:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:55.801 12:26:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:55.801 [2024-06-07 12:26:19.217184] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:56.736 [2024-06-07 12:26:20.320069] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:23:56.736 [2024-06-07 12:26:20.320945] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:56.736 [2024-06-07 12:26:20.321306] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d000002460 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # 
[[ raid1 = \r\a\i\d\1 ]] 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.736 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.302 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:57.302 "name": "raid_bdev1", 00:23:57.302 "uuid": "ed3df163-cd34-40e7-87a2-4f9227745642", 00:23:57.302 "strip_size_kb": 0, 00:23:57.302 "state": "online", 00:23:57.302 "raid_level": "raid1", 00:23:57.302 "superblock": true, 00:23:57.302 "num_base_bdevs": 2, 00:23:57.302 "num_base_bdevs_discovered": 1, 00:23:57.302 "num_base_bdevs_operational": 1, 00:23:57.302 "base_bdevs_list": [ 00:23:57.302 { 00:23:57.302 "name": null, 00:23:57.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.302 "is_configured": false, 00:23:57.302 "data_offset": 2048, 00:23:57.302 "data_size": 63488 00:23:57.302 }, 00:23:57.302 { 00:23:57.302 "name": "BaseBdev2", 00:23:57.302 "uuid": "6f4fdc58-dba1-5380-b04e-804bf96c8898", 00:23:57.302 "is_configured": true, 00:23:57.302 "data_offset": 2048, 00:23:57.302 "data_size": 63488 00:23:57.302 } 00:23:57.302 ] 00:23:57.302 }' 00:23:57.302 12:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:57.302 12:26:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:57.951 12:26:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:57.951 [2024-06-07 12:26:21.577117] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:57.951 [2024-06-07 12:26:21.577430] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:57.951 [2024-06-07 12:26:21.578756] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:57.951 [2024-06-07 12:26:21.578933] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:57.951 [2024-06-07 12:26:21.579100] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:57.951 [2024-06-07 12:26:21.579193] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state offline 00:23:57.951 0 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 202458 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 202458 ']' 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 202458 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 202458 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 202458' 00:23:58.214 killing process with pid 202458 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 202458 00:23:58.214 [2024-06-07 12:26:21.658291] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:58.214 12:26:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 202458 00:23:58.214 [2024-06-07 12:26:21.689576] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.rE4VmjdQGe 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:58.473 00:23:58.473 real 0m6.874s 00:23:58.473 user 0m10.738s 00:23:58.473 sys 0m1.079s 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:58.473 12:26:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:58.473 ************************************ 00:23:58.473 END TEST raid_write_error_test 00:23:58.473 ************************************ 00:23:58.732 12:26:22 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:23:58.732 12:26:22 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:23:58.732 12:26:22 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:23:58.732 12:26:22 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:58.732 12:26:22 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:58.732 12:26:22 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:23:58.732 ************************************ 00:23:58.732 START TEST raid_state_function_test 00:23:58.732 ************************************ 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 false 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=202646 00:23:58.732 Process raid pid: 202646 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 202646' 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 202646 /var/tmp/spdk-raid.sock 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 202646 ']' 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:58.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:58.732 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:58.732 [2024-06-07 12:26:22.198046] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:23:58.732 [2024-06-07 12:26:22.199382] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:58.732 [2024-06-07 12:26:22.337104] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:58.991 [2024-06-07 12:26:22.432497] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:58.991 [2024-06-07 12:26:22.515435] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:58.991 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:58.991 12:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:23:58.991 12:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:23:59.557 [2024-06-07 12:26:22.991774] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:59.557 [2024-06-07 12:26:22.992146] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:59.557 [2024-06-07 12:26:22.992325] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:59.557 [2024-06-07 12:26:22.992403] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:59.557 [2024-06-07 12:26:22.992638] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:59.557 [2024-06-07 12:26:22.992754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:59.558 12:26:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:59.558 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.815 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.815 "name": "Existed_Raid", 00:23:59.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.815 "strip_size_kb": 64, 00:23:59.815 "state": "configuring", 00:23:59.815 "raid_level": "raid0", 00:23:59.815 "superblock": false, 00:23:59.815 "num_base_bdevs": 3, 00:23:59.815 "num_base_bdevs_discovered": 0, 00:23:59.815 "num_base_bdevs_operational": 3, 00:23:59.815 "base_bdevs_list": [ 00:23:59.815 { 00:23:59.815 "name": "BaseBdev1", 00:23:59.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.815 "is_configured": false, 00:23:59.815 "data_offset": 0, 00:23:59.815 "data_size": 0 00:23:59.815 }, 00:23:59.815 { 00:23:59.815 "name": "BaseBdev2", 00:23:59.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.815 "is_configured": false, 00:23:59.815 "data_offset": 0, 00:23:59.815 "data_size": 0 00:23:59.815 }, 00:23:59.815 { 00:23:59.815 "name": "BaseBdev3", 00:23:59.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.815 "is_configured": false, 00:23:59.815 "data_offset": 0, 00:23:59.815 "data_size": 0 00:23:59.815 } 00:23:59.815 ] 00:23:59.815 }' 00:23:59.815 12:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.815 12:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:00.379 12:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:00.636 [2024-06-07 12:26:24.279879] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:00.636 [2024-06-07 12:26:24.280256] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:24:00.894 12:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:24:01.153 [2024-06-07 12:26:24.595978] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:01.153 [2024-06-07 12:26:24.596787] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:01.153 [2024-06-07 12:26:24.596939] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:01.153 [2024-06-07 12:26:24.597084] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:24:01.153 [2024-06-07 12:26:24.597268] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:01.153 [2024-06-07 12:26:24.597398] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:01.153 12:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:24:01.411 [2024-06-07 12:26:24.849227] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:01.411 BaseBdev1 00:24:01.411 12:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:01.411 12:26:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:24:01.411 12:26:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:01.411 12:26:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:24:01.411 12:26:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:01.411 12:26:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:01.411 12:26:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:01.670 12:26:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:01.929 [ 00:24:01.929 { 00:24:01.929 "name": "BaseBdev1", 00:24:01.929 "aliases": [ 00:24:01.929 "57bec1f1-be3e-469b-9ba2-c6250b695f04" 00:24:01.929 ], 00:24:01.929 "product_name": "Malloc disk", 00:24:01.929 "block_size": 512, 00:24:01.929 "num_blocks": 65536, 00:24:01.929 "uuid": "57bec1f1-be3e-469b-9ba2-c6250b695f04", 00:24:01.929 "assigned_rate_limits": { 00:24:01.929 "rw_ios_per_sec": 0, 00:24:01.929 "rw_mbytes_per_sec": 0, 00:24:01.929 "r_mbytes_per_sec": 0, 00:24:01.929 "w_mbytes_per_sec": 0 00:24:01.929 }, 00:24:01.929 "claimed": true, 00:24:01.929 "claim_type": "exclusive_write", 00:24:01.929 "zoned": false, 00:24:01.929 "supported_io_types": { 00:24:01.929 "read": true, 00:24:01.929 "write": true, 00:24:01.929 "unmap": true, 00:24:01.929 "write_zeroes": true, 00:24:01.929 "flush": true, 00:24:01.929 "reset": true, 00:24:01.929 "compare": false, 00:24:01.929 "compare_and_write": false, 00:24:01.929 "abort": true, 00:24:01.929 "nvme_admin": false, 00:24:01.929 "nvme_io": false 00:24:01.929 }, 00:24:01.929 "memory_domains": [ 00:24:01.929 { 00:24:01.929 "dma_device_id": "system", 00:24:01.929 "dma_device_type": 1 00:24:01.929 }, 00:24:01.929 { 00:24:01.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:01.929 "dma_device_type": 2 00:24:01.929 } 00:24:01.929 ], 00:24:01.929 "driver_specific": {} 00:24:01.930 } 00:24:01.930 ] 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.930 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:02.188 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:02.188 "name": "Existed_Raid", 00:24:02.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.188 "strip_size_kb": 64, 00:24:02.188 "state": "configuring", 00:24:02.188 "raid_level": "raid0", 00:24:02.188 "superblock": false, 00:24:02.188 "num_base_bdevs": 3, 00:24:02.188 "num_base_bdevs_discovered": 1, 00:24:02.188 "num_base_bdevs_operational": 3, 00:24:02.188 "base_bdevs_list": [ 00:24:02.188 { 00:24:02.188 "name": "BaseBdev1", 00:24:02.188 "uuid": "57bec1f1-be3e-469b-9ba2-c6250b695f04", 00:24:02.188 "is_configured": true, 00:24:02.188 "data_offset": 0, 00:24:02.188 "data_size": 65536 00:24:02.188 }, 00:24:02.188 { 00:24:02.188 "name": "BaseBdev2", 00:24:02.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.189 "is_configured": false, 00:24:02.189 "data_offset": 0, 00:24:02.189 "data_size": 0 00:24:02.189 }, 00:24:02.189 { 00:24:02.189 "name": "BaseBdev3", 00:24:02.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.189 "is_configured": false, 00:24:02.189 "data_offset": 0, 00:24:02.189 "data_size": 0 00:24:02.189 } 00:24:02.189 ] 00:24:02.189 }' 00:24:02.189 12:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.189 12:26:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:03.167 12:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:03.167 [2024-06-07 12:26:26.789646] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:03.167 [2024-06-07 12:26:26.789986] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:24:03.462 12:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:24:03.720 [2024-06-07 12:26:27.129782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:03.720 [2024-06-07 12:26:27.132265] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:03.720 [2024-06-07 12:26:27.133079] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:24:03.720 [2024-06-07 12:26:27.133276] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:03.720 [2024-06-07 12:26:27.133515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.720 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:03.979 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.979 "name": "Existed_Raid", 00:24:03.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.979 "strip_size_kb": 64, 00:24:03.979 "state": "configuring", 00:24:03.979 "raid_level": "raid0", 00:24:03.979 "superblock": false, 00:24:03.979 "num_base_bdevs": 3, 00:24:03.979 "num_base_bdevs_discovered": 1, 00:24:03.979 "num_base_bdevs_operational": 3, 00:24:03.979 "base_bdevs_list": [ 00:24:03.979 { 00:24:03.979 "name": "BaseBdev1", 00:24:03.979 "uuid": "57bec1f1-be3e-469b-9ba2-c6250b695f04", 00:24:03.979 "is_configured": true, 00:24:03.979 "data_offset": 0, 00:24:03.979 "data_size": 65536 00:24:03.979 }, 00:24:03.979 { 00:24:03.979 "name": "BaseBdev2", 00:24:03.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.979 "is_configured": false, 00:24:03.979 "data_offset": 0, 00:24:03.979 "data_size": 0 00:24:03.979 }, 00:24:03.979 { 00:24:03.979 "name": "BaseBdev3", 00:24:03.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.979 "is_configured": false, 00:24:03.979 "data_offset": 0, 00:24:03.979 "data_size": 0 00:24:03.979 } 00:24:03.979 ] 00:24:03.979 }' 00:24:03.979 12:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.979 12:26:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:04.544 12:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev2 00:24:05.109 [2024-06-07 12:26:28.474766] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:05.109 BaseBdev2 00:24:05.109 12:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:05.109 12:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:24:05.109 12:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:05.109 12:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:24:05.109 12:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:05.109 12:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:05.109 12:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:05.367 12:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:05.626 [ 00:24:05.626 { 00:24:05.626 "name": "BaseBdev2", 00:24:05.626 "aliases": [ 00:24:05.626 "74c0867a-3422-43fd-9e93-8fbd9be79054" 00:24:05.626 ], 00:24:05.626 "product_name": "Malloc disk", 00:24:05.626 "block_size": 512, 00:24:05.626 "num_blocks": 65536, 00:24:05.626 "uuid": "74c0867a-3422-43fd-9e93-8fbd9be79054", 00:24:05.626 "assigned_rate_limits": { 00:24:05.626 "rw_ios_per_sec": 0, 00:24:05.626 "rw_mbytes_per_sec": 0, 00:24:05.626 "r_mbytes_per_sec": 0, 00:24:05.626 "w_mbytes_per_sec": 0 00:24:05.626 }, 00:24:05.626 "claimed": true, 00:24:05.626 "claim_type": "exclusive_write", 00:24:05.626 "zoned": false, 00:24:05.626 "supported_io_types": { 00:24:05.626 "read": true, 00:24:05.626 "write": true, 00:24:05.626 "unmap": true, 00:24:05.626 "write_zeroes": true, 00:24:05.626 "flush": true, 00:24:05.626 "reset": true, 00:24:05.626 "compare": false, 00:24:05.626 "compare_and_write": false, 00:24:05.626 "abort": true, 00:24:05.626 "nvme_admin": false, 00:24:05.626 "nvme_io": false 00:24:05.626 }, 00:24:05.626 "memory_domains": [ 00:24:05.626 { 00:24:05.626 "dma_device_id": "system", 00:24:05.626 "dma_device_type": 1 00:24:05.626 }, 00:24:05.626 { 00:24:05.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:05.626 "dma_device_type": 2 00:24:05.626 } 00:24:05.626 ], 00:24:05.626 "driver_specific": {} 00:24:05.626 } 00:24:05.626 ] 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:05.626 12:26:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.626 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:05.885 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.885 "name": "Existed_Raid", 00:24:05.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.885 "strip_size_kb": 64, 00:24:05.885 "state": "configuring", 00:24:05.885 "raid_level": "raid0", 00:24:05.885 "superblock": false, 00:24:05.885 "num_base_bdevs": 3, 00:24:05.885 "num_base_bdevs_discovered": 2, 00:24:05.885 "num_base_bdevs_operational": 3, 00:24:05.885 "base_bdevs_list": [ 00:24:05.885 { 00:24:05.885 "name": "BaseBdev1", 00:24:05.885 "uuid": "57bec1f1-be3e-469b-9ba2-c6250b695f04", 00:24:05.885 "is_configured": true, 00:24:05.885 "data_offset": 0, 00:24:05.885 "data_size": 65536 00:24:05.885 }, 00:24:05.885 { 00:24:05.885 "name": "BaseBdev2", 00:24:05.885 "uuid": "74c0867a-3422-43fd-9e93-8fbd9be79054", 00:24:05.885 "is_configured": true, 00:24:05.885 "data_offset": 0, 00:24:05.885 "data_size": 65536 00:24:05.885 }, 00:24:05.885 { 00:24:05.885 "name": "BaseBdev3", 00:24:05.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.885 "is_configured": false, 00:24:05.885 "data_offset": 0, 00:24:05.885 "data_size": 0 00:24:05.885 } 00:24:05.885 ] 00:24:05.885 }' 00:24:05.885 12:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.885 12:26:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:06.450 12:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:24:06.709 [2024-06-07 12:26:30.321276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:06.709 [2024-06-07 12:26:30.321610] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:24:06.709 [2024-06-07 12:26:30.321661] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:24:06.709 [2024-06-07 12:26:30.321934] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000021f0 00:24:06.709 [2024-06-07 12:26:30.322422] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:24:06.709 [2024-06-07 12:26:30.322560] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:24:06.709 [2024-06-07 12:26:30.322911] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:06.709 BaseBdev3 00:24:06.709 12:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:24:06.709 12:26:30 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:24:06.709 12:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:06.709 12:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:24:06.709 12:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:06.709 12:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:06.709 12:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:07.276 12:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:24:07.534 [ 00:24:07.534 { 00:24:07.534 "name": "BaseBdev3", 00:24:07.534 "aliases": [ 00:24:07.534 "c76427ab-bb35-4ca0-b98f-cb5667906e34" 00:24:07.534 ], 00:24:07.534 "product_name": "Malloc disk", 00:24:07.534 "block_size": 512, 00:24:07.534 "num_blocks": 65536, 00:24:07.534 "uuid": "c76427ab-bb35-4ca0-b98f-cb5667906e34", 00:24:07.534 "assigned_rate_limits": { 00:24:07.534 "rw_ios_per_sec": 0, 00:24:07.534 "rw_mbytes_per_sec": 0, 00:24:07.534 "r_mbytes_per_sec": 0, 00:24:07.534 "w_mbytes_per_sec": 0 00:24:07.534 }, 00:24:07.534 "claimed": true, 00:24:07.534 "claim_type": "exclusive_write", 00:24:07.534 "zoned": false, 00:24:07.534 "supported_io_types": { 00:24:07.534 "read": true, 00:24:07.534 "write": true, 00:24:07.534 "unmap": true, 00:24:07.534 "write_zeroes": true, 00:24:07.534 "flush": true, 00:24:07.534 "reset": true, 00:24:07.534 "compare": false, 00:24:07.534 "compare_and_write": false, 00:24:07.534 "abort": true, 00:24:07.534 "nvme_admin": false, 00:24:07.534 "nvme_io": false 00:24:07.534 }, 00:24:07.534 "memory_domains": [ 00:24:07.534 { 00:24:07.534 "dma_device_id": "system", 00:24:07.534 "dma_device_type": 1 00:24:07.534 }, 00:24:07.534 { 00:24:07.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:07.534 "dma_device_type": 2 00:24:07.534 } 00:24:07.534 ], 00:24:07.534 "driver_specific": {} 00:24:07.534 } 00:24:07.534 ] 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:07.534 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.102 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:08.102 "name": "Existed_Raid", 00:24:08.102 "uuid": "48cf3ada-8653-47b3-b943-8b52356fa984", 00:24:08.102 "strip_size_kb": 64, 00:24:08.102 "state": "online", 00:24:08.102 "raid_level": "raid0", 00:24:08.102 "superblock": false, 00:24:08.102 "num_base_bdevs": 3, 00:24:08.102 "num_base_bdevs_discovered": 3, 00:24:08.102 "num_base_bdevs_operational": 3, 00:24:08.102 "base_bdevs_list": [ 00:24:08.102 { 00:24:08.102 "name": "BaseBdev1", 00:24:08.102 "uuid": "57bec1f1-be3e-469b-9ba2-c6250b695f04", 00:24:08.102 "is_configured": true, 00:24:08.102 "data_offset": 0, 00:24:08.102 "data_size": 65536 00:24:08.102 }, 00:24:08.102 { 00:24:08.102 "name": "BaseBdev2", 00:24:08.102 "uuid": "74c0867a-3422-43fd-9e93-8fbd9be79054", 00:24:08.102 "is_configured": true, 00:24:08.102 "data_offset": 0, 00:24:08.102 "data_size": 65536 00:24:08.102 }, 00:24:08.102 { 00:24:08.102 "name": "BaseBdev3", 00:24:08.102 "uuid": "c76427ab-bb35-4ca0-b98f-cb5667906e34", 00:24:08.102 "is_configured": true, 00:24:08.102 "data_offset": 0, 00:24:08.102 "data_size": 65536 00:24:08.102 } 00:24:08.102 ] 00:24:08.102 }' 00:24:08.102 12:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:08.102 12:26:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:08.751 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:08.752 [2024-06-07 12:26:32.293948] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:08.752 "name": "Existed_Raid", 00:24:08.752 "aliases": [ 00:24:08.752 "48cf3ada-8653-47b3-b943-8b52356fa984" 00:24:08.752 ], 00:24:08.752 "product_name": "Raid Volume", 00:24:08.752 "block_size": 512, 00:24:08.752 "num_blocks": 196608, 00:24:08.752 "uuid": "48cf3ada-8653-47b3-b943-8b52356fa984", 00:24:08.752 "assigned_rate_limits": { 00:24:08.752 "rw_ios_per_sec": 0, 00:24:08.752 "rw_mbytes_per_sec": 0, 00:24:08.752 "r_mbytes_per_sec": 0, 00:24:08.752 "w_mbytes_per_sec": 0 
00:24:08.752 }, 00:24:08.752 "claimed": false, 00:24:08.752 "zoned": false, 00:24:08.752 "supported_io_types": { 00:24:08.752 "read": true, 00:24:08.752 "write": true, 00:24:08.752 "unmap": true, 00:24:08.752 "write_zeroes": true, 00:24:08.752 "flush": true, 00:24:08.752 "reset": true, 00:24:08.752 "compare": false, 00:24:08.752 "compare_and_write": false, 00:24:08.752 "abort": false, 00:24:08.752 "nvme_admin": false, 00:24:08.752 "nvme_io": false 00:24:08.752 }, 00:24:08.752 "memory_domains": [ 00:24:08.752 { 00:24:08.752 "dma_device_id": "system", 00:24:08.752 "dma_device_type": 1 00:24:08.752 }, 00:24:08.752 { 00:24:08.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.752 "dma_device_type": 2 00:24:08.752 }, 00:24:08.752 { 00:24:08.752 "dma_device_id": "system", 00:24:08.752 "dma_device_type": 1 00:24:08.752 }, 00:24:08.752 { 00:24:08.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.752 "dma_device_type": 2 00:24:08.752 }, 00:24:08.752 { 00:24:08.752 "dma_device_id": "system", 00:24:08.752 "dma_device_type": 1 00:24:08.752 }, 00:24:08.752 { 00:24:08.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.752 "dma_device_type": 2 00:24:08.752 } 00:24:08.752 ], 00:24:08.752 "driver_specific": { 00:24:08.752 "raid": { 00:24:08.752 "uuid": "48cf3ada-8653-47b3-b943-8b52356fa984", 00:24:08.752 "strip_size_kb": 64, 00:24:08.752 "state": "online", 00:24:08.752 "raid_level": "raid0", 00:24:08.752 "superblock": false, 00:24:08.752 "num_base_bdevs": 3, 00:24:08.752 "num_base_bdevs_discovered": 3, 00:24:08.752 "num_base_bdevs_operational": 3, 00:24:08.752 "base_bdevs_list": [ 00:24:08.752 { 00:24:08.752 "name": "BaseBdev1", 00:24:08.752 "uuid": "57bec1f1-be3e-469b-9ba2-c6250b695f04", 00:24:08.752 "is_configured": true, 00:24:08.752 "data_offset": 0, 00:24:08.752 "data_size": 65536 00:24:08.752 }, 00:24:08.752 { 00:24:08.752 "name": "BaseBdev2", 00:24:08.752 "uuid": "74c0867a-3422-43fd-9e93-8fbd9be79054", 00:24:08.752 "is_configured": true, 00:24:08.752 "data_offset": 0, 00:24:08.752 "data_size": 65536 00:24:08.752 }, 00:24:08.752 { 00:24:08.752 "name": "BaseBdev3", 00:24:08.752 "uuid": "c76427ab-bb35-4ca0-b98f-cb5667906e34", 00:24:08.752 "is_configured": true, 00:24:08.752 "data_offset": 0, 00:24:08.752 "data_size": 65536 00:24:08.752 } 00:24:08.752 ] 00:24:08.752 } 00:24:08.752 } 00:24:08.752 }' 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:24:08.752 BaseBdev2 00:24:08.752 BaseBdev3' 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:08.752 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:09.319 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:09.319 "name": "BaseBdev1", 00:24:09.319 "aliases": [ 00:24:09.319 "57bec1f1-be3e-469b-9ba2-c6250b695f04" 00:24:09.319 ], 00:24:09.319 "product_name": "Malloc disk", 00:24:09.319 "block_size": 512, 00:24:09.319 "num_blocks": 65536, 00:24:09.319 "uuid": "57bec1f1-be3e-469b-9ba2-c6250b695f04", 00:24:09.319 "assigned_rate_limits": { 00:24:09.319 "rw_ios_per_sec": 0, 
00:24:09.319 "rw_mbytes_per_sec": 0, 00:24:09.319 "r_mbytes_per_sec": 0, 00:24:09.319 "w_mbytes_per_sec": 0 00:24:09.319 }, 00:24:09.319 "claimed": true, 00:24:09.319 "claim_type": "exclusive_write", 00:24:09.319 "zoned": false, 00:24:09.319 "supported_io_types": { 00:24:09.319 "read": true, 00:24:09.319 "write": true, 00:24:09.319 "unmap": true, 00:24:09.319 "write_zeroes": true, 00:24:09.319 "flush": true, 00:24:09.319 "reset": true, 00:24:09.319 "compare": false, 00:24:09.319 "compare_and_write": false, 00:24:09.319 "abort": true, 00:24:09.319 "nvme_admin": false, 00:24:09.319 "nvme_io": false 00:24:09.319 }, 00:24:09.320 "memory_domains": [ 00:24:09.320 { 00:24:09.320 "dma_device_id": "system", 00:24:09.320 "dma_device_type": 1 00:24:09.320 }, 00:24:09.320 { 00:24:09.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:09.320 "dma_device_type": 2 00:24:09.320 } 00:24:09.320 ], 00:24:09.320 "driver_specific": {} 00:24:09.320 }' 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:09.320 12:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:09.579 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:09.579 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:09.579 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:09.579 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:09.579 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:09.837 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:09.837 "name": "BaseBdev2", 00:24:09.837 "aliases": [ 00:24:09.837 "74c0867a-3422-43fd-9e93-8fbd9be79054" 00:24:09.837 ], 00:24:09.837 "product_name": "Malloc disk", 00:24:09.837 "block_size": 512, 00:24:09.837 "num_blocks": 65536, 00:24:09.837 "uuid": "74c0867a-3422-43fd-9e93-8fbd9be79054", 00:24:09.837 "assigned_rate_limits": { 00:24:09.837 "rw_ios_per_sec": 0, 00:24:09.837 "rw_mbytes_per_sec": 0, 00:24:09.837 "r_mbytes_per_sec": 0, 00:24:09.837 "w_mbytes_per_sec": 0 00:24:09.837 }, 00:24:09.837 "claimed": true, 00:24:09.837 "claim_type": "exclusive_write", 00:24:09.837 "zoned": false, 00:24:09.837 "supported_io_types": { 00:24:09.837 "read": true, 00:24:09.837 "write": true, 00:24:09.837 "unmap": true, 00:24:09.837 "write_zeroes": true, 00:24:09.837 "flush": true, 00:24:09.837 "reset": true, 00:24:09.837 "compare": false, 00:24:09.837 
"compare_and_write": false, 00:24:09.837 "abort": true, 00:24:09.837 "nvme_admin": false, 00:24:09.837 "nvme_io": false 00:24:09.837 }, 00:24:09.837 "memory_domains": [ 00:24:09.837 { 00:24:09.837 "dma_device_id": "system", 00:24:09.837 "dma_device_type": 1 00:24:09.837 }, 00:24:09.837 { 00:24:09.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:09.837 "dma_device_type": 2 00:24:09.837 } 00:24:09.837 ], 00:24:09.837 "driver_specific": {} 00:24:09.837 }' 00:24:09.837 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:09.837 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:10.096 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:10.096 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:10.096 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:10.096 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:10.096 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:10.096 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:10.354 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:10.354 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:10.354 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:10.354 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:10.354 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:10.354 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:24:10.354 12:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:10.611 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:10.611 "name": "BaseBdev3", 00:24:10.611 "aliases": [ 00:24:10.611 "c76427ab-bb35-4ca0-b98f-cb5667906e34" 00:24:10.611 ], 00:24:10.611 "product_name": "Malloc disk", 00:24:10.611 "block_size": 512, 00:24:10.611 "num_blocks": 65536, 00:24:10.611 "uuid": "c76427ab-bb35-4ca0-b98f-cb5667906e34", 00:24:10.611 "assigned_rate_limits": { 00:24:10.611 "rw_ios_per_sec": 0, 00:24:10.611 "rw_mbytes_per_sec": 0, 00:24:10.611 "r_mbytes_per_sec": 0, 00:24:10.611 "w_mbytes_per_sec": 0 00:24:10.611 }, 00:24:10.611 "claimed": true, 00:24:10.611 "claim_type": "exclusive_write", 00:24:10.611 "zoned": false, 00:24:10.611 "supported_io_types": { 00:24:10.611 "read": true, 00:24:10.611 "write": true, 00:24:10.611 "unmap": true, 00:24:10.611 "write_zeroes": true, 00:24:10.611 "flush": true, 00:24:10.611 "reset": true, 00:24:10.611 "compare": false, 00:24:10.611 "compare_and_write": false, 00:24:10.611 "abort": true, 00:24:10.611 "nvme_admin": false, 00:24:10.611 "nvme_io": false 00:24:10.611 }, 00:24:10.611 "memory_domains": [ 00:24:10.611 { 00:24:10.611 "dma_device_id": "system", 00:24:10.611 "dma_device_type": 1 00:24:10.611 }, 00:24:10.611 { 00:24:10.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:10.611 "dma_device_type": 2 00:24:10.611 } 00:24:10.611 ], 00:24:10.611 "driver_specific": {} 00:24:10.611 }' 00:24:10.611 12:26:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:10.893 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:10.893 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:10.893 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:10.893 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:10.893 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:10.894 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:10.894 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:10.894 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:10.894 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:10.894 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:11.154 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:11.154 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:11.412 [2024-06-07 12:26:34.842267] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:11.412 [2024-06-07 12:26:34.842587] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:11.412 [2024-06-07 12:26:34.842802] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.412 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.413 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.413 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.413 12:26:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:11.672 12:26:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:11.672 "name": "Existed_Raid", 00:24:11.672 "uuid": "48cf3ada-8653-47b3-b943-8b52356fa984", 00:24:11.672 "strip_size_kb": 64, 00:24:11.672 "state": "offline", 00:24:11.672 "raid_level": "raid0", 00:24:11.672 "superblock": false, 00:24:11.672 "num_base_bdevs": 3, 00:24:11.672 "num_base_bdevs_discovered": 2, 00:24:11.672 "num_base_bdevs_operational": 2, 00:24:11.672 "base_bdevs_list": [ 00:24:11.672 { 00:24:11.672 "name": null, 00:24:11.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.672 "is_configured": false, 00:24:11.672 "data_offset": 0, 00:24:11.672 "data_size": 65536 00:24:11.672 }, 00:24:11.672 { 00:24:11.672 "name": "BaseBdev2", 00:24:11.672 "uuid": "74c0867a-3422-43fd-9e93-8fbd9be79054", 00:24:11.672 "is_configured": true, 00:24:11.672 "data_offset": 0, 00:24:11.672 "data_size": 65536 00:24:11.672 }, 00:24:11.672 { 00:24:11.672 "name": "BaseBdev3", 00:24:11.672 "uuid": "c76427ab-bb35-4ca0-b98f-cb5667906e34", 00:24:11.672 "is_configured": true, 00:24:11.672 "data_offset": 0, 00:24:11.672 "data_size": 65536 00:24:11.672 } 00:24:11.672 ] 00:24:11.672 }' 00:24:11.672 12:26:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:11.672 12:26:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:12.605 12:26:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:24:12.605 12:26:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:12.605 12:26:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.605 12:26:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:12.605 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:12.605 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:12.605 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:12.863 [2024-06-07 12:26:36.494771] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:13.121 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:13.121 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:13.121 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:13.121 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.379 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:13.379 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:13.379 12:26:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:24:13.637 
[2024-06-07 12:26:37.043454] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:24:13.637 [2024-06-07 12:26:37.043823] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:24:13.637 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:13.637 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:13.637 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.637 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:13.894 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:13.894 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:13.894 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:24:13.894 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:24:13.894 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:13.894 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:24:14.152 BaseBdev2 00:24:14.152 12:26:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:24:14.152 12:26:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:24:14.152 12:26:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:14.152 12:26:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:24:14.152 12:26:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:14.152 12:26:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:14.152 12:26:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:14.411 12:26:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:14.670 [ 00:24:14.670 { 00:24:14.670 "name": "BaseBdev2", 00:24:14.670 "aliases": [ 00:24:14.670 "aea64352-7d8f-4eab-84fc-ebce0a512046" 00:24:14.670 ], 00:24:14.670 "product_name": "Malloc disk", 00:24:14.670 "block_size": 512, 00:24:14.670 "num_blocks": 65536, 00:24:14.670 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:14.670 "assigned_rate_limits": { 00:24:14.670 "rw_ios_per_sec": 0, 00:24:14.670 "rw_mbytes_per_sec": 0, 00:24:14.670 "r_mbytes_per_sec": 0, 00:24:14.670 "w_mbytes_per_sec": 0 00:24:14.670 }, 00:24:14.670 "claimed": false, 00:24:14.670 "zoned": false, 00:24:14.670 "supported_io_types": { 00:24:14.670 "read": true, 00:24:14.670 "write": true, 00:24:14.670 "unmap": true, 00:24:14.670 "write_zeroes": true, 00:24:14.670 "flush": true, 00:24:14.670 "reset": true, 00:24:14.670 "compare": false, 00:24:14.670 "compare_and_write": false, 00:24:14.670 "abort": true, 00:24:14.670 "nvme_admin": false, 00:24:14.670 "nvme_io": false 00:24:14.670 }, 00:24:14.670 
"memory_domains": [ 00:24:14.670 { 00:24:14.670 "dma_device_id": "system", 00:24:14.670 "dma_device_type": 1 00:24:14.670 }, 00:24:14.670 { 00:24:14.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:14.670 "dma_device_type": 2 00:24:14.670 } 00:24:14.670 ], 00:24:14.670 "driver_specific": {} 00:24:14.670 } 00:24:14.670 ] 00:24:14.670 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:24:14.670 12:26:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:14.670 12:26:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:14.670 12:26:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:24:14.927 BaseBdev3 00:24:14.927 12:26:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:24:14.927 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:24:14.927 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:14.927 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:24:14.927 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:14.928 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:14.928 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:15.185 12:26:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:24:15.444 [ 00:24:15.444 { 00:24:15.444 "name": "BaseBdev3", 00:24:15.444 "aliases": [ 00:24:15.444 "7d762d72-8bbe-4f22-852a-56b514c188ec" 00:24:15.444 ], 00:24:15.444 "product_name": "Malloc disk", 00:24:15.444 "block_size": 512, 00:24:15.444 "num_blocks": 65536, 00:24:15.444 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:15.444 "assigned_rate_limits": { 00:24:15.444 "rw_ios_per_sec": 0, 00:24:15.444 "rw_mbytes_per_sec": 0, 00:24:15.444 "r_mbytes_per_sec": 0, 00:24:15.444 "w_mbytes_per_sec": 0 00:24:15.444 }, 00:24:15.444 "claimed": false, 00:24:15.444 "zoned": false, 00:24:15.444 "supported_io_types": { 00:24:15.444 "read": true, 00:24:15.444 "write": true, 00:24:15.444 "unmap": true, 00:24:15.444 "write_zeroes": true, 00:24:15.444 "flush": true, 00:24:15.444 "reset": true, 00:24:15.444 "compare": false, 00:24:15.444 "compare_and_write": false, 00:24:15.444 "abort": true, 00:24:15.444 "nvme_admin": false, 00:24:15.444 "nvme_io": false 00:24:15.444 }, 00:24:15.444 "memory_domains": [ 00:24:15.444 { 00:24:15.444 "dma_device_id": "system", 00:24:15.444 "dma_device_type": 1 00:24:15.444 }, 00:24:15.444 { 00:24:15.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:15.444 "dma_device_type": 2 00:24:15.444 } 00:24:15.444 ], 00:24:15.444 "driver_specific": {} 00:24:15.444 } 00:24:15.444 ] 00:24:15.444 12:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:24:15.444 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:15.444 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:24:15.444 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:24:16.013 [2024-06-07 12:26:39.408842] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:16.013 [2024-06-07 12:26:39.409747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:16.013 [2024-06-07 12:26:39.409933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:16.013 [2024-06-07 12:26:39.412170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:16.013 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.014 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:16.272 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:16.272 "name": "Existed_Raid", 00:24:16.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.272 "strip_size_kb": 64, 00:24:16.272 "state": "configuring", 00:24:16.272 "raid_level": "raid0", 00:24:16.272 "superblock": false, 00:24:16.272 "num_base_bdevs": 3, 00:24:16.272 "num_base_bdevs_discovered": 2, 00:24:16.272 "num_base_bdevs_operational": 3, 00:24:16.272 "base_bdevs_list": [ 00:24:16.272 { 00:24:16.272 "name": "BaseBdev1", 00:24:16.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.272 "is_configured": false, 00:24:16.272 "data_offset": 0, 00:24:16.272 "data_size": 0 00:24:16.272 }, 00:24:16.272 { 00:24:16.272 "name": "BaseBdev2", 00:24:16.272 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:16.272 "is_configured": true, 00:24:16.272 "data_offset": 0, 00:24:16.272 "data_size": 65536 00:24:16.272 }, 00:24:16.272 { 00:24:16.272 "name": "BaseBdev3", 00:24:16.272 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:16.272 "is_configured": true, 00:24:16.272 "data_offset": 0, 00:24:16.272 "data_size": 65536 00:24:16.272 } 00:24:16.272 ] 00:24:16.272 }' 00:24:16.272 12:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:16.272 12:26:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:16.839 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:17.098 [2024-06-07 12:26:40.648979] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:17.098 12:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.664 12:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.664 "name": "Existed_Raid", 00:24:17.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.664 "strip_size_kb": 64, 00:24:17.664 "state": "configuring", 00:24:17.664 "raid_level": "raid0", 00:24:17.664 "superblock": false, 00:24:17.664 "num_base_bdevs": 3, 00:24:17.664 "num_base_bdevs_discovered": 1, 00:24:17.664 "num_base_bdevs_operational": 3, 00:24:17.664 "base_bdevs_list": [ 00:24:17.664 { 00:24:17.664 "name": "BaseBdev1", 00:24:17.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.664 "is_configured": false, 00:24:17.664 "data_offset": 0, 00:24:17.664 "data_size": 0 00:24:17.664 }, 00:24:17.664 { 00:24:17.664 "name": null, 00:24:17.664 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:17.664 "is_configured": false, 00:24:17.664 "data_offset": 0, 00:24:17.664 "data_size": 65536 00:24:17.664 }, 00:24:17.664 { 00:24:17.664 "name": "BaseBdev3", 00:24:17.664 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:17.664 "is_configured": true, 00:24:17.664 "data_offset": 0, 00:24:17.664 "data_size": 65536 00:24:17.664 } 00:24:17.664 ] 00:24:17.664 }' 00:24:17.664 12:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.664 12:26:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:18.230 12:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:24:18.230 12:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:24:18.488 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:24:18.488 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:24:18.746 [2024-06-07 12:26:42.297440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:18.746 BaseBdev1 00:24:18.746 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:24:18.746 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:24:18.746 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:18.746 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:24:18.746 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:18.746 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:18.746 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:19.004 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:19.262 [ 00:24:19.262 { 00:24:19.262 "name": "BaseBdev1", 00:24:19.262 "aliases": [ 00:24:19.262 "725e2a27-96e4-4761-adae-d0340b796026" 00:24:19.262 ], 00:24:19.262 "product_name": "Malloc disk", 00:24:19.262 "block_size": 512, 00:24:19.262 "num_blocks": 65536, 00:24:19.262 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:19.262 "assigned_rate_limits": { 00:24:19.262 "rw_ios_per_sec": 0, 00:24:19.262 "rw_mbytes_per_sec": 0, 00:24:19.262 "r_mbytes_per_sec": 0, 00:24:19.262 "w_mbytes_per_sec": 0 00:24:19.262 }, 00:24:19.262 "claimed": true, 00:24:19.262 "claim_type": "exclusive_write", 00:24:19.262 "zoned": false, 00:24:19.262 "supported_io_types": { 00:24:19.262 "read": true, 00:24:19.262 "write": true, 00:24:19.262 "unmap": true, 00:24:19.262 "write_zeroes": true, 00:24:19.262 "flush": true, 00:24:19.262 "reset": true, 00:24:19.262 "compare": false, 00:24:19.262 "compare_and_write": false, 00:24:19.262 "abort": true, 00:24:19.262 "nvme_admin": false, 00:24:19.262 "nvme_io": false 00:24:19.262 }, 00:24:19.262 "memory_domains": [ 00:24:19.262 { 00:24:19.262 "dma_device_id": "system", 00:24:19.262 "dma_device_type": 1 00:24:19.262 }, 00:24:19.262 { 00:24:19.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:19.262 "dma_device_type": 2 00:24:19.262 } 00:24:19.262 ], 00:24:19.262 "driver_specific": {} 00:24:19.262 } 00:24:19.262 ] 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:19.262 12:26:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:19.262 12:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.520 12:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:19.520 "name": "Existed_Raid", 00:24:19.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:19.520 "strip_size_kb": 64, 00:24:19.520 "state": "configuring", 00:24:19.520 "raid_level": "raid0", 00:24:19.520 "superblock": false, 00:24:19.520 "num_base_bdevs": 3, 00:24:19.520 "num_base_bdevs_discovered": 2, 00:24:19.520 "num_base_bdevs_operational": 3, 00:24:19.520 "base_bdevs_list": [ 00:24:19.520 { 00:24:19.520 "name": "BaseBdev1", 00:24:19.520 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:19.520 "is_configured": true, 00:24:19.520 "data_offset": 0, 00:24:19.520 "data_size": 65536 00:24:19.520 }, 00:24:19.520 { 00:24:19.520 "name": null, 00:24:19.520 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:19.520 "is_configured": false, 00:24:19.520 "data_offset": 0, 00:24:19.520 "data_size": 65536 00:24:19.520 }, 00:24:19.520 { 00:24:19.520 "name": "BaseBdev3", 00:24:19.520 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:19.520 "is_configured": true, 00:24:19.520 "data_offset": 0, 00:24:19.520 "data_size": 65536 00:24:19.520 } 00:24:19.520 ] 00:24:19.520 }' 00:24:19.520 12:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:19.520 12:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:20.086 12:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:20.086 12:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:24:20.342 12:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:24:20.342 12:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:24:20.599 [2024-06-07 12:26:44.124823] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:24:20.599 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:20.599 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid0 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:20.600 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:20.930 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:20.930 "name": "Existed_Raid", 00:24:20.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:20.930 "strip_size_kb": 64, 00:24:20.930 "state": "configuring", 00:24:20.930 "raid_level": "raid0", 00:24:20.930 "superblock": false, 00:24:20.930 "num_base_bdevs": 3, 00:24:20.930 "num_base_bdevs_discovered": 1, 00:24:20.930 "num_base_bdevs_operational": 3, 00:24:20.930 "base_bdevs_list": [ 00:24:20.930 { 00:24:20.930 "name": "BaseBdev1", 00:24:20.930 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:20.930 "is_configured": true, 00:24:20.930 "data_offset": 0, 00:24:20.930 "data_size": 65536 00:24:20.930 }, 00:24:20.930 { 00:24:20.930 "name": null, 00:24:20.930 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:20.930 "is_configured": false, 00:24:20.930 "data_offset": 0, 00:24:20.930 "data_size": 65536 00:24:20.930 }, 00:24:20.930 { 00:24:20.930 "name": null, 00:24:20.930 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:20.930 "is_configured": false, 00:24:20.930 "data_offset": 0, 00:24:20.930 "data_size": 65536 00:24:20.930 } 00:24:20.930 ] 00:24:20.930 }' 00:24:20.930 12:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:20.930 12:26:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:21.501 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.501 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:24:22.066 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:24:22.066 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:24:22.066 [2024-06-07 12:26:45.685895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:22.325 
12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.325 12:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:22.582 12:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.582 "name": "Existed_Raid", 00:24:22.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.582 "strip_size_kb": 64, 00:24:22.583 "state": "configuring", 00:24:22.583 "raid_level": "raid0", 00:24:22.583 "superblock": false, 00:24:22.583 "num_base_bdevs": 3, 00:24:22.583 "num_base_bdevs_discovered": 2, 00:24:22.583 "num_base_bdevs_operational": 3, 00:24:22.583 "base_bdevs_list": [ 00:24:22.583 { 00:24:22.583 "name": "BaseBdev1", 00:24:22.583 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:22.583 "is_configured": true, 00:24:22.583 "data_offset": 0, 00:24:22.583 "data_size": 65536 00:24:22.583 }, 00:24:22.583 { 00:24:22.583 "name": null, 00:24:22.583 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:22.583 "is_configured": false, 00:24:22.583 "data_offset": 0, 00:24:22.583 "data_size": 65536 00:24:22.583 }, 00:24:22.583 { 00:24:22.583 "name": "BaseBdev3", 00:24:22.583 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:22.583 "is_configured": true, 00:24:22.583 "data_offset": 0, 00:24:22.583 "data_size": 65536 00:24:22.583 } 00:24:22.583 ] 00:24:22.583 }' 00:24:22.583 12:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.583 12:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:23.517 12:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:24:23.517 12:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.517 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:24:23.517 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:23.776 [2024-06-07 12:26:47.370009] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.037 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:24.294 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.294 "name": "Existed_Raid", 00:24:24.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.294 "strip_size_kb": 64, 00:24:24.294 "state": "configuring", 00:24:24.294 "raid_level": "raid0", 00:24:24.294 "superblock": false, 00:24:24.294 "num_base_bdevs": 3, 00:24:24.294 "num_base_bdevs_discovered": 1, 00:24:24.294 "num_base_bdevs_operational": 3, 00:24:24.294 "base_bdevs_list": [ 00:24:24.294 { 00:24:24.294 "name": null, 00:24:24.294 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:24.294 "is_configured": false, 00:24:24.294 "data_offset": 0, 00:24:24.294 "data_size": 65536 00:24:24.294 }, 00:24:24.294 { 00:24:24.294 "name": null, 00:24:24.294 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:24.294 "is_configured": false, 00:24:24.294 "data_offset": 0, 00:24:24.294 "data_size": 65536 00:24:24.294 }, 00:24:24.294 { 00:24:24.294 "name": "BaseBdev3", 00:24:24.294 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:24.294 "is_configured": true, 00:24:24.294 "data_offset": 0, 00:24:24.294 "data_size": 65536 00:24:24.294 } 00:24:24.294 ] 00:24:24.294 }' 00:24:24.294 12:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.294 12:26:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:24.863 12:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.863 12:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:24:25.121 12:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:24:25.121 12:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:24:25.688 [2024-06-07 12:26:49.091642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:25.688 
12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.688 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:25.946 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.946 "name": "Existed_Raid", 00:24:25.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.946 "strip_size_kb": 64, 00:24:25.946 "state": "configuring", 00:24:25.946 "raid_level": "raid0", 00:24:25.946 "superblock": false, 00:24:25.946 "num_base_bdevs": 3, 00:24:25.946 "num_base_bdevs_discovered": 2, 00:24:25.946 "num_base_bdevs_operational": 3, 00:24:25.946 "base_bdevs_list": [ 00:24:25.946 { 00:24:25.946 "name": null, 00:24:25.946 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:25.946 "is_configured": false, 00:24:25.946 "data_offset": 0, 00:24:25.946 "data_size": 65536 00:24:25.946 }, 00:24:25.946 { 00:24:25.946 "name": "BaseBdev2", 00:24:25.946 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:25.946 "is_configured": true, 00:24:25.946 "data_offset": 0, 00:24:25.946 "data_size": 65536 00:24:25.946 }, 00:24:25.946 { 00:24:25.946 "name": "BaseBdev3", 00:24:25.946 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:25.946 "is_configured": true, 00:24:25.946 "data_offset": 0, 00:24:25.946 "data_size": 65536 00:24:25.946 } 00:24:25.946 ] 00:24:25.946 }' 00:24:25.946 12:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.946 12:26:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:26.513 12:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.513 12:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:24:26.771 12:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:24:26.771 12:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.771 12:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:24:27.029 12:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 
725e2a27-96e4-4761-adae-d0340b796026 00:24:27.287 [2024-06-07 12:26:50.930539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:24:27.287 [2024-06-07 12:26:50.930841] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:24:27.287 [2024-06-07 12:26:50.930890] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:24:27.287 [2024-06-07 12:26:50.931272] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:24:27.287 [2024-06-07 12:26:50.931705] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:24:27.287 [2024-06-07 12:26:50.931829] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000007880 00:24:27.287 [2024-06-07 12:26:50.932093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:27.545 NewBaseBdev 00:24:27.545 12:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:24:27.545 12:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:24:27.545 12:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:27.545 12:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:24:27.545 12:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:27.545 12:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:27.545 12:26:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:27.805 12:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:24:28.063 [ 00:24:28.063 { 00:24:28.063 "name": "NewBaseBdev", 00:24:28.063 "aliases": [ 00:24:28.063 "725e2a27-96e4-4761-adae-d0340b796026" 00:24:28.063 ], 00:24:28.063 "product_name": "Malloc disk", 00:24:28.063 "block_size": 512, 00:24:28.063 "num_blocks": 65536, 00:24:28.063 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:28.063 "assigned_rate_limits": { 00:24:28.063 "rw_ios_per_sec": 0, 00:24:28.063 "rw_mbytes_per_sec": 0, 00:24:28.063 "r_mbytes_per_sec": 0, 00:24:28.063 "w_mbytes_per_sec": 0 00:24:28.063 }, 00:24:28.063 "claimed": true, 00:24:28.063 "claim_type": "exclusive_write", 00:24:28.063 "zoned": false, 00:24:28.063 "supported_io_types": { 00:24:28.063 "read": true, 00:24:28.063 "write": true, 00:24:28.063 "unmap": true, 00:24:28.063 "write_zeroes": true, 00:24:28.063 "flush": true, 00:24:28.063 "reset": true, 00:24:28.063 "compare": false, 00:24:28.063 "compare_and_write": false, 00:24:28.063 "abort": true, 00:24:28.063 "nvme_admin": false, 00:24:28.063 "nvme_io": false 00:24:28.063 }, 00:24:28.063 "memory_domains": [ 00:24:28.063 { 00:24:28.063 "dma_device_id": "system", 00:24:28.063 "dma_device_type": 1 00:24:28.063 }, 00:24:28.063 { 00:24:28.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:28.063 "dma_device_type": 2 00:24:28.063 } 00:24:28.063 ], 00:24:28.063 "driver_specific": {} 00:24:28.063 } 00:24:28.063 ] 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:28.063 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.322 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:28.322 "name": "Existed_Raid", 00:24:28.322 "uuid": "1e43539c-f7a5-4053-a790-c3e00fea717d", 00:24:28.322 "strip_size_kb": 64, 00:24:28.322 "state": "online", 00:24:28.322 "raid_level": "raid0", 00:24:28.322 "superblock": false, 00:24:28.322 "num_base_bdevs": 3, 00:24:28.322 "num_base_bdevs_discovered": 3, 00:24:28.322 "num_base_bdevs_operational": 3, 00:24:28.322 "base_bdevs_list": [ 00:24:28.322 { 00:24:28.322 "name": "NewBaseBdev", 00:24:28.322 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:28.322 "is_configured": true, 00:24:28.322 "data_offset": 0, 00:24:28.322 "data_size": 65536 00:24:28.322 }, 00:24:28.322 { 00:24:28.322 "name": "BaseBdev2", 00:24:28.322 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:28.322 "is_configured": true, 00:24:28.322 "data_offset": 0, 00:24:28.322 "data_size": 65536 00:24:28.322 }, 00:24:28.322 { 00:24:28.322 "name": "BaseBdev3", 00:24:28.322 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:28.322 "is_configured": true, 00:24:28.322 "data_offset": 0, 00:24:28.322 "data_size": 65536 00:24:28.322 } 00:24:28.322 ] 00:24:28.322 }' 00:24:28.322 12:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:28.322 12:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:28.888 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:29.146 [2024-06-07 12:26:52.749958] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:29.405 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:29.405 "name": "Existed_Raid", 00:24:29.405 "aliases": [ 00:24:29.405 "1e43539c-f7a5-4053-a790-c3e00fea717d" 00:24:29.405 ], 00:24:29.405 "product_name": "Raid Volume", 00:24:29.405 "block_size": 512, 00:24:29.405 "num_blocks": 196608, 00:24:29.405 "uuid": "1e43539c-f7a5-4053-a790-c3e00fea717d", 00:24:29.405 "assigned_rate_limits": { 00:24:29.405 "rw_ios_per_sec": 0, 00:24:29.405 "rw_mbytes_per_sec": 0, 00:24:29.405 "r_mbytes_per_sec": 0, 00:24:29.405 "w_mbytes_per_sec": 0 00:24:29.405 }, 00:24:29.405 "claimed": false, 00:24:29.405 "zoned": false, 00:24:29.405 "supported_io_types": { 00:24:29.405 "read": true, 00:24:29.405 "write": true, 00:24:29.405 "unmap": true, 00:24:29.405 "write_zeroes": true, 00:24:29.405 "flush": true, 00:24:29.405 "reset": true, 00:24:29.405 "compare": false, 00:24:29.405 "compare_and_write": false, 00:24:29.405 "abort": false, 00:24:29.405 "nvme_admin": false, 00:24:29.405 "nvme_io": false 00:24:29.405 }, 00:24:29.405 "memory_domains": [ 00:24:29.405 { 00:24:29.405 "dma_device_id": "system", 00:24:29.405 "dma_device_type": 1 00:24:29.405 }, 00:24:29.405 { 00:24:29.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.405 "dma_device_type": 2 00:24:29.405 }, 00:24:29.405 { 00:24:29.405 "dma_device_id": "system", 00:24:29.405 "dma_device_type": 1 00:24:29.405 }, 00:24:29.405 { 00:24:29.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.405 "dma_device_type": 2 00:24:29.405 }, 00:24:29.405 { 00:24:29.405 "dma_device_id": "system", 00:24:29.405 "dma_device_type": 1 00:24:29.405 }, 00:24:29.405 { 00:24:29.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.405 "dma_device_type": 2 00:24:29.405 } 00:24:29.405 ], 00:24:29.405 "driver_specific": { 00:24:29.405 "raid": { 00:24:29.405 "uuid": "1e43539c-f7a5-4053-a790-c3e00fea717d", 00:24:29.405 "strip_size_kb": 64, 00:24:29.405 "state": "online", 00:24:29.405 "raid_level": "raid0", 00:24:29.405 "superblock": false, 00:24:29.405 "num_base_bdevs": 3, 00:24:29.405 "num_base_bdevs_discovered": 3, 00:24:29.405 "num_base_bdevs_operational": 3, 00:24:29.405 "base_bdevs_list": [ 00:24:29.405 { 00:24:29.405 "name": "NewBaseBdev", 00:24:29.405 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:29.405 "is_configured": true, 00:24:29.405 "data_offset": 0, 00:24:29.405 "data_size": 65536 00:24:29.405 }, 00:24:29.405 { 00:24:29.405 "name": "BaseBdev2", 00:24:29.405 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:29.405 "is_configured": true, 00:24:29.405 "data_offset": 0, 00:24:29.405 "data_size": 65536 00:24:29.405 }, 00:24:29.405 { 00:24:29.405 "name": "BaseBdev3", 00:24:29.405 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:29.405 "is_configured": true, 00:24:29.405 "data_offset": 0, 00:24:29.405 "data_size": 65536 00:24:29.405 } 00:24:29.405 ] 00:24:29.405 } 00:24:29.405 } 00:24:29.405 }' 00:24:29.405 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:29.405 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:24:29.405 BaseBdev2 00:24:29.405 BaseBdev3' 00:24:29.405 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:29.405 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:24:29.405 12:26:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:29.665 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:29.665 "name": "NewBaseBdev", 00:24:29.665 "aliases": [ 00:24:29.665 "725e2a27-96e4-4761-adae-d0340b796026" 00:24:29.665 ], 00:24:29.665 "product_name": "Malloc disk", 00:24:29.665 "block_size": 512, 00:24:29.665 "num_blocks": 65536, 00:24:29.665 "uuid": "725e2a27-96e4-4761-adae-d0340b796026", 00:24:29.665 "assigned_rate_limits": { 00:24:29.665 "rw_ios_per_sec": 0, 00:24:29.665 "rw_mbytes_per_sec": 0, 00:24:29.665 "r_mbytes_per_sec": 0, 00:24:29.665 "w_mbytes_per_sec": 0 00:24:29.665 }, 00:24:29.665 "claimed": true, 00:24:29.665 "claim_type": "exclusive_write", 00:24:29.665 "zoned": false, 00:24:29.665 "supported_io_types": { 00:24:29.665 "read": true, 00:24:29.665 "write": true, 00:24:29.665 "unmap": true, 00:24:29.665 "write_zeroes": true, 00:24:29.665 "flush": true, 00:24:29.665 "reset": true, 00:24:29.665 "compare": false, 00:24:29.665 "compare_and_write": false, 00:24:29.665 "abort": true, 00:24:29.665 "nvme_admin": false, 00:24:29.665 "nvme_io": false 00:24:29.665 }, 00:24:29.665 "memory_domains": [ 00:24:29.665 { 00:24:29.665 "dma_device_id": "system", 00:24:29.665 "dma_device_type": 1 00:24:29.665 }, 00:24:29.665 { 00:24:29.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.665 "dma_device_type": 2 00:24:29.665 } 00:24:29.665 ], 00:24:29.665 "driver_specific": {} 00:24:29.665 }' 00:24:29.665 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:29.928 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:29.928 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:29.928 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:29.928 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:29.928 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:29.928 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:29.928 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:30.187 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:30.187 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.187 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.187 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:30.187 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:30.187 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:30.187 12:26:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:30.445 12:26:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:30.445 "name": "BaseBdev2", 00:24:30.445 "aliases": [ 00:24:30.445 "aea64352-7d8f-4eab-84fc-ebce0a512046" 00:24:30.445 ], 00:24:30.445 "product_name": "Malloc disk", 00:24:30.445 "block_size": 512, 00:24:30.445 "num_blocks": 65536, 00:24:30.445 "uuid": "aea64352-7d8f-4eab-84fc-ebce0a512046", 00:24:30.445 "assigned_rate_limits": { 00:24:30.445 "rw_ios_per_sec": 0, 00:24:30.445 "rw_mbytes_per_sec": 0, 00:24:30.445 "r_mbytes_per_sec": 0, 00:24:30.445 "w_mbytes_per_sec": 0 00:24:30.445 }, 00:24:30.445 "claimed": true, 00:24:30.445 "claim_type": "exclusive_write", 00:24:30.445 "zoned": false, 00:24:30.445 "supported_io_types": { 00:24:30.445 "read": true, 00:24:30.445 "write": true, 00:24:30.445 "unmap": true, 00:24:30.445 "write_zeroes": true, 00:24:30.445 "flush": true, 00:24:30.445 "reset": true, 00:24:30.445 "compare": false, 00:24:30.445 "compare_and_write": false, 00:24:30.445 "abort": true, 00:24:30.445 "nvme_admin": false, 00:24:30.445 "nvme_io": false 00:24:30.445 }, 00:24:30.445 "memory_domains": [ 00:24:30.445 { 00:24:30.445 "dma_device_id": "system", 00:24:30.445 "dma_device_type": 1 00:24:30.445 }, 00:24:30.445 { 00:24:30.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:30.445 "dma_device_type": 2 00:24:30.445 } 00:24:30.445 ], 00:24:30.445 "driver_specific": {} 00:24:30.445 }' 00:24:30.445 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:30.445 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:30.703 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.961 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.961 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:30.961 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:30.961 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:24:30.961 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:31.220 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:31.220 "name": "BaseBdev3", 00:24:31.220 "aliases": [ 00:24:31.220 "7d762d72-8bbe-4f22-852a-56b514c188ec" 00:24:31.220 ], 00:24:31.220 "product_name": "Malloc disk", 00:24:31.220 "block_size": 512, 00:24:31.220 "num_blocks": 65536, 00:24:31.220 "uuid": "7d762d72-8bbe-4f22-852a-56b514c188ec", 00:24:31.220 "assigned_rate_limits": { 00:24:31.220 "rw_ios_per_sec": 0, 00:24:31.220 "rw_mbytes_per_sec": 0, 
00:24:31.220 "r_mbytes_per_sec": 0, 00:24:31.220 "w_mbytes_per_sec": 0 00:24:31.220 }, 00:24:31.220 "claimed": true, 00:24:31.220 "claim_type": "exclusive_write", 00:24:31.220 "zoned": false, 00:24:31.220 "supported_io_types": { 00:24:31.220 "read": true, 00:24:31.220 "write": true, 00:24:31.220 "unmap": true, 00:24:31.220 "write_zeroes": true, 00:24:31.220 "flush": true, 00:24:31.220 "reset": true, 00:24:31.220 "compare": false, 00:24:31.220 "compare_and_write": false, 00:24:31.220 "abort": true, 00:24:31.220 "nvme_admin": false, 00:24:31.220 "nvme_io": false 00:24:31.220 }, 00:24:31.220 "memory_domains": [ 00:24:31.220 { 00:24:31.220 "dma_device_id": "system", 00:24:31.220 "dma_device_type": 1 00:24:31.220 }, 00:24:31.220 { 00:24:31.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:31.220 "dma_device_type": 2 00:24:31.220 } 00:24:31.220 ], 00:24:31.220 "driver_specific": {} 00:24:31.220 }' 00:24:31.220 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:31.220 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:31.478 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:31.478 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:31.478 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:31.478 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:31.478 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:31.478 12:26:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:31.478 12:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:31.478 12:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:31.478 12:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:31.478 12:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:31.478 12:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:32.045 [2024-06-07 12:26:55.394664] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:32.045 [2024-06-07 12:26:55.394731] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:32.045 [2024-06-07 12:26:55.394821] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:32.045 [2024-06-07 12:26:55.394873] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:32.045 [2024-06-07 12:26:55.394883] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name Existed_Raid, state offline 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 202646 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 202646 ']' 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 202646 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 
00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 202646 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:32.045 killing process with pid 202646 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 202646' 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 202646 00:24:32.045 [2024-06-07 12:26:55.456014] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:32.045 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 202646 00:24:32.045 [2024-06-07 12:26:55.518745] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:32.302 12:26:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:24:32.302 00:24:32.302 real 0m33.752s 00:24:32.302 user 1m2.460s 00:24:32.302 sys 0m5.127s 00:24:32.303 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:32.303 12:26:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:32.303 ************************************ 00:24:32.303 END TEST raid_state_function_test 00:24:32.303 ************************************ 00:24:32.561 12:26:55 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:24:32.561 12:26:55 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:24:32.561 12:26:55 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:32.561 12:26:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:32.561 ************************************ 00:24:32.561 START TEST raid_state_function_test_sb 00:24:32.561 ************************************ 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 true 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:32.561 12:26:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=203668 00:24:32.561 Process raid pid: 203668 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 203668' 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 203668 /var/tmp/spdk-raid.sock 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 203668 ']' 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:32.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:32.561 12:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:32.561 [2024-06-07 12:26:56.014147] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:24:32.561 [2024-06-07 12:26:56.014468] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:32.561 [2024-06-07 12:26:56.166292] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:32.820 [2024-06-07 12:26:56.264364] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:32.820 [2024-06-07 12:26:56.351333] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:24:33.755 [2024-06-07 12:26:57.324030] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:33.755 [2024-06-07 12:26:57.324593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:33.755 [2024-06-07 12:26:57.324623] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:33.755 [2024-06-07 12:26:57.324712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:33.755 [2024-06-07 12:26:57.324722] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:33.755 [2024-06-07 12:26:57.324830] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.755 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:34.014 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.014 "name": "Existed_Raid", 00:24:34.014 "uuid": 
"6ba840bc-862c-4419-8b52-257787cb83f5", 00:24:34.014 "strip_size_kb": 64, 00:24:34.014 "state": "configuring", 00:24:34.014 "raid_level": "raid0", 00:24:34.014 "superblock": true, 00:24:34.014 "num_base_bdevs": 3, 00:24:34.014 "num_base_bdevs_discovered": 0, 00:24:34.014 "num_base_bdevs_operational": 3, 00:24:34.014 "base_bdevs_list": [ 00:24:34.014 { 00:24:34.014 "name": "BaseBdev1", 00:24:34.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.014 "is_configured": false, 00:24:34.014 "data_offset": 0, 00:24:34.014 "data_size": 0 00:24:34.014 }, 00:24:34.014 { 00:24:34.014 "name": "BaseBdev2", 00:24:34.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.014 "is_configured": false, 00:24:34.014 "data_offset": 0, 00:24:34.014 "data_size": 0 00:24:34.014 }, 00:24:34.014 { 00:24:34.014 "name": "BaseBdev3", 00:24:34.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.014 "is_configured": false, 00:24:34.014 "data_offset": 0, 00:24:34.014 "data_size": 0 00:24:34.014 } 00:24:34.014 ] 00:24:34.014 }' 00:24:34.014 12:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.014 12:26:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:34.650 12:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:34.907 [2024-06-07 12:26:58.528097] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:34.907 [2024-06-07 12:26:58.528166] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:24:35.167 12:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:24:35.167 [2024-06-07 12:26:58.772154] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:35.167 [2024-06-07 12:26:58.772701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:35.167 [2024-06-07 12:26:58.772732] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:35.167 [2024-06-07 12:26:58.772815] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:35.167 [2024-06-07 12:26:58.772827] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:35.167 [2024-06-07 12:26:58.772902] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:35.167 12:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:24:35.733 [2024-06-07 12:26:59.120559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:35.733 BaseBdev1 00:24:35.733 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:35.733 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:24:35.733 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:35.733 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 
00:24:35.733 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:35.733 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:35.733 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:35.992 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:36.250 [ 00:24:36.250 { 00:24:36.250 "name": "BaseBdev1", 00:24:36.250 "aliases": [ 00:24:36.250 "713f16a3-b3a6-4467-abdd-ebc5f2a414ef" 00:24:36.250 ], 00:24:36.250 "product_name": "Malloc disk", 00:24:36.250 "block_size": 512, 00:24:36.250 "num_blocks": 65536, 00:24:36.250 "uuid": "713f16a3-b3a6-4467-abdd-ebc5f2a414ef", 00:24:36.250 "assigned_rate_limits": { 00:24:36.250 "rw_ios_per_sec": 0, 00:24:36.250 "rw_mbytes_per_sec": 0, 00:24:36.250 "r_mbytes_per_sec": 0, 00:24:36.250 "w_mbytes_per_sec": 0 00:24:36.250 }, 00:24:36.250 "claimed": true, 00:24:36.250 "claim_type": "exclusive_write", 00:24:36.250 "zoned": false, 00:24:36.250 "supported_io_types": { 00:24:36.250 "read": true, 00:24:36.250 "write": true, 00:24:36.250 "unmap": true, 00:24:36.250 "write_zeroes": true, 00:24:36.250 "flush": true, 00:24:36.250 "reset": true, 00:24:36.250 "compare": false, 00:24:36.251 "compare_and_write": false, 00:24:36.251 "abort": true, 00:24:36.251 "nvme_admin": false, 00:24:36.251 "nvme_io": false 00:24:36.251 }, 00:24:36.251 "memory_domains": [ 00:24:36.251 { 00:24:36.251 "dma_device_id": "system", 00:24:36.251 "dma_device_type": 1 00:24:36.251 }, 00:24:36.251 { 00:24:36.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:36.251 "dma_device_type": 2 00:24:36.251 } 00:24:36.251 ], 00:24:36.251 "driver_specific": {} 00:24:36.251 } 00:24:36.251 ] 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.251 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:24:36.509 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.509 "name": "Existed_Raid", 00:24:36.509 "uuid": "a3adadb1-3a6e-4c48-a45a-fc7c91874c88", 00:24:36.509 "strip_size_kb": 64, 00:24:36.509 "state": "configuring", 00:24:36.509 "raid_level": "raid0", 00:24:36.509 "superblock": true, 00:24:36.509 "num_base_bdevs": 3, 00:24:36.509 "num_base_bdevs_discovered": 1, 00:24:36.509 "num_base_bdevs_operational": 3, 00:24:36.509 "base_bdevs_list": [ 00:24:36.509 { 00:24:36.509 "name": "BaseBdev1", 00:24:36.509 "uuid": "713f16a3-b3a6-4467-abdd-ebc5f2a414ef", 00:24:36.509 "is_configured": true, 00:24:36.509 "data_offset": 2048, 00:24:36.509 "data_size": 63488 00:24:36.509 }, 00:24:36.509 { 00:24:36.509 "name": "BaseBdev2", 00:24:36.509 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.509 "is_configured": false, 00:24:36.509 "data_offset": 0, 00:24:36.509 "data_size": 0 00:24:36.509 }, 00:24:36.509 { 00:24:36.509 "name": "BaseBdev3", 00:24:36.509 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.509 "is_configured": false, 00:24:36.509 "data_offset": 0, 00:24:36.509 "data_size": 0 00:24:36.509 } 00:24:36.509 ] 00:24:36.509 }' 00:24:36.509 12:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.509 12:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:37.137 12:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:37.395 [2024-06-07 12:27:00.792902] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:37.395 [2024-06-07 12:27:00.792999] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:24:37.395 12:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:24:37.653 [2024-06-07 12:27:01.085047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:37.653 [2024-06-07 12:27:01.087264] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:37.653 [2024-06-07 12:27:01.087807] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:37.653 [2024-06-07 12:27:01.087836] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:37.653 [2024-06-07 12:27:01.087928] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:37.653 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.912 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:37.912 "name": "Existed_Raid", 00:24:37.912 "uuid": "d5f5e7f2-13fb-4f06-be66-c6087c27f65b", 00:24:37.912 "strip_size_kb": 64, 00:24:37.912 "state": "configuring", 00:24:37.912 "raid_level": "raid0", 00:24:37.912 "superblock": true, 00:24:37.912 "num_base_bdevs": 3, 00:24:37.912 "num_base_bdevs_discovered": 1, 00:24:37.912 "num_base_bdevs_operational": 3, 00:24:37.912 "base_bdevs_list": [ 00:24:37.912 { 00:24:37.912 "name": "BaseBdev1", 00:24:37.912 "uuid": "713f16a3-b3a6-4467-abdd-ebc5f2a414ef", 00:24:37.912 "is_configured": true, 00:24:37.912 "data_offset": 2048, 00:24:37.912 "data_size": 63488 00:24:37.912 }, 00:24:37.912 { 00:24:37.912 "name": "BaseBdev2", 00:24:37.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:37.912 "is_configured": false, 00:24:37.912 "data_offset": 0, 00:24:37.912 "data_size": 0 00:24:37.912 }, 00:24:37.912 { 00:24:37.912 "name": "BaseBdev3", 00:24:37.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:37.912 "is_configured": false, 00:24:37.912 "data_offset": 0, 00:24:37.912 "data_size": 0 00:24:37.912 } 00:24:37.912 ] 00:24:37.912 }' 00:24:37.912 12:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:37.912 12:27:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:38.478 12:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:24:38.737 [2024-06-07 12:27:02.341953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:38.737 BaseBdev2 00:24:38.737 12:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:38.737 12:27:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:24:38.737 12:27:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:38.737 12:27:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:24:38.737 12:27:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:38.737 12:27:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:38.737 12:27:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:39.340 12:27:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:39.598 [ 00:24:39.598 { 00:24:39.598 "name": "BaseBdev2", 00:24:39.598 "aliases": [ 00:24:39.598 "4a061579-d701-4737-9a37-0ec8c5f6a437" 00:24:39.598 ], 00:24:39.598 "product_name": "Malloc disk", 00:24:39.598 "block_size": 512, 00:24:39.598 "num_blocks": 65536, 00:24:39.598 "uuid": "4a061579-d701-4737-9a37-0ec8c5f6a437", 00:24:39.598 "assigned_rate_limits": { 00:24:39.598 "rw_ios_per_sec": 0, 00:24:39.598 "rw_mbytes_per_sec": 0, 00:24:39.598 "r_mbytes_per_sec": 0, 00:24:39.598 "w_mbytes_per_sec": 0 00:24:39.598 }, 00:24:39.598 "claimed": true, 00:24:39.598 "claim_type": "exclusive_write", 00:24:39.598 "zoned": false, 00:24:39.598 "supported_io_types": { 00:24:39.598 "read": true, 00:24:39.598 "write": true, 00:24:39.598 "unmap": true, 00:24:39.598 "write_zeroes": true, 00:24:39.598 "flush": true, 00:24:39.598 "reset": true, 00:24:39.598 "compare": false, 00:24:39.598 "compare_and_write": false, 00:24:39.598 "abort": true, 00:24:39.598 "nvme_admin": false, 00:24:39.598 "nvme_io": false 00:24:39.598 }, 00:24:39.598 "memory_domains": [ 00:24:39.598 { 00:24:39.598 "dma_device_id": "system", 00:24:39.598 "dma_device_type": 1 00:24:39.598 }, 00:24:39.598 { 00:24:39.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:39.598 "dma_device_type": 2 00:24:39.598 } 00:24:39.598 ], 00:24:39.599 "driver_specific": {} 00:24:39.599 } 00:24:39.599 ] 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:39.599 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.858 12:27:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.858 "name": "Existed_Raid", 00:24:39.858 "uuid": "d5f5e7f2-13fb-4f06-be66-c6087c27f65b", 00:24:39.858 "strip_size_kb": 64, 00:24:39.858 "state": "configuring", 00:24:39.858 "raid_level": "raid0", 00:24:39.858 "superblock": true, 00:24:39.858 "num_base_bdevs": 3, 00:24:39.858 "num_base_bdevs_discovered": 2, 00:24:39.858 "num_base_bdevs_operational": 3, 00:24:39.858 "base_bdevs_list": [ 00:24:39.858 { 00:24:39.858 "name": "BaseBdev1", 00:24:39.858 "uuid": "713f16a3-b3a6-4467-abdd-ebc5f2a414ef", 00:24:39.858 "is_configured": true, 00:24:39.858 "data_offset": 2048, 00:24:39.858 "data_size": 63488 00:24:39.858 }, 00:24:39.858 { 00:24:39.858 "name": "BaseBdev2", 00:24:39.858 "uuid": "4a061579-d701-4737-9a37-0ec8c5f6a437", 00:24:39.858 "is_configured": true, 00:24:39.858 "data_offset": 2048, 00:24:39.858 "data_size": 63488 00:24:39.858 }, 00:24:39.858 { 00:24:39.858 "name": "BaseBdev3", 00:24:39.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.858 "is_configured": false, 00:24:39.858 "data_offset": 0, 00:24:39.858 "data_size": 0 00:24:39.858 } 00:24:39.858 ] 00:24:39.858 }' 00:24:39.858 12:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.858 12:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:40.425 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:24:40.683 [2024-06-07 12:27:04.272697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:40.683 [2024-06-07 12:27:04.273266] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:24:40.683 [2024-06-07 12:27:04.273426] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:24:40.683 [2024-06-07 12:27:04.273762] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000021f0 00:24:40.683 [2024-06-07 12:27:04.274333] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:24:40.683 [2024-06-07 12:27:04.274487] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:24:40.683 [2024-06-07 12:27:04.274761] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:40.683 BaseBdev3 00:24:40.683 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:24:40.683 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:24:40.683 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:40.683 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:24:40.683 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:40.683 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:40.683 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:40.941 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:24:41.506 [ 00:24:41.506 { 00:24:41.506 "name": "BaseBdev3", 00:24:41.506 "aliases": [ 00:24:41.506 "1f90fcc5-74ba-4340-9a8e-f28c89cd6331" 00:24:41.506 ], 00:24:41.506 "product_name": "Malloc disk", 00:24:41.506 "block_size": 512, 00:24:41.506 "num_blocks": 65536, 00:24:41.506 "uuid": "1f90fcc5-74ba-4340-9a8e-f28c89cd6331", 00:24:41.506 "assigned_rate_limits": { 00:24:41.506 "rw_ios_per_sec": 0, 00:24:41.506 "rw_mbytes_per_sec": 0, 00:24:41.506 "r_mbytes_per_sec": 0, 00:24:41.506 "w_mbytes_per_sec": 0 00:24:41.506 }, 00:24:41.506 "claimed": true, 00:24:41.506 "claim_type": "exclusive_write", 00:24:41.506 "zoned": false, 00:24:41.506 "supported_io_types": { 00:24:41.506 "read": true, 00:24:41.506 "write": true, 00:24:41.506 "unmap": true, 00:24:41.506 "write_zeroes": true, 00:24:41.506 "flush": true, 00:24:41.506 "reset": true, 00:24:41.506 "compare": false, 00:24:41.506 "compare_and_write": false, 00:24:41.506 "abort": true, 00:24:41.506 "nvme_admin": false, 00:24:41.506 "nvme_io": false 00:24:41.506 }, 00:24:41.506 "memory_domains": [ 00:24:41.506 { 00:24:41.506 "dma_device_id": "system", 00:24:41.506 "dma_device_type": 1 00:24:41.506 }, 00:24:41.506 { 00:24:41.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:41.506 "dma_device_type": 2 00:24:41.506 } 00:24:41.506 ], 00:24:41.506 "driver_specific": {} 00:24:41.506 } 00:24:41.506 ] 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.506 12:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:41.767 12:27:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.767 "name": "Existed_Raid", 00:24:41.767 "uuid": "d5f5e7f2-13fb-4f06-be66-c6087c27f65b", 00:24:41.767 "strip_size_kb": 64, 00:24:41.767 "state": "online", 00:24:41.767 "raid_level": "raid0", 00:24:41.767 "superblock": true, 00:24:41.767 
"num_base_bdevs": 3, 00:24:41.767 "num_base_bdevs_discovered": 3, 00:24:41.767 "num_base_bdevs_operational": 3, 00:24:41.767 "base_bdevs_list": [ 00:24:41.767 { 00:24:41.767 "name": "BaseBdev1", 00:24:41.767 "uuid": "713f16a3-b3a6-4467-abdd-ebc5f2a414ef", 00:24:41.767 "is_configured": true, 00:24:41.767 "data_offset": 2048, 00:24:41.767 "data_size": 63488 00:24:41.767 }, 00:24:41.767 { 00:24:41.767 "name": "BaseBdev2", 00:24:41.767 "uuid": "4a061579-d701-4737-9a37-0ec8c5f6a437", 00:24:41.767 "is_configured": true, 00:24:41.767 "data_offset": 2048, 00:24:41.767 "data_size": 63488 00:24:41.767 }, 00:24:41.767 { 00:24:41.767 "name": "BaseBdev3", 00:24:41.767 "uuid": "1f90fcc5-74ba-4340-9a8e-f28c89cd6331", 00:24:41.767 "is_configured": true, 00:24:41.767 "data_offset": 2048, 00:24:41.767 "data_size": 63488 00:24:41.767 } 00:24:41.767 ] 00:24:41.767 }' 00:24:41.767 12:27:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.767 12:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:42.701 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:42.959 [2024-06-07 12:27:06.403702] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:42.959 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:42.959 "name": "Existed_Raid", 00:24:42.959 "aliases": [ 00:24:42.959 "d5f5e7f2-13fb-4f06-be66-c6087c27f65b" 00:24:42.959 ], 00:24:42.959 "product_name": "Raid Volume", 00:24:42.959 "block_size": 512, 00:24:42.959 "num_blocks": 190464, 00:24:42.959 "uuid": "d5f5e7f2-13fb-4f06-be66-c6087c27f65b", 00:24:42.959 "assigned_rate_limits": { 00:24:42.959 "rw_ios_per_sec": 0, 00:24:42.959 "rw_mbytes_per_sec": 0, 00:24:42.959 "r_mbytes_per_sec": 0, 00:24:42.959 "w_mbytes_per_sec": 0 00:24:42.959 }, 00:24:42.959 "claimed": false, 00:24:42.959 "zoned": false, 00:24:42.959 "supported_io_types": { 00:24:42.959 "read": true, 00:24:42.959 "write": true, 00:24:42.959 "unmap": true, 00:24:42.959 "write_zeroes": true, 00:24:42.959 "flush": true, 00:24:42.959 "reset": true, 00:24:42.959 "compare": false, 00:24:42.959 "compare_and_write": false, 00:24:42.959 "abort": false, 00:24:42.959 "nvme_admin": false, 00:24:42.959 "nvme_io": false 00:24:42.959 }, 00:24:42.959 "memory_domains": [ 00:24:42.959 { 00:24:42.959 "dma_device_id": "system", 00:24:42.959 "dma_device_type": 1 00:24:42.959 }, 00:24:42.959 { 00:24:42.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.959 "dma_device_type": 2 00:24:42.959 }, 00:24:42.959 { 00:24:42.959 "dma_device_id": "system", 
00:24:42.959 "dma_device_type": 1 00:24:42.959 }, 00:24:42.959 { 00:24:42.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.959 "dma_device_type": 2 00:24:42.959 }, 00:24:42.959 { 00:24:42.959 "dma_device_id": "system", 00:24:42.959 "dma_device_type": 1 00:24:42.959 }, 00:24:42.959 { 00:24:42.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.959 "dma_device_type": 2 00:24:42.959 } 00:24:42.959 ], 00:24:42.959 "driver_specific": { 00:24:42.959 "raid": { 00:24:42.960 "uuid": "d5f5e7f2-13fb-4f06-be66-c6087c27f65b", 00:24:42.960 "strip_size_kb": 64, 00:24:42.960 "state": "online", 00:24:42.960 "raid_level": "raid0", 00:24:42.960 "superblock": true, 00:24:42.960 "num_base_bdevs": 3, 00:24:42.960 "num_base_bdevs_discovered": 3, 00:24:42.960 "num_base_bdevs_operational": 3, 00:24:42.960 "base_bdevs_list": [ 00:24:42.960 { 00:24:42.960 "name": "BaseBdev1", 00:24:42.960 "uuid": "713f16a3-b3a6-4467-abdd-ebc5f2a414ef", 00:24:42.960 "is_configured": true, 00:24:42.960 "data_offset": 2048, 00:24:42.960 "data_size": 63488 00:24:42.960 }, 00:24:42.960 { 00:24:42.960 "name": "BaseBdev2", 00:24:42.960 "uuid": "4a061579-d701-4737-9a37-0ec8c5f6a437", 00:24:42.960 "is_configured": true, 00:24:42.960 "data_offset": 2048, 00:24:42.960 "data_size": 63488 00:24:42.960 }, 00:24:42.960 { 00:24:42.960 "name": "BaseBdev3", 00:24:42.960 "uuid": "1f90fcc5-74ba-4340-9a8e-f28c89cd6331", 00:24:42.960 "is_configured": true, 00:24:42.960 "data_offset": 2048, 00:24:42.960 "data_size": 63488 00:24:42.960 } 00:24:42.960 ] 00:24:42.960 } 00:24:42.960 } 00:24:42.960 }' 00:24:42.960 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:42.960 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:24:42.960 BaseBdev2 00:24:42.960 BaseBdev3' 00:24:42.960 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:42.960 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:42.960 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:43.219 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:43.219 "name": "BaseBdev1", 00:24:43.219 "aliases": [ 00:24:43.219 "713f16a3-b3a6-4467-abdd-ebc5f2a414ef" 00:24:43.219 ], 00:24:43.219 "product_name": "Malloc disk", 00:24:43.219 "block_size": 512, 00:24:43.219 "num_blocks": 65536, 00:24:43.219 "uuid": "713f16a3-b3a6-4467-abdd-ebc5f2a414ef", 00:24:43.219 "assigned_rate_limits": { 00:24:43.219 "rw_ios_per_sec": 0, 00:24:43.219 "rw_mbytes_per_sec": 0, 00:24:43.219 "r_mbytes_per_sec": 0, 00:24:43.219 "w_mbytes_per_sec": 0 00:24:43.219 }, 00:24:43.219 "claimed": true, 00:24:43.219 "claim_type": "exclusive_write", 00:24:43.219 "zoned": false, 00:24:43.219 "supported_io_types": { 00:24:43.219 "read": true, 00:24:43.219 "write": true, 00:24:43.219 "unmap": true, 00:24:43.219 "write_zeroes": true, 00:24:43.219 "flush": true, 00:24:43.219 "reset": true, 00:24:43.219 "compare": false, 00:24:43.219 "compare_and_write": false, 00:24:43.219 "abort": true, 00:24:43.219 "nvme_admin": false, 00:24:43.219 "nvme_io": false 00:24:43.219 }, 00:24:43.219 "memory_domains": [ 00:24:43.219 { 00:24:43.219 "dma_device_id": "system", 00:24:43.219 "dma_device_type": 1 00:24:43.219 }, 
00:24:43.219 { 00:24:43.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:43.219 "dma_device_type": 2 00:24:43.219 } 00:24:43.219 ], 00:24:43.219 "driver_specific": {} 00:24:43.219 }' 00:24:43.219 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:43.558 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:43.558 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:43.558 12:27:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:43.558 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:43.558 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:43.558 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:43.558 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:43.558 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:43.558 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:43.558 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:43.823 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:43.823 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:43.823 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:43.823 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:44.086 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:44.086 "name": "BaseBdev2", 00:24:44.086 "aliases": [ 00:24:44.086 "4a061579-d701-4737-9a37-0ec8c5f6a437" 00:24:44.086 ], 00:24:44.086 "product_name": "Malloc disk", 00:24:44.086 "block_size": 512, 00:24:44.086 "num_blocks": 65536, 00:24:44.086 "uuid": "4a061579-d701-4737-9a37-0ec8c5f6a437", 00:24:44.086 "assigned_rate_limits": { 00:24:44.087 "rw_ios_per_sec": 0, 00:24:44.087 "rw_mbytes_per_sec": 0, 00:24:44.087 "r_mbytes_per_sec": 0, 00:24:44.087 "w_mbytes_per_sec": 0 00:24:44.087 }, 00:24:44.087 "claimed": true, 00:24:44.087 "claim_type": "exclusive_write", 00:24:44.087 "zoned": false, 00:24:44.087 "supported_io_types": { 00:24:44.087 "read": true, 00:24:44.087 "write": true, 00:24:44.087 "unmap": true, 00:24:44.087 "write_zeroes": true, 00:24:44.087 "flush": true, 00:24:44.087 "reset": true, 00:24:44.087 "compare": false, 00:24:44.087 "compare_and_write": false, 00:24:44.087 "abort": true, 00:24:44.087 "nvme_admin": false, 00:24:44.087 "nvme_io": false 00:24:44.087 }, 00:24:44.087 "memory_domains": [ 00:24:44.087 { 00:24:44.087 "dma_device_id": "system", 00:24:44.087 "dma_device_type": 1 00:24:44.087 }, 00:24:44.087 { 00:24:44.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:44.087 "dma_device_type": 2 00:24:44.087 } 00:24:44.087 ], 00:24:44.087 "driver_specific": {} 00:24:44.087 }' 00:24:44.087 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:44.087 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:44.087 12:27:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:44.087 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:44.087 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:44.347 12:27:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:24:44.607 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:44.607 "name": "BaseBdev3", 00:24:44.607 "aliases": [ 00:24:44.607 "1f90fcc5-74ba-4340-9a8e-f28c89cd6331" 00:24:44.607 ], 00:24:44.607 "product_name": "Malloc disk", 00:24:44.607 "block_size": 512, 00:24:44.607 "num_blocks": 65536, 00:24:44.607 "uuid": "1f90fcc5-74ba-4340-9a8e-f28c89cd6331", 00:24:44.607 "assigned_rate_limits": { 00:24:44.607 "rw_ios_per_sec": 0, 00:24:44.607 "rw_mbytes_per_sec": 0, 00:24:44.607 "r_mbytes_per_sec": 0, 00:24:44.607 "w_mbytes_per_sec": 0 00:24:44.607 }, 00:24:44.607 "claimed": true, 00:24:44.607 "claim_type": "exclusive_write", 00:24:44.607 "zoned": false, 00:24:44.607 "supported_io_types": { 00:24:44.607 "read": true, 00:24:44.607 "write": true, 00:24:44.607 "unmap": true, 00:24:44.607 "write_zeroes": true, 00:24:44.607 "flush": true, 00:24:44.607 "reset": true, 00:24:44.607 "compare": false, 00:24:44.608 "compare_and_write": false, 00:24:44.608 "abort": true, 00:24:44.608 "nvme_admin": false, 00:24:44.608 "nvme_io": false 00:24:44.608 }, 00:24:44.608 "memory_domains": [ 00:24:44.608 { 00:24:44.608 "dma_device_id": "system", 00:24:44.608 "dma_device_type": 1 00:24:44.608 }, 00:24:44.608 { 00:24:44.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:44.608 "dma_device_type": 2 00:24:44.608 } 00:24:44.608 ], 00:24:44.608 "driver_specific": {} 00:24:44.608 }' 00:24:44.608 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:44.608 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:44.867 12:27:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:44.867 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:45.126 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:45.126 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:45.384 [2024-06-07 12:27:08.884375] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:45.384 [2024-06-07 12:27:08.884774] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:45.384 [2024-06-07 12:27:08.884970] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.384 12:27:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:45.642 12:27:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:45.643 "name": "Existed_Raid", 00:24:45.643 "uuid": "d5f5e7f2-13fb-4f06-be66-c6087c27f65b", 00:24:45.643 "strip_size_kb": 64, 00:24:45.643 "state": "offline", 00:24:45.643 "raid_level": "raid0", 00:24:45.643 "superblock": true, 00:24:45.643 
"num_base_bdevs": 3, 00:24:45.643 "num_base_bdevs_discovered": 2, 00:24:45.643 "num_base_bdevs_operational": 2, 00:24:45.643 "base_bdevs_list": [ 00:24:45.643 { 00:24:45.643 "name": null, 00:24:45.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.643 "is_configured": false, 00:24:45.643 "data_offset": 2048, 00:24:45.643 "data_size": 63488 00:24:45.643 }, 00:24:45.643 { 00:24:45.643 "name": "BaseBdev2", 00:24:45.643 "uuid": "4a061579-d701-4737-9a37-0ec8c5f6a437", 00:24:45.643 "is_configured": true, 00:24:45.643 "data_offset": 2048, 00:24:45.643 "data_size": 63488 00:24:45.643 }, 00:24:45.643 { 00:24:45.643 "name": "BaseBdev3", 00:24:45.643 "uuid": "1f90fcc5-74ba-4340-9a8e-f28c89cd6331", 00:24:45.643 "is_configured": true, 00:24:45.643 "data_offset": 2048, 00:24:45.643 "data_size": 63488 00:24:45.643 } 00:24:45.643 ] 00:24:45.643 }' 00:24:45.643 12:27:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:45.643 12:27:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:46.578 12:27:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:24:46.578 12:27:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:46.578 12:27:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.578 12:27:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:46.578 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:46.578 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:46.578 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:46.836 [2024-06-07 12:27:10.476533] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:47.094 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:47.094 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:47.094 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.094 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:47.351 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:47.351 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:47.351 12:27:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:24:47.609 [2024-06-07 12:27:11.072490] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:24:47.609 [2024-06-07 12:27:11.072882] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:24:47.609 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:47.609 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:47.609 
12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.609 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:48.008 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:48.008 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:48.008 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:24:48.008 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:24:48.008 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:48.008 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:24:48.305 BaseBdev2 00:24:48.305 12:27:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:24:48.305 12:27:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:24:48.305 12:27:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:48.305 12:27:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:24:48.305 12:27:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:48.305 12:27:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:48.305 12:27:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:48.563 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:48.826 [ 00:24:48.826 { 00:24:48.826 "name": "BaseBdev2", 00:24:48.826 "aliases": [ 00:24:48.826 "dc2b923a-3a4a-4569-8a51-d58eb40dadac" 00:24:48.826 ], 00:24:48.826 "product_name": "Malloc disk", 00:24:48.826 "block_size": 512, 00:24:48.826 "num_blocks": 65536, 00:24:48.826 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:48.826 "assigned_rate_limits": { 00:24:48.826 "rw_ios_per_sec": 0, 00:24:48.826 "rw_mbytes_per_sec": 0, 00:24:48.826 "r_mbytes_per_sec": 0, 00:24:48.826 "w_mbytes_per_sec": 0 00:24:48.826 }, 00:24:48.826 "claimed": false, 00:24:48.826 "zoned": false, 00:24:48.826 "supported_io_types": { 00:24:48.826 "read": true, 00:24:48.826 "write": true, 00:24:48.826 "unmap": true, 00:24:48.826 "write_zeroes": true, 00:24:48.826 "flush": true, 00:24:48.826 "reset": true, 00:24:48.826 "compare": false, 00:24:48.826 "compare_and_write": false, 00:24:48.826 "abort": true, 00:24:48.826 "nvme_admin": false, 00:24:48.826 "nvme_io": false 00:24:48.826 }, 00:24:48.826 "memory_domains": [ 00:24:48.826 { 00:24:48.826 "dma_device_id": "system", 00:24:48.826 "dma_device_type": 1 00:24:48.826 }, 00:24:48.826 { 00:24:48.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.826 "dma_device_type": 2 00:24:48.826 } 00:24:48.826 ], 00:24:48.826 "driver_specific": {} 00:24:48.826 } 00:24:48.826 ] 00:24:48.826 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 
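Each BaseBdevN above is gated by waitforbdev from common/autotest_common.sh: create the malloc bdev, drain the examine queue, then query it with a timeout. A rough shell equivalent built from the same three RPCs the trace issues (the 2000 ms timeout matches the bdev_timeout default visible above):

  rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # 32 MiB malloc bdev with 512-byte blocks -> 65536 blocks, as in the JSON dump.
  $rpc bdev_malloc_create 32 512 -b BaseBdev2

  # Block until every registered bdev examine callback has completed.
  $rpc bdev_wait_for_examine

  # Poll for the bdev; this fails if it does not appear within 2000 ms.
  $rpc bdev_get_bdevs -b BaseBdev2 -t 2000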
00:24:48.826 12:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:48.826 12:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:48.826 12:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:24:49.396 BaseBdev3 00:24:49.396 12:27:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:24:49.396 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:24:49.396 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:49.396 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:24:49.396 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:49.396 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:49.396 12:27:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:49.653 12:27:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:24:49.910 [ 00:24:49.910 { 00:24:49.910 "name": "BaseBdev3", 00:24:49.910 "aliases": [ 00:24:49.910 "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7" 00:24:49.910 ], 00:24:49.910 "product_name": "Malloc disk", 00:24:49.910 "block_size": 512, 00:24:49.910 "num_blocks": 65536, 00:24:49.910 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:49.910 "assigned_rate_limits": { 00:24:49.910 "rw_ios_per_sec": 0, 00:24:49.910 "rw_mbytes_per_sec": 0, 00:24:49.910 "r_mbytes_per_sec": 0, 00:24:49.910 "w_mbytes_per_sec": 0 00:24:49.910 }, 00:24:49.910 "claimed": false, 00:24:49.910 "zoned": false, 00:24:49.910 "supported_io_types": { 00:24:49.910 "read": true, 00:24:49.910 "write": true, 00:24:49.910 "unmap": true, 00:24:49.910 "write_zeroes": true, 00:24:49.910 "flush": true, 00:24:49.910 "reset": true, 00:24:49.910 "compare": false, 00:24:49.910 "compare_and_write": false, 00:24:49.910 "abort": true, 00:24:49.910 "nvme_admin": false, 00:24:49.910 "nvme_io": false 00:24:49.910 }, 00:24:49.910 "memory_domains": [ 00:24:49.910 { 00:24:49.910 "dma_device_id": "system", 00:24:49.910 "dma_device_type": 1 00:24:49.910 }, 00:24:49.910 { 00:24:49.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:49.910 "dma_device_type": 2 00:24:49.910 } 00:24:49.910 ], 00:24:49.910 "driver_specific": {} 00:24:49.910 } 00:24:49.910 ] 00:24:49.910 12:27:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:24:49.910 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:49.910 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:49.910 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:24:50.167 [2024-06-07 12:27:13.744819] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:50.167 
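This time the array is created before BaseBdev1 exists: rpc_bdev_raid_create just records the missing name, claims BaseBdev2 and BaseBdev3, and parks the raid in "configuring". The verification that follows boils down to one query plus the same jq filter used throughout the trace; a minimal sketch (the variable name is mine, everything else is taken from the log):

  rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Dump all raid bdevs and keep only the array under test.
  info=$($rpc bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")')

  # Two of three base bdevs are discovered, so the state must be "configuring".
  [[ $(jq -r .state <<< "$info") == configuring ]]
  [[ $(jq -r .num_base_bdevs_discovered <<< "$info") -eq 2 ]]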
[2024-06-07 12:27:13.744971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:50.167 [2024-06-07 12:27:13.745014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:50.167 [2024-06-07 12:27:13.747169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.167 12:27:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:50.772 12:27:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.772 "name": "Existed_Raid", 00:24:50.772 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:24:50.772 "strip_size_kb": 64, 00:24:50.772 "state": "configuring", 00:24:50.772 "raid_level": "raid0", 00:24:50.772 "superblock": true, 00:24:50.772 "num_base_bdevs": 3, 00:24:50.772 "num_base_bdevs_discovered": 2, 00:24:50.772 "num_base_bdevs_operational": 3, 00:24:50.772 "base_bdevs_list": [ 00:24:50.772 { 00:24:50.772 "name": "BaseBdev1", 00:24:50.772 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.772 "is_configured": false, 00:24:50.772 "data_offset": 0, 00:24:50.772 "data_size": 0 00:24:50.772 }, 00:24:50.772 { 00:24:50.772 "name": "BaseBdev2", 00:24:50.772 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:50.772 "is_configured": true, 00:24:50.772 "data_offset": 2048, 00:24:50.772 "data_size": 63488 00:24:50.772 }, 00:24:50.772 { 00:24:50.772 "name": "BaseBdev3", 00:24:50.772 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:50.772 "is_configured": true, 00:24:50.772 "data_offset": 2048, 00:24:50.772 "data_size": 63488 00:24:50.772 } 00:24:50.772 ] 00:24:50.772 }' 00:24:50.772 12:27:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.772 12:27:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:51.339 12:27:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:51.598 [2024-06-07 12:27:15.012997] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.598 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:51.856 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:51.856 "name": "Existed_Raid", 00:24:51.856 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:24:51.856 "strip_size_kb": 64, 00:24:51.856 "state": "configuring", 00:24:51.856 "raid_level": "raid0", 00:24:51.856 "superblock": true, 00:24:51.856 "num_base_bdevs": 3, 00:24:51.856 "num_base_bdevs_discovered": 1, 00:24:51.856 "num_base_bdevs_operational": 3, 00:24:51.856 "base_bdevs_list": [ 00:24:51.856 { 00:24:51.856 "name": "BaseBdev1", 00:24:51.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.856 "is_configured": false, 00:24:51.856 "data_offset": 0, 00:24:51.856 "data_size": 0 00:24:51.856 }, 00:24:51.856 { 00:24:51.856 "name": null, 00:24:51.856 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:51.856 "is_configured": false, 00:24:51.856 "data_offset": 2048, 00:24:51.856 "data_size": 63488 00:24:51.856 }, 00:24:51.856 { 00:24:51.856 "name": "BaseBdev3", 00:24:51.856 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:51.856 "is_configured": true, 00:24:51.856 "data_offset": 2048, 00:24:51.856 "data_size": 63488 00:24:51.856 } 00:24:51.856 ] 00:24:51.856 }' 00:24:51.856 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.856 12:27:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:52.423 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.423 12:27:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:24:52.680 12:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:24:52.681 12:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:24:52.940 [2024-06-07 12:27:16.495349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:52.940 BaseBdev1 00:24:52.940 12:27:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:24:52.940 12:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:24:52.940 12:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:52.940 12:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:24:52.940 12:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:52.940 12:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:52.940 12:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:53.197 12:27:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:53.455 [ 00:24:53.455 { 00:24:53.455 "name": "BaseBdev1", 00:24:53.455 "aliases": [ 00:24:53.455 "aecd3eb3-4b82-4ead-af38-2985c156f2ab" 00:24:53.455 ], 00:24:53.455 "product_name": "Malloc disk", 00:24:53.455 "block_size": 512, 00:24:53.455 "num_blocks": 65536, 00:24:53.455 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:24:53.455 "assigned_rate_limits": { 00:24:53.455 "rw_ios_per_sec": 0, 00:24:53.455 "rw_mbytes_per_sec": 0, 00:24:53.455 "r_mbytes_per_sec": 0, 00:24:53.455 "w_mbytes_per_sec": 0 00:24:53.455 }, 00:24:53.455 "claimed": true, 00:24:53.455 "claim_type": "exclusive_write", 00:24:53.455 "zoned": false, 00:24:53.455 "supported_io_types": { 00:24:53.455 "read": true, 00:24:53.455 "write": true, 00:24:53.455 "unmap": true, 00:24:53.455 "write_zeroes": true, 00:24:53.455 "flush": true, 00:24:53.455 "reset": true, 00:24:53.455 "compare": false, 00:24:53.455 "compare_and_write": false, 00:24:53.455 "abort": true, 00:24:53.455 "nvme_admin": false, 00:24:53.455 "nvme_io": false 00:24:53.455 }, 00:24:53.455 "memory_domains": [ 00:24:53.455 { 00:24:53.455 "dma_device_id": "system", 00:24:53.455 "dma_device_type": 1 00:24:53.455 }, 00:24:53.455 { 00:24:53.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:53.455 "dma_device_type": 2 00:24:53.455 } 00:24:53.455 ], 00:24:53.455 "driver_specific": {} 00:24:53.455 } 00:24:53.455 ] 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
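The next steps in the trace detach and re-attach a base bdev on the still-configuring array, watching the matching is_configured flag in base_bdevs_list flip. Reduced to the raw RPCs (the commands and jq paths are the ones the script itself runs):

  rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Detach BaseBdev3: its slot stays in base_bdevs_list with a null name,
  # is_configured flips to false, and num_base_bdevs_discovered drops to 1.
  $rpc bdev_raid_remove_base_bdev BaseBdev3
  $rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # false

  # Re-attach it: the bdev is claimed again and the flag flips back to true.
  $rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  $rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # true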
00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.455 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:53.714 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:53.714 "name": "Existed_Raid", 00:24:53.714 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:24:53.714 "strip_size_kb": 64, 00:24:53.714 "state": "configuring", 00:24:53.714 "raid_level": "raid0", 00:24:53.714 "superblock": true, 00:24:53.714 "num_base_bdevs": 3, 00:24:53.714 "num_base_bdevs_discovered": 2, 00:24:53.714 "num_base_bdevs_operational": 3, 00:24:53.714 "base_bdevs_list": [ 00:24:53.714 { 00:24:53.714 "name": "BaseBdev1", 00:24:53.714 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:24:53.714 "is_configured": true, 00:24:53.714 "data_offset": 2048, 00:24:53.714 "data_size": 63488 00:24:53.714 }, 00:24:53.714 { 00:24:53.714 "name": null, 00:24:53.714 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:53.714 "is_configured": false, 00:24:53.714 "data_offset": 2048, 00:24:53.714 "data_size": 63488 00:24:53.714 }, 00:24:53.714 { 00:24:53.714 "name": "BaseBdev3", 00:24:53.714 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:53.714 "is_configured": true, 00:24:53.714 "data_offset": 2048, 00:24:53.714 "data_size": 63488 00:24:53.714 } 00:24:53.714 ] 00:24:53.714 }' 00:24:53.714 12:27:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:53.714 12:27:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:54.647 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.647 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:24:55.003 [2024-06-07 12:27:18.599850] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:55.003 
12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.003 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:55.263 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:55.263 "name": "Existed_Raid", 00:24:55.263 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:24:55.263 "strip_size_kb": 64, 00:24:55.263 "state": "configuring", 00:24:55.263 "raid_level": "raid0", 00:24:55.263 "superblock": true, 00:24:55.263 "num_base_bdevs": 3, 00:24:55.263 "num_base_bdevs_discovered": 1, 00:24:55.263 "num_base_bdevs_operational": 3, 00:24:55.263 "base_bdevs_list": [ 00:24:55.263 { 00:24:55.263 "name": "BaseBdev1", 00:24:55.263 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:24:55.263 "is_configured": true, 00:24:55.263 "data_offset": 2048, 00:24:55.263 "data_size": 63488 00:24:55.263 }, 00:24:55.263 { 00:24:55.263 "name": null, 00:24:55.263 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:55.263 "is_configured": false, 00:24:55.263 "data_offset": 2048, 00:24:55.263 "data_size": 63488 00:24:55.263 }, 00:24:55.263 { 00:24:55.263 "name": null, 00:24:55.263 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:55.263 "is_configured": false, 00:24:55.263 "data_offset": 2048, 00:24:55.263 "data_size": 63488 00:24:55.263 } 00:24:55.263 ] 00:24:55.263 }' 00:24:55.263 12:27:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:55.263 12:27:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:55.831 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.831 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:24:56.089 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:24:56.089 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:24:56.347 [2024-06-07 12:27:19.960086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.347 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.348 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:56.348 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.348 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:56.348 12:27:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.606 12:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:56.606 "name": "Existed_Raid", 00:24:56.606 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:24:56.606 "strip_size_kb": 64, 00:24:56.606 "state": "configuring", 00:24:56.606 "raid_level": "raid0", 00:24:56.606 "superblock": true, 00:24:56.606 "num_base_bdevs": 3, 00:24:56.606 "num_base_bdevs_discovered": 2, 00:24:56.606 "num_base_bdevs_operational": 3, 00:24:56.606 "base_bdevs_list": [ 00:24:56.606 { 00:24:56.606 "name": "BaseBdev1", 00:24:56.606 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:24:56.606 "is_configured": true, 00:24:56.606 "data_offset": 2048, 00:24:56.606 "data_size": 63488 00:24:56.606 }, 00:24:56.606 { 00:24:56.606 "name": null, 00:24:56.606 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:56.606 "is_configured": false, 00:24:56.606 "data_offset": 2048, 00:24:56.606 "data_size": 63488 00:24:56.606 }, 00:24:56.606 { 00:24:56.606 "name": "BaseBdev3", 00:24:56.606 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:56.606 "is_configured": true, 00:24:56.606 "data_offset": 2048, 00:24:56.606 "data_size": 63488 00:24:56.606 } 00:24:56.606 ] 00:24:56.606 }' 00:24:56.606 12:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:56.606 12:27:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:57.541 12:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:24:57.541 12:27:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.541 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:24:57.541 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:57.799 [2024-06-07 12:27:21.296209] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:57.799 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.056 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:58.056 "name": "Existed_Raid", 00:24:58.056 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:24:58.056 "strip_size_kb": 64, 00:24:58.056 "state": "configuring", 00:24:58.056 "raid_level": "raid0", 00:24:58.056 "superblock": true, 00:24:58.056 "num_base_bdevs": 3, 00:24:58.056 "num_base_bdevs_discovered": 1, 00:24:58.056 "num_base_bdevs_operational": 3, 00:24:58.056 "base_bdevs_list": [ 00:24:58.056 { 00:24:58.056 "name": null, 00:24:58.056 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:24:58.056 "is_configured": false, 00:24:58.056 "data_offset": 2048, 00:24:58.057 "data_size": 63488 00:24:58.057 }, 00:24:58.057 { 00:24:58.057 "name": null, 00:24:58.057 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:58.057 "is_configured": false, 00:24:58.057 "data_offset": 2048, 00:24:58.057 "data_size": 63488 00:24:58.057 }, 00:24:58.057 { 00:24:58.057 "name": "BaseBdev3", 00:24:58.057 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:58.057 "is_configured": true, 00:24:58.057 "data_offset": 2048, 00:24:58.057 "data_size": 63488 00:24:58.057 } 00:24:58.057 ] 00:24:58.057 }' 00:24:58.057 12:27:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.057 12:27:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:58.623 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.623 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:24:58.881 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:24:58.881 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:24:59.138 [2024-06-07 12:27:22.677129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.139 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:59.397 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:59.397 "name": "Existed_Raid", 00:24:59.397 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:24:59.397 "strip_size_kb": 64, 00:24:59.397 "state": "configuring", 00:24:59.397 "raid_level": "raid0", 00:24:59.397 "superblock": true, 00:24:59.397 "num_base_bdevs": 3, 00:24:59.397 "num_base_bdevs_discovered": 2, 00:24:59.397 "num_base_bdevs_operational": 3, 00:24:59.397 "base_bdevs_list": [ 00:24:59.397 { 00:24:59.397 "name": null, 00:24:59.397 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:24:59.397 "is_configured": false, 00:24:59.397 "data_offset": 2048, 00:24:59.397 "data_size": 63488 00:24:59.397 }, 00:24:59.397 { 00:24:59.397 "name": "BaseBdev2", 00:24:59.397 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:24:59.397 "is_configured": true, 00:24:59.397 "data_offset": 2048, 00:24:59.397 "data_size": 63488 00:24:59.397 }, 00:24:59.397 { 00:24:59.397 "name": "BaseBdev3", 00:24:59.397 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:24:59.397 "is_configured": true, 00:24:59.397 "data_offset": 2048, 00:24:59.397 "data_size": 63488 00:24:59.397 } 00:24:59.397 ] 00:24:59.397 }' 00:24:59.397 12:27:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:59.397 12:27:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:00.041 12:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:25:00.041 12:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.300 12:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:25:00.300 12:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.300 12:27:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:25:00.560 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u aecd3eb3-4b82-4ead-af38-2985c156f2ab 00:25:00.817 [2024-06-07 12:27:24.358988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:25:00.817 [2024-06-07 12:27:24.359480] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:25:00.817 [2024-06-07 12:27:24.359604] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:25:00.817 [2024-06-07 12:27:24.359732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:25:00.817 [2024-06-07 12:27:24.360102] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:25:00.817 [2024-06-07 12:27:24.360147] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000007880 00:25:00.817 [2024-06-07 12:27:24.360364] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:00.817 NewBaseBdev 00:25:00.817 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:25:00.817 12:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:25:00.817 12:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:00.817 12:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:25:00.817 12:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:00.817 12:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:00.817 12:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:01.075 12:27:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:25:01.333 [ 00:25:01.333 { 00:25:01.333 "name": "NewBaseBdev", 00:25:01.333 "aliases": [ 00:25:01.333 "aecd3eb3-4b82-4ead-af38-2985c156f2ab" 00:25:01.333 ], 00:25:01.333 "product_name": "Malloc disk", 00:25:01.333 "block_size": 512, 00:25:01.333 "num_blocks": 65536, 00:25:01.333 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:25:01.333 "assigned_rate_limits": { 00:25:01.333 "rw_ios_per_sec": 0, 00:25:01.333 "rw_mbytes_per_sec": 0, 00:25:01.333 "r_mbytes_per_sec": 0, 00:25:01.333 "w_mbytes_per_sec": 0 00:25:01.333 }, 00:25:01.333 "claimed": true, 00:25:01.333 "claim_type": "exclusive_write", 00:25:01.333 "zoned": false, 00:25:01.333 "supported_io_types": { 00:25:01.333 "read": true, 00:25:01.333 "write": true, 00:25:01.333 "unmap": true, 00:25:01.333 "write_zeroes": true, 00:25:01.333 "flush": true, 00:25:01.333 "reset": true, 00:25:01.333 "compare": false, 00:25:01.333 "compare_and_write": false, 00:25:01.333 "abort": true, 00:25:01.333 "nvme_admin": false, 00:25:01.333 "nvme_io": false 00:25:01.333 }, 00:25:01.333 "memory_domains": [ 00:25:01.333 { 00:25:01.333 "dma_device_id": "system", 00:25:01.333 "dma_device_type": 1 00:25:01.333 }, 00:25:01.333 { 00:25:01.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:01.333 "dma_device_type": 2 00:25:01.333 } 00:25:01.333 ], 00:25:01.333 "driver_specific": {} 00:25:01.333 } 00:25:01.333 ] 00:25:01.333 12:27:24 
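The trace above is the recovery step: the member deleted earlier is recreated under a new name (NewBaseBdev) but with its original UUID, so the raid module can match it by UUID against the superblock, reclaim it, and bring Existed_Raid from configuring to online. A minimal sketch of the same RPC sequence, using the socket, size, and UUID from this run (the waitforbdev helper in the trace boils down to the bdev_get_bdevs poll at the end):

    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Recreate the missing member with the UUID recorded in the superblock;
    # the raid module claims it automatically once examine completes.
    $RPC bdev_malloc_create 32 512 -b NewBaseBdev -u aecd3eb3-4b82-4ead-af38-2985c156f2ab
    $RPC bdev_wait_for_examine
    # Poll (2000 ms timeout) until the new bdev is registered and visible.
    $RPC bdev_get_bdevs -b NewBaseBdev -t 2000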
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:25:01.333 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:25:01.333 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:01.333 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.334 12:27:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:01.899 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:01.899 "name": "Existed_Raid", 00:25:01.899 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:25:01.899 "strip_size_kb": 64, 00:25:01.899 "state": "online", 00:25:01.899 "raid_level": "raid0", 00:25:01.899 "superblock": true, 00:25:01.899 "num_base_bdevs": 3, 00:25:01.899 "num_base_bdevs_discovered": 3, 00:25:01.899 "num_base_bdevs_operational": 3, 00:25:01.899 "base_bdevs_list": [ 00:25:01.899 { 00:25:01.899 "name": "NewBaseBdev", 00:25:01.899 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:25:01.899 "is_configured": true, 00:25:01.899 "data_offset": 2048, 00:25:01.899 "data_size": 63488 00:25:01.899 }, 00:25:01.899 { 00:25:01.899 "name": "BaseBdev2", 00:25:01.899 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:25:01.899 "is_configured": true, 00:25:01.899 "data_offset": 2048, 00:25:01.899 "data_size": 63488 00:25:01.899 }, 00:25:01.899 { 00:25:01.899 "name": "BaseBdev3", 00:25:01.899 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:25:01.899 "is_configured": true, 00:25:01.899 "data_offset": 2048, 00:25:01.899 "data_size": 63488 00:25:01.899 } 00:25:01.899 ] 00:25:01.899 }' 00:25:01.899 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:01.899 12:27:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:02.465 12:27:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:02.722 [2024-06-07 12:27:26.119708] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:02.722 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:02.722 "name": "Existed_Raid", 00:25:02.722 "aliases": [ 00:25:02.722 "39ab65b7-fcda-430d-8c42-81b579c9430c" 00:25:02.722 ], 00:25:02.722 "product_name": "Raid Volume", 00:25:02.722 "block_size": 512, 00:25:02.722 "num_blocks": 190464, 00:25:02.722 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:25:02.722 "assigned_rate_limits": { 00:25:02.722 "rw_ios_per_sec": 0, 00:25:02.722 "rw_mbytes_per_sec": 0, 00:25:02.722 "r_mbytes_per_sec": 0, 00:25:02.722 "w_mbytes_per_sec": 0 00:25:02.722 }, 00:25:02.722 "claimed": false, 00:25:02.722 "zoned": false, 00:25:02.722 "supported_io_types": { 00:25:02.723 "read": true, 00:25:02.723 "write": true, 00:25:02.723 "unmap": true, 00:25:02.723 "write_zeroes": true, 00:25:02.723 "flush": true, 00:25:02.723 "reset": true, 00:25:02.723 "compare": false, 00:25:02.723 "compare_and_write": false, 00:25:02.723 "abort": false, 00:25:02.723 "nvme_admin": false, 00:25:02.723 "nvme_io": false 00:25:02.723 }, 00:25:02.723 "memory_domains": [ 00:25:02.723 { 00:25:02.723 "dma_device_id": "system", 00:25:02.723 "dma_device_type": 1 00:25:02.723 }, 00:25:02.723 { 00:25:02.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:02.723 "dma_device_type": 2 00:25:02.723 }, 00:25:02.723 { 00:25:02.723 "dma_device_id": "system", 00:25:02.723 "dma_device_type": 1 00:25:02.723 }, 00:25:02.723 { 00:25:02.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:02.723 "dma_device_type": 2 00:25:02.723 }, 00:25:02.723 { 00:25:02.723 "dma_device_id": "system", 00:25:02.723 "dma_device_type": 1 00:25:02.723 }, 00:25:02.723 { 00:25:02.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:02.723 "dma_device_type": 2 00:25:02.723 } 00:25:02.723 ], 00:25:02.723 "driver_specific": { 00:25:02.723 "raid": { 00:25:02.723 "uuid": "39ab65b7-fcda-430d-8c42-81b579c9430c", 00:25:02.723 "strip_size_kb": 64, 00:25:02.723 "state": "online", 00:25:02.723 "raid_level": "raid0", 00:25:02.723 "superblock": true, 00:25:02.723 "num_base_bdevs": 3, 00:25:02.723 "num_base_bdevs_discovered": 3, 00:25:02.723 "num_base_bdevs_operational": 3, 00:25:02.723 "base_bdevs_list": [ 00:25:02.723 { 00:25:02.723 "name": "NewBaseBdev", 00:25:02.723 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:25:02.723 "is_configured": true, 00:25:02.723 "data_offset": 2048, 00:25:02.723 "data_size": 63488 00:25:02.723 }, 00:25:02.723 { 00:25:02.723 "name": "BaseBdev2", 00:25:02.723 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:25:02.723 "is_configured": true, 00:25:02.723 "data_offset": 2048, 00:25:02.723 "data_size": 63488 00:25:02.723 }, 00:25:02.723 { 00:25:02.723 "name": "BaseBdev3", 00:25:02.723 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:25:02.723 "is_configured": true, 00:25:02.723 "data_offset": 2048, 00:25:02.723 "data_size": 63488 00:25:02.723 } 00:25:02.723 ] 00:25:02.723 } 00:25:02.723 } 00:25:02.723 }' 00:25:02.723 12:27:26 
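The Raid Volume dump above also makes the capacity arithmetic easy to verify: each 65536-block member gives up 2048 blocks to the superblock (the data_offset), leaving 63488 data blocks, and raid0 across three members therefore exposes 3 * 63488 = 190464 blocks, exactly the num_blocks reported for Existed_Raid. As a one-liner:

    echo $(( 3 * (65536 - 2048) ))   # 190464, matches num_blocks above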
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:02.723 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:25:02.723 BaseBdev2 00:25:02.723 BaseBdev3' 00:25:02.723 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:02.723 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:25:02.723 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:02.980 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:02.980 "name": "NewBaseBdev", 00:25:02.980 "aliases": [ 00:25:02.980 "aecd3eb3-4b82-4ead-af38-2985c156f2ab" 00:25:02.980 ], 00:25:02.980 "product_name": "Malloc disk", 00:25:02.980 "block_size": 512, 00:25:02.980 "num_blocks": 65536, 00:25:02.980 "uuid": "aecd3eb3-4b82-4ead-af38-2985c156f2ab", 00:25:02.980 "assigned_rate_limits": { 00:25:02.980 "rw_ios_per_sec": 0, 00:25:02.980 "rw_mbytes_per_sec": 0, 00:25:02.980 "r_mbytes_per_sec": 0, 00:25:02.980 "w_mbytes_per_sec": 0 00:25:02.980 }, 00:25:02.980 "claimed": true, 00:25:02.980 "claim_type": "exclusive_write", 00:25:02.980 "zoned": false, 00:25:02.980 "supported_io_types": { 00:25:02.980 "read": true, 00:25:02.980 "write": true, 00:25:02.980 "unmap": true, 00:25:02.980 "write_zeroes": true, 00:25:02.980 "flush": true, 00:25:02.980 "reset": true, 00:25:02.980 "compare": false, 00:25:02.980 "compare_and_write": false, 00:25:02.980 "abort": true, 00:25:02.980 "nvme_admin": false, 00:25:02.980 "nvme_io": false 00:25:02.980 }, 00:25:02.980 "memory_domains": [ 00:25:02.980 { 00:25:02.980 "dma_device_id": "system", 00:25:02.980 "dma_device_type": 1 00:25:02.980 }, 00:25:02.980 { 00:25:02.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:02.980 "dma_device_type": 2 00:25:02.980 } 00:25:02.980 ], 00:25:02.980 "driver_specific": {} 00:25:02.980 }' 00:25:02.980 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:02.980 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:02.980 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:02.980 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:02.980 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:02.980 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:02.981 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 
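The per-member loop that begins here repeats four probes (block_size, md_size, md_interleave, dif_type) for every configured base bdev. Condensed into a standalone sketch, using the same jq filters as the trace (prefixed with .[] because bdev_get_bdevs returns an array):

    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    names=$($RPC bdev_get_bdevs -b Existed_Raid \
            | jq -r '.[].driver_specific.raid.base_bdevs_list[]
                     | select(.is_configured == true).name')
    for name in $names; do
        info=$($RPC bdev_get_bdevs -b "$name" | jq '.[]')
        [[ $(jq .block_size    <<< "$info") == 512  ]]   # raw 512 B blocks
        [[ $(jq .md_size       <<< "$info") == null ]]   # no separate metadata
        [[ $(jq .md_interleave <<< "$info") == null ]]   # none interleaved either
        [[ $(jq .dif_type      <<< "$info") == null ]]   # no DIF protection
    done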
00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:03.238 12:27:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:03.496 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:03.496 "name": "BaseBdev2", 00:25:03.496 "aliases": [ 00:25:03.496 "dc2b923a-3a4a-4569-8a51-d58eb40dadac" 00:25:03.496 ], 00:25:03.496 "product_name": "Malloc disk", 00:25:03.496 "block_size": 512, 00:25:03.496 "num_blocks": 65536, 00:25:03.496 "uuid": "dc2b923a-3a4a-4569-8a51-d58eb40dadac", 00:25:03.496 "assigned_rate_limits": { 00:25:03.496 "rw_ios_per_sec": 0, 00:25:03.496 "rw_mbytes_per_sec": 0, 00:25:03.496 "r_mbytes_per_sec": 0, 00:25:03.496 "w_mbytes_per_sec": 0 00:25:03.496 }, 00:25:03.496 "claimed": true, 00:25:03.496 "claim_type": "exclusive_write", 00:25:03.496 "zoned": false, 00:25:03.496 "supported_io_types": { 00:25:03.496 "read": true, 00:25:03.496 "write": true, 00:25:03.496 "unmap": true, 00:25:03.496 "write_zeroes": true, 00:25:03.496 "flush": true, 00:25:03.496 "reset": true, 00:25:03.496 "compare": false, 00:25:03.496 "compare_and_write": false, 00:25:03.496 "abort": true, 00:25:03.496 "nvme_admin": false, 00:25:03.496 "nvme_io": false 00:25:03.496 }, 00:25:03.496 "memory_domains": [ 00:25:03.496 { 00:25:03.496 "dma_device_id": "system", 00:25:03.496 "dma_device_type": 1 00:25:03.496 }, 00:25:03.496 { 00:25:03.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:03.496 "dma_device_type": 2 00:25:03.496 } 00:25:03.496 ], 00:25:03.496 "driver_specific": {} 00:25:03.496 }' 00:25:03.496 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.496 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:03.496 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:03.496 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:25:03.754 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:04.321 "name": "BaseBdev3", 00:25:04.321 "aliases": 
[ 00:25:04.321 "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7" 00:25:04.321 ], 00:25:04.321 "product_name": "Malloc disk", 00:25:04.321 "block_size": 512, 00:25:04.321 "num_blocks": 65536, 00:25:04.321 "uuid": "38ab3aa7-6ca3-4623-9ff7-3f6624ee5ea7", 00:25:04.321 "assigned_rate_limits": { 00:25:04.321 "rw_ios_per_sec": 0, 00:25:04.321 "rw_mbytes_per_sec": 0, 00:25:04.321 "r_mbytes_per_sec": 0, 00:25:04.321 "w_mbytes_per_sec": 0 00:25:04.321 }, 00:25:04.321 "claimed": true, 00:25:04.321 "claim_type": "exclusive_write", 00:25:04.321 "zoned": false, 00:25:04.321 "supported_io_types": { 00:25:04.321 "read": true, 00:25:04.321 "write": true, 00:25:04.321 "unmap": true, 00:25:04.321 "write_zeroes": true, 00:25:04.321 "flush": true, 00:25:04.321 "reset": true, 00:25:04.321 "compare": false, 00:25:04.321 "compare_and_write": false, 00:25:04.321 "abort": true, 00:25:04.321 "nvme_admin": false, 00:25:04.321 "nvme_io": false 00:25:04.321 }, 00:25:04.321 "memory_domains": [ 00:25:04.321 { 00:25:04.321 "dma_device_id": "system", 00:25:04.321 "dma_device_type": 1 00:25:04.321 }, 00:25:04.321 { 00:25:04.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:04.321 "dma_device_type": 2 00:25:04.321 } 00:25:04.321 ], 00:25:04.321 "driver_specific": {} 00:25:04.321 }' 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:04.321 12:27:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:04.579 12:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:04.579 12:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:04.579 12:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:04.579 12:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:04.579 12:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:04.838 [2024-06-07 12:27:28.304144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:04.838 [2024-06-07 12:27:28.304448] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:04.838 [2024-06-07 12:27:28.304630] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:04.838 [2024-06-07 12:27:28.304759] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:04.838 [2024-06-07 12:27:28.304837] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name Existed_Raid, state offline 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 203668 00:25:04.838 12:27:28 
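Teardown mirrors setup in reverse: bdev_raid_delete takes Existed_Raid from online to offline, unregisters its io device, and releases the base bdevs (the *DEBUG* lines above), and only afterwards is the test app killed. A condensed sketch, assuming the bdev_svc pid was captured at launch (203668 in this run):

    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_raid_delete Existed_Raid    # online -> offline, members released
    kill "$raid_pid"                      # stop the bdev_svc test app
    wait "$raid_pid" 2>/dev/null || true  # reap it; ignore the kill status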
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 203668 ']' 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 203668 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 203668 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 203668' 00:25:04.838 killing process with pid 203668 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 203668 00:25:04.838 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 203668 00:25:04.838 [2024-06-07 12:27:28.358862] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:04.838 [2024-06-07 12:27:28.425027] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:05.421 12:27:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:25:05.421 00:25:05.421 real 0m32.830s 00:25:05.421 user 1m0.433s 00:25:05.421 sys 0m5.077s 00:25:05.421 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:05.421 12:27:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:05.421 ************************************ 00:25:05.421 END TEST raid_state_function_test_sb 00:25:05.421 ************************************ 00:25:05.421 12:27:28 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:25:05.421 12:27:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:25:05.422 12:27:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:05.422 12:27:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:05.422 ************************************ 00:25:05.422 START TEST raid_superblock_test 00:25:05.422 ************************************ 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 3 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local 
raid_bdev_name=raid_bdev1 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=204676 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 204676 /var/tmp/spdk-raid.sock 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 204676 ']' 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:05.422 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:05.422 12:27:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:05.422 [2024-06-07 12:27:28.894036] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
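raid_superblock_test drives a dedicated bdev_svc instance over its own UNIX-domain RPC socket, with raid-level debug logging enabled; every rpc.py call that follows targets that socket. Launch sketch, using the exact command line from the trace:

    /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -L bdev_raid &
    raid_pid=$!
    # The trace then blocks in waitforlisten until the app answers on the
    # socket before any bdev RPCs are issued.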
00:25:05.422 [2024-06-07 12:27:28.894809] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid204676 ] 00:25:05.422 [2024-06-07 12:27:29.031502] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:05.680 [2024-06-07 12:27:29.126161] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:05.680 [2024-06-07 12:27:29.208381] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:06.246 12:27:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:06.247 12:27:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:25:06.814 malloc1 00:25:06.814 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:06.814 [2024-06-07 12:27:30.443606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:06.814 [2024-06-07 12:27:30.443771] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:06.814 [2024-06-07 12:27:30.443829] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:25:06.814 [2024-06-07 12:27:30.443905] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:06.814 [2024-06-07 12:27:30.446417] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:06.814 [2024-06-07 12:27:30.446494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:06.814 pt1 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:25:07.072 malloc2 00:25:07.072 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:07.330 [2024-06-07 12:27:30.939395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:07.330 [2024-06-07 12:27:30.939516] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:07.330 [2024-06-07 12:27:30.939562] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:25:07.330 [2024-06-07 12:27:30.939611] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:07.330 [2024-06-07 12:27:30.941911] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:07.330 [2024-06-07 12:27:30.941972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:07.330 pt2 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:07.330 12:27:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:25:07.896 malloc3 00:25:07.896 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:25:08.155 [2024-06-07 12:27:31.620592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:25:08.155 [2024-06-07 12:27:31.620719] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.155 [2024-06-07 12:27:31.620765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:25:08.155 [2024-06-07 12:27:31.620828] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.155 [2024-06-07 12:27:31.623211] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.155 [2024-06-07 12:27:31.623314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:25:08.155 pt3 00:25:08.155 12:27:31 
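Each of the three base-bdev slots is built identically: a 32 MiB malloc bdev with 512-byte blocks, wrapped in a passthru bdev (pt1..pt3) that carries a fixed well-known UUID so the superblock contents are predictable. Condensed into a loop, with the commands exactly as they appear in the trace:

    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3; do
        $RPC bdev_malloc_create 32 512 -b "malloc$i"          # 32 MiB, 512 B blocks
        $RPC bdev_passthru_create -b "malloc$i" -p "pt$i" \
             -u "00000000-0000-0000-0000-00000000000$i"       # deterministic UUID
    done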
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:08.155 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:08.155 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:25:08.454 [2024-06-07 12:27:31.876738] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:08.454 [2024-06-07 12:27:31.878999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:08.454 [2024-06-07 12:27:31.879070] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:25:08.454 [2024-06-07 12:27:31.879281] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:25:08.454 [2024-06-07 12:27:31.879296] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:25:08.454 [2024-06-07 12:27:31.879492] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:25:08.454 [2024-06-07 12:27:31.879856] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:25:08.454 [2024-06-07 12:27:31.879877] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007880 00:25:08.454 [2024-06-07 12:27:31.880006] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.454 12:27:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.714 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:08.714 "name": "raid_bdev1", 00:25:08.714 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:08.714 "strip_size_kb": 64, 00:25:08.714 "state": "online", 00:25:08.714 "raid_level": "raid0", 00:25:08.714 "superblock": true, 00:25:08.714 "num_base_bdevs": 3, 00:25:08.714 "num_base_bdevs_discovered": 3, 00:25:08.714 "num_base_bdevs_operational": 3, 00:25:08.714 "base_bdevs_list": [ 00:25:08.714 { 00:25:08.714 "name": "pt1", 00:25:08.714 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:08.714 
"is_configured": true, 00:25:08.714 "data_offset": 2048, 00:25:08.714 "data_size": 63488 00:25:08.714 }, 00:25:08.714 { 00:25:08.714 "name": "pt2", 00:25:08.714 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:08.714 "is_configured": true, 00:25:08.714 "data_offset": 2048, 00:25:08.714 "data_size": 63488 00:25:08.714 }, 00:25:08.714 { 00:25:08.714 "name": "pt3", 00:25:08.714 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:08.714 "is_configured": true, 00:25:08.714 "data_offset": 2048, 00:25:08.714 "data_size": 63488 00:25:08.714 } 00:25:08.714 ] 00:25:08.714 }' 00:25:08.714 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:08.714 12:27:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:09.279 12:27:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:09.538 [2024-06-07 12:27:33.056956] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:09.538 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:09.538 "name": "raid_bdev1", 00:25:09.538 "aliases": [ 00:25:09.538 "58e3e172-08fc-48e9-b0bc-be243327cc02" 00:25:09.538 ], 00:25:09.538 "product_name": "Raid Volume", 00:25:09.538 "block_size": 512, 00:25:09.538 "num_blocks": 190464, 00:25:09.538 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:09.538 "assigned_rate_limits": { 00:25:09.538 "rw_ios_per_sec": 0, 00:25:09.538 "rw_mbytes_per_sec": 0, 00:25:09.538 "r_mbytes_per_sec": 0, 00:25:09.538 "w_mbytes_per_sec": 0 00:25:09.538 }, 00:25:09.538 "claimed": false, 00:25:09.538 "zoned": false, 00:25:09.538 "supported_io_types": { 00:25:09.538 "read": true, 00:25:09.538 "write": true, 00:25:09.538 "unmap": true, 00:25:09.538 "write_zeroes": true, 00:25:09.538 "flush": true, 00:25:09.538 "reset": true, 00:25:09.538 "compare": false, 00:25:09.538 "compare_and_write": false, 00:25:09.538 "abort": false, 00:25:09.538 "nvme_admin": false, 00:25:09.538 "nvme_io": false 00:25:09.538 }, 00:25:09.538 "memory_domains": [ 00:25:09.538 { 00:25:09.538 "dma_device_id": "system", 00:25:09.538 "dma_device_type": 1 00:25:09.538 }, 00:25:09.538 { 00:25:09.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.538 "dma_device_type": 2 00:25:09.538 }, 00:25:09.538 { 00:25:09.538 "dma_device_id": "system", 00:25:09.538 "dma_device_type": 1 00:25:09.538 }, 00:25:09.538 { 00:25:09.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.538 "dma_device_type": 2 00:25:09.538 }, 00:25:09.538 { 00:25:09.538 "dma_device_id": "system", 00:25:09.538 "dma_device_type": 1 00:25:09.538 }, 00:25:09.538 { 00:25:09.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.538 "dma_device_type": 
2 00:25:09.538 } 00:25:09.538 ], 00:25:09.538 "driver_specific": { 00:25:09.538 "raid": { 00:25:09.538 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:09.538 "strip_size_kb": 64, 00:25:09.538 "state": "online", 00:25:09.538 "raid_level": "raid0", 00:25:09.538 "superblock": true, 00:25:09.538 "num_base_bdevs": 3, 00:25:09.538 "num_base_bdevs_discovered": 3, 00:25:09.538 "num_base_bdevs_operational": 3, 00:25:09.538 "base_bdevs_list": [ 00:25:09.538 { 00:25:09.538 "name": "pt1", 00:25:09.538 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:09.538 "is_configured": true, 00:25:09.538 "data_offset": 2048, 00:25:09.538 "data_size": 63488 00:25:09.538 }, 00:25:09.538 { 00:25:09.538 "name": "pt2", 00:25:09.538 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:09.538 "is_configured": true, 00:25:09.538 "data_offset": 2048, 00:25:09.538 "data_size": 63488 00:25:09.538 }, 00:25:09.538 { 00:25:09.538 "name": "pt3", 00:25:09.538 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:09.538 "is_configured": true, 00:25:09.538 "data_offset": 2048, 00:25:09.538 "data_size": 63488 00:25:09.538 } 00:25:09.538 ] 00:25:09.538 } 00:25:09.538 } 00:25:09.538 }' 00:25:09.538 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:09.538 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:09.538 pt2 00:25:09.538 pt3' 00:25:09.538 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:09.538 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:09.538 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:09.796 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:09.796 "name": "pt1", 00:25:09.796 "aliases": [ 00:25:09.796 "00000000-0000-0000-0000-000000000001" 00:25:09.796 ], 00:25:09.796 "product_name": "passthru", 00:25:09.796 "block_size": 512, 00:25:09.796 "num_blocks": 65536, 00:25:09.796 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:09.796 "assigned_rate_limits": { 00:25:09.797 "rw_ios_per_sec": 0, 00:25:09.797 "rw_mbytes_per_sec": 0, 00:25:09.797 "r_mbytes_per_sec": 0, 00:25:09.797 "w_mbytes_per_sec": 0 00:25:09.797 }, 00:25:09.797 "claimed": true, 00:25:09.797 "claim_type": "exclusive_write", 00:25:09.797 "zoned": false, 00:25:09.797 "supported_io_types": { 00:25:09.797 "read": true, 00:25:09.797 "write": true, 00:25:09.797 "unmap": true, 00:25:09.797 "write_zeroes": true, 00:25:09.797 "flush": true, 00:25:09.797 "reset": true, 00:25:09.797 "compare": false, 00:25:09.797 "compare_and_write": false, 00:25:09.797 "abort": true, 00:25:09.797 "nvme_admin": false, 00:25:09.797 "nvme_io": false 00:25:09.797 }, 00:25:09.797 "memory_domains": [ 00:25:09.797 { 00:25:09.797 "dma_device_id": "system", 00:25:09.797 "dma_device_type": 1 00:25:09.797 }, 00:25:09.797 { 00:25:09.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.797 "dma_device_type": 2 00:25:09.797 } 00:25:09.797 ], 00:25:09.797 "driver_specific": { 00:25:09.797 "passthru": { 00:25:09.797 "name": "pt1", 00:25:09.797 "base_bdev_name": "malloc1" 00:25:09.797 } 00:25:09.797 } 00:25:09.797 }' 00:25:09.797 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:09.797 12:27:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.055 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.313 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:10.313 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:10.313 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:10.313 12:27:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:10.572 "name": "pt2", 00:25:10.572 "aliases": [ 00:25:10.572 "00000000-0000-0000-0000-000000000002" 00:25:10.572 ], 00:25:10.572 "product_name": "passthru", 00:25:10.572 "block_size": 512, 00:25:10.572 "num_blocks": 65536, 00:25:10.572 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:10.572 "assigned_rate_limits": { 00:25:10.572 "rw_ios_per_sec": 0, 00:25:10.572 "rw_mbytes_per_sec": 0, 00:25:10.572 "r_mbytes_per_sec": 0, 00:25:10.572 "w_mbytes_per_sec": 0 00:25:10.572 }, 00:25:10.572 "claimed": true, 00:25:10.572 "claim_type": "exclusive_write", 00:25:10.572 "zoned": false, 00:25:10.572 "supported_io_types": { 00:25:10.572 "read": true, 00:25:10.572 "write": true, 00:25:10.572 "unmap": true, 00:25:10.572 "write_zeroes": true, 00:25:10.572 "flush": true, 00:25:10.572 "reset": true, 00:25:10.572 "compare": false, 00:25:10.572 "compare_and_write": false, 00:25:10.572 "abort": true, 00:25:10.572 "nvme_admin": false, 00:25:10.572 "nvme_io": false 00:25:10.572 }, 00:25:10.572 "memory_domains": [ 00:25:10.572 { 00:25:10.572 "dma_device_id": "system", 00:25:10.572 "dma_device_type": 1 00:25:10.572 }, 00:25:10.572 { 00:25:10.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:10.572 "dma_device_type": 2 00:25:10.572 } 00:25:10.572 ], 00:25:10.572 "driver_specific": { 00:25:10.572 "passthru": { 00:25:10.572 "name": "pt2", 00:25:10.572 "base_bdev_name": "malloc2" 00:25:10.572 } 00:25:10.572 } 00:25:10.572 }' 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:10.572 12:27:34 
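The same four-probe loop now runs against the passthru members of raid_bdev1; note in the dumps above that each pt bdev reports claimed: true with claim_type exclusive_write, i.e. the raid module holds every member exclusively. A quick probe for that (sketch, same socket):

    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_get_bdevs -b pt2 | jq -r '.[].claim_type'   # prints: exclusive_write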
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.572 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.831 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:10.831 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.831 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.831 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:10.831 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:10.831 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:10.831 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:25:11.089 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:11.089 "name": "pt3", 00:25:11.089 "aliases": [ 00:25:11.089 "00000000-0000-0000-0000-000000000003" 00:25:11.089 ], 00:25:11.089 "product_name": "passthru", 00:25:11.089 "block_size": 512, 00:25:11.089 "num_blocks": 65536, 00:25:11.089 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:11.089 "assigned_rate_limits": { 00:25:11.089 "rw_ios_per_sec": 0, 00:25:11.089 "rw_mbytes_per_sec": 0, 00:25:11.089 "r_mbytes_per_sec": 0, 00:25:11.089 "w_mbytes_per_sec": 0 00:25:11.089 }, 00:25:11.089 "claimed": true, 00:25:11.089 "claim_type": "exclusive_write", 00:25:11.089 "zoned": false, 00:25:11.089 "supported_io_types": { 00:25:11.089 "read": true, 00:25:11.089 "write": true, 00:25:11.089 "unmap": true, 00:25:11.089 "write_zeroes": true, 00:25:11.089 "flush": true, 00:25:11.089 "reset": true, 00:25:11.089 "compare": false, 00:25:11.089 "compare_and_write": false, 00:25:11.089 "abort": true, 00:25:11.089 "nvme_admin": false, 00:25:11.089 "nvme_io": false 00:25:11.089 }, 00:25:11.089 "memory_domains": [ 00:25:11.089 { 00:25:11.089 "dma_device_id": "system", 00:25:11.089 "dma_device_type": 1 00:25:11.089 }, 00:25:11.089 { 00:25:11.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:11.089 "dma_device_type": 2 00:25:11.089 } 00:25:11.089 ], 00:25:11.089 "driver_specific": { 00:25:11.089 "passthru": { 00:25:11.089 "name": "pt3", 00:25:11.089 "base_bdev_name": "malloc3" 00:25:11.089 } 00:25:11.089 } 00:25:11.089 }' 00:25:11.089 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:11.089 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:11.089 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:11.089 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:11.089 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:25:11.347 12:27:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:11.604 [2024-06-07 12:27:35.141220] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:11.604 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=58e3e172-08fc-48e9-b0bc-be243327cc02 00:25:11.604 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 58e3e172-08fc-48e9-b0bc-be243327cc02 ']' 00:25:11.604 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:11.863 [2024-06-07 12:27:35.389105] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:11.863 [2024-06-07 12:27:35.389170] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:11.863 [2024-06-07 12:27:35.389317] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:11.863 [2024-06-07 12:27:35.389379] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:11.863 [2024-06-07 12:27:35.389392] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name raid_bdev1, state offline 00:25:11.863 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.863 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:25:12.120 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:25:12.120 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:25:12.120 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:12.120 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:12.378 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:12.378 12:27:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:12.636 12:27:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:12.636 12:27:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:25:12.893 12:27:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:12.893 12:27:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:25:13.151 12:27:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:25:13.409 [2024-06-07 12:27:37.001360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:13.409 [2024-06-07 12:27:37.003599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:13.409 [2024-06-07 12:27:37.003680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:25:13.409 [2024-06-07 12:27:37.003721] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:13.409 [2024-06-07 12:27:37.003831] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:13.409 [2024-06-07 12:27:37.003862] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:25:13.409 [2024-06-07 12:27:37.003924] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:13.409 [2024-06-07 12:27:37.003937] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state configuring 00:25:13.409 request: 00:25:13.409 { 00:25:13.409 "name": "raid_bdev1", 00:25:13.409 "raid_level": "raid0", 00:25:13.409 "base_bdevs": [ 00:25:13.409 "malloc1", 00:25:13.409 "malloc2", 00:25:13.409 "malloc3" 00:25:13.409 ], 00:25:13.409 "superblock": false, 00:25:13.409 "strip_size_kb": 64, 00:25:13.409 "method": "bdev_raid_create", 00:25:13.409 "req_id": 1 00:25:13.409 } 00:25:13.409 Got JSON-RPC error response 00:25:13.409 response: 00:25:13.409 { 00:25:13.409 "code": -17, 00:25:13.409 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:13.409 } 00:25:13.409 12:27:37 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@652 -- # es=1 00:25:13.409 12:27:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:13.409 12:27:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:13.409 12:27:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:13.409 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.409 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:25:13.691 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:25:13.691 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:25:13.691 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:13.961 [2024-06-07 12:27:37.565363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:13.961 [2024-06-07 12:27:37.565484] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:13.961 [2024-06-07 12:27:37.565530] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:25:13.961 [2024-06-07 12:27:37.565557] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:13.961 [2024-06-07 12:27:37.568001] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:13.961 [2024-06-07 12:27:37.568090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:13.961 [2024-06-07 12:27:37.568217] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:13.961 [2024-06-07 12:27:37.568287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:13.961 pt1 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.961 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.223 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:25:14.223 "name": "raid_bdev1", 00:25:14.223 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:14.223 "strip_size_kb": 64, 00:25:14.223 "state": "configuring", 00:25:14.223 "raid_level": "raid0", 00:25:14.223 "superblock": true, 00:25:14.223 "num_base_bdevs": 3, 00:25:14.223 "num_base_bdevs_discovered": 1, 00:25:14.223 "num_base_bdevs_operational": 3, 00:25:14.223 "base_bdevs_list": [ 00:25:14.223 { 00:25:14.223 "name": "pt1", 00:25:14.223 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:14.223 "is_configured": true, 00:25:14.223 "data_offset": 2048, 00:25:14.223 "data_size": 63488 00:25:14.223 }, 00:25:14.223 { 00:25:14.223 "name": null, 00:25:14.223 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:14.223 "is_configured": false, 00:25:14.223 "data_offset": 2048, 00:25:14.223 "data_size": 63488 00:25:14.223 }, 00:25:14.223 { 00:25:14.223 "name": null, 00:25:14.223 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:14.223 "is_configured": false, 00:25:14.223 "data_offset": 2048, 00:25:14.223 "data_size": 63488 00:25:14.223 } 00:25:14.223 ] 00:25:14.223 }' 00:25:14.223 12:27:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.223 12:27:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:14.792 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:25:14.792 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:15.357 [2024-06-07 12:27:38.705524] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:15.357 [2024-06-07 12:27:38.705665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:15.357 [2024-06-07 12:27:38.705716] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:25:15.357 [2024-06-07 12:27:38.705746] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:15.357 [2024-06-07 12:27:38.706152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:15.357 [2024-06-07 12:27:38.706185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:15.357 [2024-06-07 12:27:38.706296] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:15.357 [2024-06-07 12:27:38.706326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:15.357 pt2 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:15.357 [2024-06-07 12:27:38.933583] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.357 12:27:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.615 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.615 "name": "raid_bdev1", 00:25:15.615 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:15.615 "strip_size_kb": 64, 00:25:15.615 "state": "configuring", 00:25:15.615 "raid_level": "raid0", 00:25:15.615 "superblock": true, 00:25:15.615 "num_base_bdevs": 3, 00:25:15.615 "num_base_bdevs_discovered": 1, 00:25:15.615 "num_base_bdevs_operational": 3, 00:25:15.615 "base_bdevs_list": [ 00:25:15.615 { 00:25:15.615 "name": "pt1", 00:25:15.615 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:15.615 "is_configured": true, 00:25:15.615 "data_offset": 2048, 00:25:15.615 "data_size": 63488 00:25:15.615 }, 00:25:15.615 { 00:25:15.615 "name": null, 00:25:15.615 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:15.615 "is_configured": false, 00:25:15.615 "data_offset": 2048, 00:25:15.615 "data_size": 63488 00:25:15.615 }, 00:25:15.615 { 00:25:15.615 "name": null, 00:25:15.615 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:15.615 "is_configured": false, 00:25:15.615 "data_offset": 2048, 00:25:15.615 "data_size": 63488 00:25:15.615 } 00:25:15.615 ] 00:25:15.615 }' 00:25:15.615 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.615 12:27:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:16.183 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:25:16.183 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:16.183 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:16.441 [2024-06-07 12:27:39.969689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:16.441 [2024-06-07 12:27:39.969826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.441 [2024-06-07 12:27:39.969862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:25:16.441 [2024-06-07 12:27:39.969893] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.441 [2024-06-07 12:27:39.970287] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.441 [2024-06-07 12:27:39.970324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:16.441 [2024-06-07 12:27:39.970414] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:16.441 [2024-06-07 12:27:39.970435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:25:16.441 pt2 00:25:16.441 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:16.441 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:16.441 12:27:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:25:16.699 [2024-06-07 12:27:40.189694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:25:16.699 [2024-06-07 12:27:40.189821] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.699 [2024-06-07 12:27:40.189862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009380 00:25:16.699 [2024-06-07 12:27:40.189895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.699 [2024-06-07 12:27:40.190355] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.699 [2024-06-07 12:27:40.190403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:25:16.699 [2024-06-07 12:27:40.190499] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:25:16.699 [2024-06-07 12:27:40.190522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:25:16.699 [2024-06-07 12:27:40.190619] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008a80 00:25:16.699 [2024-06-07 12:27:40.190637] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:25:16.699 [2024-06-07 12:27:40.190708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002870 00:25:16.699 [2024-06-07 12:27:40.190928] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008a80 00:25:16.699 [2024-06-07 12:27:40.190947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008a80 00:25:16.699 [2024-06-07 12:27:40.191018] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:16.699 pt3 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.699 
12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.699 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.958 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:16.958 "name": "raid_bdev1", 00:25:16.958 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:16.958 "strip_size_kb": 64, 00:25:16.958 "state": "online", 00:25:16.958 "raid_level": "raid0", 00:25:16.958 "superblock": true, 00:25:16.958 "num_base_bdevs": 3, 00:25:16.958 "num_base_bdevs_discovered": 3, 00:25:16.958 "num_base_bdevs_operational": 3, 00:25:16.958 "base_bdevs_list": [ 00:25:16.958 { 00:25:16.958 "name": "pt1", 00:25:16.958 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:16.958 "is_configured": true, 00:25:16.958 "data_offset": 2048, 00:25:16.958 "data_size": 63488 00:25:16.958 }, 00:25:16.958 { 00:25:16.958 "name": "pt2", 00:25:16.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:16.958 "is_configured": true, 00:25:16.958 "data_offset": 2048, 00:25:16.958 "data_size": 63488 00:25:16.958 }, 00:25:16.958 { 00:25:16.958 "name": "pt3", 00:25:16.958 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:16.958 "is_configured": true, 00:25:16.958 "data_offset": 2048, 00:25:16.958 "data_size": 63488 00:25:16.958 } 00:25:16.958 ] 00:25:16.958 }' 00:25:16.958 12:27:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:16.958 12:27:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:17.524 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:17.782 [2024-06-07 12:27:41.241965] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:17.782 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:17.782 "name": "raid_bdev1", 00:25:17.782 "aliases": [ 00:25:17.782 "58e3e172-08fc-48e9-b0bc-be243327cc02" 00:25:17.782 ], 00:25:17.782 "product_name": "Raid Volume", 00:25:17.782 "block_size": 512, 00:25:17.782 "num_blocks": 190464, 00:25:17.782 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:17.782 "assigned_rate_limits": { 00:25:17.782 "rw_ios_per_sec": 0, 00:25:17.782 "rw_mbytes_per_sec": 0, 00:25:17.782 "r_mbytes_per_sec": 0, 00:25:17.782 "w_mbytes_per_sec": 0 00:25:17.782 }, 00:25:17.782 "claimed": false, 00:25:17.782 "zoned": false, 00:25:17.782 "supported_io_types": { 00:25:17.782 "read": true, 00:25:17.782 "write": true, 00:25:17.782 "unmap": true, 00:25:17.782 "write_zeroes": true, 00:25:17.782 "flush": 
true, 00:25:17.782 "reset": true, 00:25:17.782 "compare": false, 00:25:17.782 "compare_and_write": false, 00:25:17.782 "abort": false, 00:25:17.782 "nvme_admin": false, 00:25:17.782 "nvme_io": false 00:25:17.782 }, 00:25:17.782 "memory_domains": [ 00:25:17.782 { 00:25:17.782 "dma_device_id": "system", 00:25:17.782 "dma_device_type": 1 00:25:17.782 }, 00:25:17.782 { 00:25:17.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.782 "dma_device_type": 2 00:25:17.782 }, 00:25:17.782 { 00:25:17.782 "dma_device_id": "system", 00:25:17.782 "dma_device_type": 1 00:25:17.782 }, 00:25:17.782 { 00:25:17.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.782 "dma_device_type": 2 00:25:17.782 }, 00:25:17.782 { 00:25:17.782 "dma_device_id": "system", 00:25:17.782 "dma_device_type": 1 00:25:17.782 }, 00:25:17.782 { 00:25:17.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.782 "dma_device_type": 2 00:25:17.782 } 00:25:17.782 ], 00:25:17.782 "driver_specific": { 00:25:17.782 "raid": { 00:25:17.782 "uuid": "58e3e172-08fc-48e9-b0bc-be243327cc02", 00:25:17.782 "strip_size_kb": 64, 00:25:17.782 "state": "online", 00:25:17.782 "raid_level": "raid0", 00:25:17.782 "superblock": true, 00:25:17.782 "num_base_bdevs": 3, 00:25:17.782 "num_base_bdevs_discovered": 3, 00:25:17.782 "num_base_bdevs_operational": 3, 00:25:17.782 "base_bdevs_list": [ 00:25:17.782 { 00:25:17.782 "name": "pt1", 00:25:17.782 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:17.782 "is_configured": true, 00:25:17.782 "data_offset": 2048, 00:25:17.782 "data_size": 63488 00:25:17.782 }, 00:25:17.782 { 00:25:17.782 "name": "pt2", 00:25:17.782 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:17.782 "is_configured": true, 00:25:17.782 "data_offset": 2048, 00:25:17.782 "data_size": 63488 00:25:17.782 }, 00:25:17.782 { 00:25:17.782 "name": "pt3", 00:25:17.782 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:17.782 "is_configured": true, 00:25:17.782 "data_offset": 2048, 00:25:17.782 "data_size": 63488 00:25:17.782 } 00:25:17.782 ] 00:25:17.782 } 00:25:17.782 } 00:25:17.782 }' 00:25:17.782 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:17.782 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:17.782 pt2 00:25:17.782 pt3' 00:25:17.782 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:17.782 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:17.782 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:18.040 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:18.040 "name": "pt1", 00:25:18.040 "aliases": [ 00:25:18.040 "00000000-0000-0000-0000-000000000001" 00:25:18.040 ], 00:25:18.040 "product_name": "passthru", 00:25:18.040 "block_size": 512, 00:25:18.040 "num_blocks": 65536, 00:25:18.040 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:18.040 "assigned_rate_limits": { 00:25:18.040 "rw_ios_per_sec": 0, 00:25:18.040 "rw_mbytes_per_sec": 0, 00:25:18.040 "r_mbytes_per_sec": 0, 00:25:18.040 "w_mbytes_per_sec": 0 00:25:18.040 }, 00:25:18.040 "claimed": true, 00:25:18.040 "claim_type": "exclusive_write", 00:25:18.040 "zoned": false, 00:25:18.040 "supported_io_types": { 00:25:18.040 "read": true, 00:25:18.040 "write": true, 
00:25:18.040 "unmap": true, 00:25:18.040 "write_zeroes": true, 00:25:18.040 "flush": true, 00:25:18.040 "reset": true, 00:25:18.040 "compare": false, 00:25:18.040 "compare_and_write": false, 00:25:18.040 "abort": true, 00:25:18.040 "nvme_admin": false, 00:25:18.041 "nvme_io": false 00:25:18.041 }, 00:25:18.041 "memory_domains": [ 00:25:18.041 { 00:25:18.041 "dma_device_id": "system", 00:25:18.041 "dma_device_type": 1 00:25:18.041 }, 00:25:18.041 { 00:25:18.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:18.041 "dma_device_type": 2 00:25:18.041 } 00:25:18.041 ], 00:25:18.041 "driver_specific": { 00:25:18.041 "passthru": { 00:25:18.041 "name": "pt1", 00:25:18.041 "base_bdev_name": "malloc1" 00:25:18.041 } 00:25:18.041 } 00:25:18.041 }' 00:25:18.041 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.041 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.299 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:18.299 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.299 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.300 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:18.300 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.300 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.300 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:18.300 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:18.300 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:18.558 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:18.558 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:18.558 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:18.558 12:27:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:18.558 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:18.558 "name": "pt2", 00:25:18.558 "aliases": [ 00:25:18.558 "00000000-0000-0000-0000-000000000002" 00:25:18.558 ], 00:25:18.558 "product_name": "passthru", 00:25:18.558 "block_size": 512, 00:25:18.558 "num_blocks": 65536, 00:25:18.558 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:18.558 "assigned_rate_limits": { 00:25:18.558 "rw_ios_per_sec": 0, 00:25:18.558 "rw_mbytes_per_sec": 0, 00:25:18.558 "r_mbytes_per_sec": 0, 00:25:18.558 "w_mbytes_per_sec": 0 00:25:18.558 }, 00:25:18.558 "claimed": true, 00:25:18.558 "claim_type": "exclusive_write", 00:25:18.558 "zoned": false, 00:25:18.558 "supported_io_types": { 00:25:18.558 "read": true, 00:25:18.558 "write": true, 00:25:18.558 "unmap": true, 00:25:18.558 "write_zeroes": true, 00:25:18.558 "flush": true, 00:25:18.558 "reset": true, 00:25:18.558 "compare": false, 00:25:18.558 "compare_and_write": false, 00:25:18.558 "abort": true, 00:25:18.558 "nvme_admin": false, 00:25:18.558 "nvme_io": false 00:25:18.558 }, 00:25:18.558 "memory_domains": [ 00:25:18.558 { 00:25:18.558 "dma_device_id": "system", 00:25:18.558 "dma_device_type": 1 00:25:18.558 }, 00:25:18.558 { 
00:25:18.558 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:18.558 "dma_device_type": 2 00:25:18.558 } 00:25:18.558 ], 00:25:18.558 "driver_specific": { 00:25:18.558 "passthru": { 00:25:18.558 "name": "pt2", 00:25:18.558 "base_bdev_name": "malloc2" 00:25:18.558 } 00:25:18.558 } 00:25:18.558 }' 00:25:18.558 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:18.817 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.075 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.075 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:19.075 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:19.075 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:25:19.075 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:19.334 "name": "pt3", 00:25:19.334 "aliases": [ 00:25:19.334 "00000000-0000-0000-0000-000000000003" 00:25:19.334 ], 00:25:19.334 "product_name": "passthru", 00:25:19.334 "block_size": 512, 00:25:19.334 "num_blocks": 65536, 00:25:19.334 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:19.334 "assigned_rate_limits": { 00:25:19.334 "rw_ios_per_sec": 0, 00:25:19.334 "rw_mbytes_per_sec": 0, 00:25:19.334 "r_mbytes_per_sec": 0, 00:25:19.334 "w_mbytes_per_sec": 0 00:25:19.334 }, 00:25:19.334 "claimed": true, 00:25:19.334 "claim_type": "exclusive_write", 00:25:19.334 "zoned": false, 00:25:19.334 "supported_io_types": { 00:25:19.334 "read": true, 00:25:19.334 "write": true, 00:25:19.334 "unmap": true, 00:25:19.334 "write_zeroes": true, 00:25:19.334 "flush": true, 00:25:19.334 "reset": true, 00:25:19.334 "compare": false, 00:25:19.334 "compare_and_write": false, 00:25:19.334 "abort": true, 00:25:19.334 "nvme_admin": false, 00:25:19.334 "nvme_io": false 00:25:19.334 }, 00:25:19.334 "memory_domains": [ 00:25:19.334 { 00:25:19.334 "dma_device_id": "system", 00:25:19.334 "dma_device_type": 1 00:25:19.334 }, 00:25:19.334 { 00:25:19.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:19.334 "dma_device_type": 2 00:25:19.334 } 00:25:19.334 ], 00:25:19.334 "driver_specific": { 00:25:19.334 "passthru": { 00:25:19.334 "name": "pt3", 00:25:19.334 "base_bdev_name": "malloc3" 00:25:19.334 } 00:25:19.334 } 00:25:19.334 }' 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:19.334 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:19.594 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:19.594 12:27:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.594 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.594 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:19.594 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:25:19.594 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:19.853 [2024-06-07 12:27:43.326190] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 58e3e172-08fc-48e9-b0bc-be243327cc02 '!=' 58e3e172-08fc-48e9-b0bc-be243327cc02 ']' 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 204676 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 204676 ']' 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 204676 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 204676 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:19.853 killing process with pid 204676 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 204676' 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 204676 00:25:19.853 [2024-06-07 12:27:43.374621] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:19.853 [2024-06-07 12:27:43.374693] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:19.853 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 204676 00:25:19.853 [2024-06-07 12:27:43.374743] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:19.853 
[2024-06-07 12:27:43.374752] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state offline 00:25:19.853 [2024-06-07 12:27:43.437830] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:20.493 12:27:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:25:20.493 00:25:20.493 real 0m14.933s 00:25:20.493 user 0m26.863s 00:25:20.493 sys 0m2.401s 00:25:20.493 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:20.493 ************************************ 00:25:20.493 END TEST raid_superblock_test 00:25:20.493 12:27:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:20.493 ************************************ 00:25:20.493 12:27:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:25:20.493 12:27:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:25:20.493 12:27:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:20.493 12:27:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:20.493 ************************************ 00:25:20.493 START TEST raid_read_error_test 00:25:20.493 ************************************ 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 read 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.nQiT86KpPo 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=205161 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 205161 /var/tmp/spdk-raid.sock 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 205161 ']' 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:20.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:20.493 12:27:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:20.493 [2024-06-07 12:27:43.905651] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:25:20.493 [2024-06-07 12:27:43.906742] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205161 ] 00:25:20.493 [2024-06-07 12:27:44.051689] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:20.752 [2024-06-07 12:27:44.149064] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:20.752 [2024-06-07 12:27:44.233149] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:21.321 12:27:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:21.321 12:27:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:25:21.321 12:27:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:21.321 12:27:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:21.581 BaseBdev1_malloc 00:25:21.581 12:27:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:25:21.841 true 00:25:21.841 12:27:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:25:22.101 [2024-06-07 12:27:45.590745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:25:22.101 [2024-06-07 12:27:45.590928] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.101 [2024-06-07 12:27:45.591016] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:25:22.101 [2024-06-07 12:27:45.591118] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.101 [2024-06-07 12:27:45.593802] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.101 [2024-06-07 12:27:45.593892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:22.101 BaseBdev1 00:25:22.101 12:27:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:22.101 12:27:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:22.361 BaseBdev2_malloc 00:25:22.361 12:27:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:25:22.620 true 00:25:22.620 12:27:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:25:22.880 [2024-06-07 12:27:46.270629] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:25:22.880 [2024-06-07 12:27:46.270758] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.880 [2024-06-07 12:27:46.270831] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:25:22.880 [2024-06-07 12:27:46.270911] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.880 [2024-06-07 12:27:46.273399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.880 [2024-06-07 12:27:46.273493] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:22.880 BaseBdev2 00:25:22.880 12:27:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:22.880 12:27:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:22.880 BaseBdev3_malloc 00:25:23.139 12:27:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:25:23.463 true 00:25:23.463 12:27:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:25:23.463 [2024-06-07 12:27:47.020372] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:25:23.463 [2024-06-07 12:27:47.020503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:23.463 [2024-06-07 12:27:47.020593] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:25:23.463 [2024-06-07 12:27:47.020703] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:23.463 [2024-06-07 12:27:47.023284] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:23.463 [2024-06-07 12:27:47.023372] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:23.463 BaseBdev3 00:25:23.463 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:25:23.723 [2024-06-07 12:27:47.232544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:23.723 [2024-06-07 12:27:47.234684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:23.723 [2024-06-07 12:27:47.234765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:23.723 [2024-06-07 12:27:47.234976] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:25:23.723 [2024-06-07 12:27:47.235000] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:25:23.723 [2024-06-07 12:27:47.235194] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:25:23.723 [2024-06-07 12:27:47.235652] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:25:23.723 [2024-06-07 12:27:47.235677] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008180 00:25:23.723 [2024-06-07 12:27:47.235869] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.723 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.982 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.982 "name": "raid_bdev1", 00:25:23.982 "uuid": "6c617f54-4870-4dab-a8b5-a40fba9e5954", 00:25:23.982 "strip_size_kb": 64, 00:25:23.982 "state": "online", 00:25:23.982 "raid_level": "raid0", 00:25:23.982 "superblock": true, 00:25:23.982 "num_base_bdevs": 3, 00:25:23.982 "num_base_bdevs_discovered": 3, 00:25:23.982 "num_base_bdevs_operational": 3, 00:25:23.982 "base_bdevs_list": [ 00:25:23.982 { 00:25:23.982 "name": "BaseBdev1", 00:25:23.982 "uuid": "b403a6b6-3a21-54d5-833a-993d94e9cecb", 00:25:23.982 "is_configured": true, 00:25:23.982 "data_offset": 2048, 00:25:23.982 "data_size": 63488 00:25:23.982 }, 00:25:23.982 { 00:25:23.982 "name": "BaseBdev2", 00:25:23.982 "uuid": "b3507747-7a68-5fb7-8c71-3c6e6af5f603", 00:25:23.982 "is_configured": true, 00:25:23.982 "data_offset": 2048, 00:25:23.982 "data_size": 63488 00:25:23.982 }, 00:25:23.982 { 00:25:23.982 "name": "BaseBdev3", 00:25:23.982 "uuid": "f3b59e91-01bb-52a6-8d09-cd4e71d4ef0c", 00:25:23.982 "is_configured": true, 00:25:23.982 "data_offset": 2048, 00:25:23.982 "data_size": 63488 00:25:23.982 } 00:25:23.982 ] 00:25:23.982 }' 00:25:23.982 12:27:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.982 12:27:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:24.547 12:27:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:25:24.547 12:27:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:24.806 [2024-06-07 12:27:48.240959] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state 
raid_bdev1 online raid0 64 3 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.740 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.998 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.998 "name": "raid_bdev1", 00:25:25.998 "uuid": "6c617f54-4870-4dab-a8b5-a40fba9e5954", 00:25:25.998 "strip_size_kb": 64, 00:25:25.998 "state": "online", 00:25:25.998 "raid_level": "raid0", 00:25:25.998 "superblock": true, 00:25:25.998 "num_base_bdevs": 3, 00:25:25.998 "num_base_bdevs_discovered": 3, 00:25:25.998 "num_base_bdevs_operational": 3, 00:25:25.998 "base_bdevs_list": [ 00:25:25.998 { 00:25:25.998 "name": "BaseBdev1", 00:25:25.998 "uuid": "b403a6b6-3a21-54d5-833a-993d94e9cecb", 00:25:25.998 "is_configured": true, 00:25:25.998 "data_offset": 2048, 00:25:25.998 "data_size": 63488 00:25:25.998 }, 00:25:25.998 { 00:25:25.998 "name": "BaseBdev2", 00:25:25.998 "uuid": "b3507747-7a68-5fb7-8c71-3c6e6af5f603", 00:25:25.998 "is_configured": true, 00:25:25.998 "data_offset": 2048, 00:25:25.998 "data_size": 63488 00:25:25.998 }, 00:25:25.998 { 00:25:25.998 "name": "BaseBdev3", 00:25:25.998 "uuid": "f3b59e91-01bb-52a6-8d09-cd4e71d4ef0c", 00:25:25.998 "is_configured": true, 00:25:25.998 "data_offset": 2048, 00:25:25.998 "data_size": 63488 00:25:25.998 } 00:25:25.998 ] 00:25:25.998 }' 00:25:25.998 12:27:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.998 12:27:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:26.967 12:27:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:26.967 [2024-06-07 12:27:50.610189] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:26.967 [2024-06-07 12:27:50.610263] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:26.967 [2024-06-07 12:27:50.611559] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:26.967 [2024-06-07 12:27:50.611619] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:26.967 [2024-06-07 12:27:50.611648] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:26.967 [2024-06-07 
12:27:50.611659] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name raid_bdev1, state offline 00:25:27.225 0 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 205161 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 205161 ']' 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 205161 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 205161 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 205161' 00:25:27.225 killing process with pid 205161 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 205161 00:25:27.225 [2024-06-07 12:27:50.680005] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:27.225 12:27:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 205161 00:25:27.225 [2024-06-07 12:27:50.729538] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:27.483 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.nQiT86KpPo 00:25:27.483 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:25:27.483 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:25:27.483 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:25:27.484 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:25:27.484 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:27.484 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:25:27.484 12:27:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:25:27.484 00:25:27.484 real 0m7.261s 00:25:27.484 user 0m11.343s 00:25:27.484 sys 0m1.158s 00:25:27.484 12:27:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:27.484 12:27:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:27.484 ************************************ 00:25:27.484 END TEST raid_read_error_test 00:25:27.484 ************************************ 00:25:27.742 12:27:51 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:25:27.742 12:27:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:25:27.742 12:27:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:27.742 12:27:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:27.742 ************************************ 00:25:27.742 START TEST raid_write_error_test 00:25:27.742 ************************************ 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 write 00:25:27.742 12:27:51 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Syd6WsQkqI 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=205353 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 205353 /var/tmp/spdk-raid.sock 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 205353 ']' 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:27.742 12:27:51 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:27.742 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:27.742 12:27:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:27.742 [2024-06-07 12:27:51.237394] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:25:27.742 [2024-06-07 12:27:51.237703] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205353 ] 00:25:27.742 [2024-06-07 12:27:51.378462] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:27.999 [2024-06-07 12:27:51.474727] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:27.999 [2024-06-07 12:27:51.556339] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:28.989 12:27:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:28.989 12:27:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:25:28.989 12:27:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:28.989 12:27:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:28.989 BaseBdev1_malloc 00:25:28.989 12:27:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:25:29.249 true 00:25:29.249 12:27:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:25:29.506 [2024-06-07 12:27:52.966316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:25:29.506 [2024-06-07 12:27:52.966448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.506 [2024-06-07 12:27:52.966506] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:25:29.506 [2024-06-07 12:27:52.966571] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.506 [2024-06-07 12:27:52.968963] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.506 [2024-06-07 12:27:52.969028] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:29.506 BaseBdev1 00:25:29.506 12:27:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:29.506 12:27:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:29.763 BaseBdev2_malloc 00:25:29.763 12:27:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:25:30.035 true 00:25:30.035 12:27:53 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:25:30.294 [2024-06-07 12:27:53.771164] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:25:30.294 [2024-06-07 12:27:53.771311] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.294 [2024-06-07 12:27:53.771369] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:25:30.294 [2024-06-07 12:27:53.771427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.294 [2024-06-07 12:27:53.773714] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.294 [2024-06-07 12:27:53.773767] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:30.294 BaseBdev2 00:25:30.294 12:27:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:30.294 12:27:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:30.554 BaseBdev3_malloc 00:25:30.554 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:25:30.825 true 00:25:30.825 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:25:31.083 [2024-06-07 12:27:54.493977] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:25:31.083 [2024-06-07 12:27:54.494098] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.083 [2024-06-07 12:27:54.494149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:25:31.083 [2024-06-07 12:27:54.494236] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.083 [2024-06-07 12:27:54.496629] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.083 [2024-06-07 12:27:54.496694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:31.083 BaseBdev3 00:25:31.083 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:25:31.083 [2024-06-07 12:27:54.710157] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:31.083 [2024-06-07 12:27:54.712337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:31.083 [2024-06-07 12:27:54.712403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:31.083 [2024-06-07 12:27:54.712581] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:25:31.083 [2024-06-07 12:27:54.712594] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:25:31.083 [2024-06-07 12:27:54.712744] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:25:31.083 [2024-06-07 12:27:54.713099] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:25:31.083 [2024-06-07 12:27:54.713118] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008180 00:25:31.083 [2024-06-07 12:27:54.713282] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.347 12:27:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.612 12:27:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:31.612 "name": "raid_bdev1", 00:25:31.612 "uuid": "fa9a422d-2383-4e08-9d41-a1a54fa508de", 00:25:31.612 "strip_size_kb": 64, 00:25:31.612 "state": "online", 00:25:31.612 "raid_level": "raid0", 00:25:31.612 "superblock": true, 00:25:31.612 "num_base_bdevs": 3, 00:25:31.612 "num_base_bdevs_discovered": 3, 00:25:31.612 "num_base_bdevs_operational": 3, 00:25:31.612 "base_bdevs_list": [ 00:25:31.612 { 00:25:31.612 "name": "BaseBdev1", 00:25:31.612 "uuid": "27afc7ea-1320-54a5-b76d-b662fdafec1b", 00:25:31.612 "is_configured": true, 00:25:31.612 "data_offset": 2048, 00:25:31.612 "data_size": 63488 00:25:31.612 }, 00:25:31.612 { 00:25:31.612 "name": "BaseBdev2", 00:25:31.612 "uuid": "7ff2c244-269e-5abb-8f5f-bc74eb46e1fa", 00:25:31.613 "is_configured": true, 00:25:31.613 "data_offset": 2048, 00:25:31.613 "data_size": 63488 00:25:31.613 }, 00:25:31.613 { 00:25:31.613 "name": "BaseBdev3", 00:25:31.613 "uuid": "cf69fb49-9a76-5288-b949-1faad675a517", 00:25:31.613 "is_configured": true, 00:25:31.613 "data_offset": 2048, 00:25:31.613 "data_size": 63488 00:25:31.613 } 00:25:31.613 ] 00:25:31.613 }' 00:25:31.613 12:27:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:31.613 12:27:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:32.177 12:27:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:25:32.177 12:27:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:32.177 [2024-06-07 12:27:55.702478] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x60d000002600 00:25:33.112 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.371 12:27:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.629 12:27:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.629 "name": "raid_bdev1", 00:25:33.629 "uuid": "fa9a422d-2383-4e08-9d41-a1a54fa508de", 00:25:33.629 "strip_size_kb": 64, 00:25:33.629 "state": "online", 00:25:33.629 "raid_level": "raid0", 00:25:33.629 "superblock": true, 00:25:33.629 "num_base_bdevs": 3, 00:25:33.629 "num_base_bdevs_discovered": 3, 00:25:33.629 "num_base_bdevs_operational": 3, 00:25:33.629 "base_bdevs_list": [ 00:25:33.629 { 00:25:33.629 "name": "BaseBdev1", 00:25:33.629 "uuid": "27afc7ea-1320-54a5-b76d-b662fdafec1b", 00:25:33.629 "is_configured": true, 00:25:33.629 "data_offset": 2048, 00:25:33.629 "data_size": 63488 00:25:33.629 }, 00:25:33.629 { 00:25:33.629 "name": "BaseBdev2", 00:25:33.629 "uuid": "7ff2c244-269e-5abb-8f5f-bc74eb46e1fa", 00:25:33.629 "is_configured": true, 00:25:33.629 "data_offset": 2048, 00:25:33.629 "data_size": 63488 00:25:33.629 }, 00:25:33.629 { 00:25:33.629 "name": "BaseBdev3", 00:25:33.629 "uuid": "cf69fb49-9a76-5288-b949-1faad675a517", 00:25:33.629 "is_configured": true, 00:25:33.629 "data_offset": 2048, 00:25:33.629 "data_size": 63488 00:25:33.629 } 00:25:33.629 ] 00:25:33.629 }' 00:25:33.629 12:27:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.629 12:27:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:34.193 12:27:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:34.760 [2024-06-07 12:27:58.102857] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:34.760 [2024-06-07 12:27:58.102918] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:34.760 [2024-06-07 12:27:58.104196] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:34.760 [2024-06-07 12:27:58.104279] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:34.760 [2024-06-07 12:27:58.104309] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:34.760 [2024-06-07 12:27:58.104320] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name raid_bdev1, state offline 00:25:34.760 0 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 205353 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 205353 ']' 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 205353 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 205353 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 205353' 00:25:34.760 killing process with pid 205353 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 205353 00:25:34.760 [2024-06-07 12:27:58.162797] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:34.760 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 205353 00:25:34.760 [2024-06-07 12:27:58.212693] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Syd6WsQkqI 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:25:35.018 00:25:35.018 real 0m7.407s 00:25:35.018 user 0m11.644s 00:25:35.018 sys 0m1.214s 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:35.018 12:27:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:35.018 ************************************ 00:25:35.018 END TEST raid_write_error_test 
00:25:35.018 ************************************ 00:25:35.018 12:27:58 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:25:35.018 12:27:58 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:25:35.018 12:27:58 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:25:35.018 12:27:58 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:35.018 12:27:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:35.275 ************************************ 00:25:35.275 START TEST raid_state_function_test 00:25:35.275 ************************************ 00:25:35.275 12:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 false 00:25:35.275 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:25:35.276 12:27:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=205544 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 205544' 00:25:35.276 Process raid pid: 205544 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 205544 /var/tmp/spdk-raid.sock 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 205544 ']' 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:35.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:35.276 12:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:35.276 [2024-06-07 12:27:58.707948] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:25:35.276 [2024-06-07 12:27:58.708185] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:35.276 [2024-06-07 12:27:58.849945] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:35.534 [2024-06-07 12:27:58.944733] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:35.534 [2024-06-07 12:27:59.026790] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:36.101 12:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:36.101 12:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:25:36.101 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:25:36.359 [2024-06-07 12:27:59.936263] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:36.359 [2024-06-07 12:27:59.936383] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:36.359 [2024-06-07 12:27:59.936396] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:36.359 [2024-06-07 12:27:59.936421] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:36.359 [2024-06-07 12:27:59.936429] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:25:36.359 [2024-06-07 12:27:59.936479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.359 12:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:36.947 12:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.947 "name": "Existed_Raid", 00:25:36.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.947 "strip_size_kb": 64, 00:25:36.947 "state": "configuring", 00:25:36.947 "raid_level": "concat", 00:25:36.947 "superblock": false, 00:25:36.947 "num_base_bdevs": 3, 00:25:36.947 "num_base_bdevs_discovered": 0, 00:25:36.947 "num_base_bdevs_operational": 3, 00:25:36.947 "base_bdevs_list": [ 00:25:36.947 { 00:25:36.947 "name": "BaseBdev1", 00:25:36.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.947 "is_configured": false, 00:25:36.947 "data_offset": 0, 00:25:36.947 "data_size": 0 00:25:36.947 }, 00:25:36.947 { 00:25:36.947 "name": "BaseBdev2", 00:25:36.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.947 "is_configured": false, 00:25:36.947 "data_offset": 0, 00:25:36.947 "data_size": 0 00:25:36.947 }, 00:25:36.947 { 00:25:36.947 "name": "BaseBdev3", 00:25:36.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.947 "is_configured": false, 00:25:36.947 "data_offset": 0, 00:25:36.947 "data_size": 0 00:25:36.947 } 00:25:36.947 ] 00:25:36.947 }' 00:25:36.947 12:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.947 12:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:37.206 12:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:37.465 [2024-06-07 12:28:01.104283] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:37.465 [2024-06-07 12:28:01.104345] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:25:37.724 12:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:25:37.724 [2024-06-07 
12:28:01.340320] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:37.724 [2024-06-07 12:28:01.340415] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:37.724 [2024-06-07 12:28:01.340427] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:37.724 [2024-06-07 12:28:01.340450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:37.724 [2024-06-07 12:28:01.340458] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:25:37.724 [2024-06-07 12:28:01.340487] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:25:37.724 12:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:25:38.291 [2024-06-07 12:28:01.676099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:38.291 BaseBdev1 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:38.292 12:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:38.550 [ 00:25:38.550 { 00:25:38.550 "name": "BaseBdev1", 00:25:38.550 "aliases": [ 00:25:38.550 "3e0219c3-e172-451a-bc82-75a43464a0d2" 00:25:38.550 ], 00:25:38.550 "product_name": "Malloc disk", 00:25:38.550 "block_size": 512, 00:25:38.550 "num_blocks": 65536, 00:25:38.550 "uuid": "3e0219c3-e172-451a-bc82-75a43464a0d2", 00:25:38.550 "assigned_rate_limits": { 00:25:38.550 "rw_ios_per_sec": 0, 00:25:38.550 "rw_mbytes_per_sec": 0, 00:25:38.550 "r_mbytes_per_sec": 0, 00:25:38.550 "w_mbytes_per_sec": 0 00:25:38.550 }, 00:25:38.550 "claimed": true, 00:25:38.550 "claim_type": "exclusive_write", 00:25:38.550 "zoned": false, 00:25:38.550 "supported_io_types": { 00:25:38.550 "read": true, 00:25:38.550 "write": true, 00:25:38.550 "unmap": true, 00:25:38.550 "write_zeroes": true, 00:25:38.550 "flush": true, 00:25:38.550 "reset": true, 00:25:38.550 "compare": false, 00:25:38.550 "compare_and_write": false, 00:25:38.550 "abort": true, 00:25:38.550 "nvme_admin": false, 00:25:38.550 "nvme_io": false 00:25:38.550 }, 00:25:38.550 "memory_domains": [ 00:25:38.550 { 00:25:38.550 "dma_device_id": "system", 00:25:38.550 "dma_device_type": 1 00:25:38.550 }, 00:25:38.550 { 00:25:38.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:38.550 "dma_device_type": 2 00:25:38.550 } 00:25:38.550 ], 00:25:38.550 "driver_specific": {} 00:25:38.550 } 00:25:38.550 ] 00:25:38.550 12:28:02 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.550 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:38.808 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.808 "name": "Existed_Raid", 00:25:38.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.808 "strip_size_kb": 64, 00:25:38.808 "state": "configuring", 00:25:38.808 "raid_level": "concat", 00:25:38.808 "superblock": false, 00:25:38.808 "num_base_bdevs": 3, 00:25:38.808 "num_base_bdevs_discovered": 1, 00:25:38.808 "num_base_bdevs_operational": 3, 00:25:38.808 "base_bdevs_list": [ 00:25:38.808 { 00:25:38.808 "name": "BaseBdev1", 00:25:38.808 "uuid": "3e0219c3-e172-451a-bc82-75a43464a0d2", 00:25:38.808 "is_configured": true, 00:25:38.808 "data_offset": 0, 00:25:38.808 "data_size": 65536 00:25:38.808 }, 00:25:38.808 { 00:25:38.808 "name": "BaseBdev2", 00:25:38.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.808 "is_configured": false, 00:25:38.808 "data_offset": 0, 00:25:38.808 "data_size": 0 00:25:38.808 }, 00:25:38.808 { 00:25:38.808 "name": "BaseBdev3", 00:25:38.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.808 "is_configured": false, 00:25:38.808 "data_offset": 0, 00:25:38.808 "data_size": 0 00:25:38.808 } 00:25:38.808 ] 00:25:38.808 }' 00:25:38.808 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.808 12:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:39.374 12:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:39.632 [2024-06-07 12:28:03.244375] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:39.632 [2024-06-07 12:28:03.244466] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:25:39.632 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:25:39.919 [2024-06-07 12:28:03.460462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:39.919 [2024-06-07 12:28:03.462342] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:39.919 [2024-06-07 12:28:03.462411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:39.919 [2024-06-07 12:28:03.462421] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:25:39.919 [2024-06-07 12:28:03.462451] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:39.919 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.182 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:40.182 "name": "Existed_Raid", 00:25:40.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.182 "strip_size_kb": 64, 00:25:40.182 "state": "configuring", 00:25:40.182 "raid_level": "concat", 00:25:40.182 "superblock": false, 00:25:40.182 "num_base_bdevs": 3, 00:25:40.182 "num_base_bdevs_discovered": 1, 00:25:40.182 "num_base_bdevs_operational": 3, 00:25:40.182 "base_bdevs_list": [ 00:25:40.182 { 00:25:40.182 "name": "BaseBdev1", 00:25:40.182 "uuid": "3e0219c3-e172-451a-bc82-75a43464a0d2", 00:25:40.182 "is_configured": true, 00:25:40.182 "data_offset": 0, 00:25:40.182 "data_size": 65536 00:25:40.182 }, 00:25:40.182 { 00:25:40.182 "name": "BaseBdev2", 00:25:40.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.182 "is_configured": false, 00:25:40.182 "data_offset": 0, 00:25:40.182 "data_size": 0 00:25:40.182 }, 00:25:40.182 { 00:25:40.182 "name": "BaseBdev3", 00:25:40.182 "uuid": "00000000-0000-0000-0000-000000000000", 
00:25:40.182 "is_configured": false, 00:25:40.182 "data_offset": 0, 00:25:40.182 "data_size": 0 00:25:40.182 } 00:25:40.182 ] 00:25:40.182 }' 00:25:40.182 12:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:40.182 12:28:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:41.117 12:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:25:41.117 [2024-06-07 12:28:04.751639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:41.117 BaseBdev2 00:25:41.377 12:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:41.377 12:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:25:41.377 12:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:41.377 12:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:25:41.377 12:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:41.377 12:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:41.377 12:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:41.636 [ 00:25:41.636 { 00:25:41.636 "name": "BaseBdev2", 00:25:41.636 "aliases": [ 00:25:41.636 "becb9cba-7a1b-4140-8713-0f88a68ce3db" 00:25:41.636 ], 00:25:41.636 "product_name": "Malloc disk", 00:25:41.636 "block_size": 512, 00:25:41.636 "num_blocks": 65536, 00:25:41.636 "uuid": "becb9cba-7a1b-4140-8713-0f88a68ce3db", 00:25:41.636 "assigned_rate_limits": { 00:25:41.636 "rw_ios_per_sec": 0, 00:25:41.636 "rw_mbytes_per_sec": 0, 00:25:41.636 "r_mbytes_per_sec": 0, 00:25:41.636 "w_mbytes_per_sec": 0 00:25:41.636 }, 00:25:41.636 "claimed": true, 00:25:41.636 "claim_type": "exclusive_write", 00:25:41.636 "zoned": false, 00:25:41.636 "supported_io_types": { 00:25:41.636 "read": true, 00:25:41.636 "write": true, 00:25:41.636 "unmap": true, 00:25:41.636 "write_zeroes": true, 00:25:41.636 "flush": true, 00:25:41.636 "reset": true, 00:25:41.636 "compare": false, 00:25:41.636 "compare_and_write": false, 00:25:41.636 "abort": true, 00:25:41.636 "nvme_admin": false, 00:25:41.636 "nvme_io": false 00:25:41.636 }, 00:25:41.636 "memory_domains": [ 00:25:41.636 { 00:25:41.636 "dma_device_id": "system", 00:25:41.636 "dma_device_type": 1 00:25:41.636 }, 00:25:41.636 { 00:25:41.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:41.636 "dma_device_type": 2 00:25:41.636 } 00:25:41.636 ], 00:25:41.636 "driver_specific": {} 00:25:41.636 } 00:25:41.636 ] 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid 
configuring concat 64 3 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.636 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:42.204 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:42.204 "name": "Existed_Raid", 00:25:42.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.204 "strip_size_kb": 64, 00:25:42.204 "state": "configuring", 00:25:42.204 "raid_level": "concat", 00:25:42.204 "superblock": false, 00:25:42.204 "num_base_bdevs": 3, 00:25:42.204 "num_base_bdevs_discovered": 2, 00:25:42.204 "num_base_bdevs_operational": 3, 00:25:42.204 "base_bdevs_list": [ 00:25:42.204 { 00:25:42.204 "name": "BaseBdev1", 00:25:42.204 "uuid": "3e0219c3-e172-451a-bc82-75a43464a0d2", 00:25:42.204 "is_configured": true, 00:25:42.204 "data_offset": 0, 00:25:42.204 "data_size": 65536 00:25:42.204 }, 00:25:42.204 { 00:25:42.204 "name": "BaseBdev2", 00:25:42.204 "uuid": "becb9cba-7a1b-4140-8713-0f88a68ce3db", 00:25:42.204 "is_configured": true, 00:25:42.204 "data_offset": 0, 00:25:42.204 "data_size": 65536 00:25:42.204 }, 00:25:42.204 { 00:25:42.204 "name": "BaseBdev3", 00:25:42.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.204 "is_configured": false, 00:25:42.204 "data_offset": 0, 00:25:42.204 "data_size": 0 00:25:42.204 } 00:25:42.204 ] 00:25:42.204 }' 00:25:42.204 12:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:42.204 12:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:25:42.772 [2024-06-07 12:28:06.369532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:42.772 [2024-06-07 12:28:06.369587] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:25:42.772 [2024-06-07 12:28:06.369596] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:25:42.772 [2024-06-07 12:28:06.369760] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000021f0 00:25:42.772 [2024-06-07 12:28:06.370107] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x616000006080 00:25:42.772 [2024-06-07 12:28:06.370118] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:25:42.772 [2024-06-07 12:28:06.370318] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.772 BaseBdev3 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:42.772 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:25:43.338 [ 00:25:43.338 { 00:25:43.338 "name": "BaseBdev3", 00:25:43.338 "aliases": [ 00:25:43.338 "0735902e-9c45-4965-9625-491ef98b1055" 00:25:43.338 ], 00:25:43.338 "product_name": "Malloc disk", 00:25:43.338 "block_size": 512, 00:25:43.338 "num_blocks": 65536, 00:25:43.338 "uuid": "0735902e-9c45-4965-9625-491ef98b1055", 00:25:43.338 "assigned_rate_limits": { 00:25:43.338 "rw_ios_per_sec": 0, 00:25:43.338 "rw_mbytes_per_sec": 0, 00:25:43.338 "r_mbytes_per_sec": 0, 00:25:43.338 "w_mbytes_per_sec": 0 00:25:43.338 }, 00:25:43.338 "claimed": true, 00:25:43.338 "claim_type": "exclusive_write", 00:25:43.338 "zoned": false, 00:25:43.338 "supported_io_types": { 00:25:43.338 "read": true, 00:25:43.338 "write": true, 00:25:43.338 "unmap": true, 00:25:43.338 "write_zeroes": true, 00:25:43.338 "flush": true, 00:25:43.338 "reset": true, 00:25:43.338 "compare": false, 00:25:43.338 "compare_and_write": false, 00:25:43.338 "abort": true, 00:25:43.338 "nvme_admin": false, 00:25:43.338 "nvme_io": false 00:25:43.338 }, 00:25:43.338 "memory_domains": [ 00:25:43.338 { 00:25:43.338 "dma_device_id": "system", 00:25:43.338 "dma_device_type": 1 00:25:43.338 }, 00:25:43.338 { 00:25:43.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:43.338 "dma_device_type": 2 00:25:43.338 } 00:25:43.338 ], 00:25:43.338 "driver_specific": {} 00:25:43.338 } 00:25:43.338 ] 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:43.338 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:43.339 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:43.339 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.339 12:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:43.597 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.597 "name": "Existed_Raid", 00:25:43.597 "uuid": "6aa90498-0ab6-48b1-ad05-1bb4e42b17ce", 00:25:43.597 "strip_size_kb": 64, 00:25:43.597 "state": "online", 00:25:43.597 "raid_level": "concat", 00:25:43.597 "superblock": false, 00:25:43.597 "num_base_bdevs": 3, 00:25:43.597 "num_base_bdevs_discovered": 3, 00:25:43.597 "num_base_bdevs_operational": 3, 00:25:43.597 "base_bdevs_list": [ 00:25:43.597 { 00:25:43.597 "name": "BaseBdev1", 00:25:43.597 "uuid": "3e0219c3-e172-451a-bc82-75a43464a0d2", 00:25:43.597 "is_configured": true, 00:25:43.597 "data_offset": 0, 00:25:43.597 "data_size": 65536 00:25:43.597 }, 00:25:43.597 { 00:25:43.597 "name": "BaseBdev2", 00:25:43.597 "uuid": "becb9cba-7a1b-4140-8713-0f88a68ce3db", 00:25:43.597 "is_configured": true, 00:25:43.597 "data_offset": 0, 00:25:43.597 "data_size": 65536 00:25:43.597 }, 00:25:43.597 { 00:25:43.597 "name": "BaseBdev3", 00:25:43.597 "uuid": "0735902e-9c45-4965-9625-491ef98b1055", 00:25:43.597 "is_configured": true, 00:25:43.597 "data_offset": 0, 00:25:43.597 "data_size": 65536 00:25:43.597 } 00:25:43.597 ] 00:25:43.597 }' 00:25:43.597 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.597 12:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:44.162 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:44.454 [2024-06-07 12:28:07.969969] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:44.454 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:25:44.454 "name": "Existed_Raid", 00:25:44.454 "aliases": [ 00:25:44.454 "6aa90498-0ab6-48b1-ad05-1bb4e42b17ce" 00:25:44.454 ], 00:25:44.454 "product_name": "Raid Volume", 00:25:44.454 "block_size": 512, 00:25:44.454 "num_blocks": 196608, 00:25:44.454 "uuid": "6aa90498-0ab6-48b1-ad05-1bb4e42b17ce", 00:25:44.454 "assigned_rate_limits": { 00:25:44.454 "rw_ios_per_sec": 0, 00:25:44.454 "rw_mbytes_per_sec": 0, 00:25:44.454 "r_mbytes_per_sec": 0, 00:25:44.454 "w_mbytes_per_sec": 0 00:25:44.454 }, 00:25:44.454 "claimed": false, 00:25:44.454 "zoned": false, 00:25:44.454 "supported_io_types": { 00:25:44.454 "read": true, 00:25:44.454 "write": true, 00:25:44.454 "unmap": true, 00:25:44.454 "write_zeroes": true, 00:25:44.454 "flush": true, 00:25:44.454 "reset": true, 00:25:44.454 "compare": false, 00:25:44.454 "compare_and_write": false, 00:25:44.454 "abort": false, 00:25:44.454 "nvme_admin": false, 00:25:44.454 "nvme_io": false 00:25:44.454 }, 00:25:44.454 "memory_domains": [ 00:25:44.454 { 00:25:44.454 "dma_device_id": "system", 00:25:44.454 "dma_device_type": 1 00:25:44.454 }, 00:25:44.454 { 00:25:44.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:44.454 "dma_device_type": 2 00:25:44.454 }, 00:25:44.454 { 00:25:44.454 "dma_device_id": "system", 00:25:44.454 "dma_device_type": 1 00:25:44.454 }, 00:25:44.454 { 00:25:44.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:44.454 "dma_device_type": 2 00:25:44.454 }, 00:25:44.454 { 00:25:44.454 "dma_device_id": "system", 00:25:44.454 "dma_device_type": 1 00:25:44.454 }, 00:25:44.454 { 00:25:44.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:44.454 "dma_device_type": 2 00:25:44.454 } 00:25:44.454 ], 00:25:44.454 "driver_specific": { 00:25:44.454 "raid": { 00:25:44.454 "uuid": "6aa90498-0ab6-48b1-ad05-1bb4e42b17ce", 00:25:44.454 "strip_size_kb": 64, 00:25:44.454 "state": "online", 00:25:44.454 "raid_level": "concat", 00:25:44.454 "superblock": false, 00:25:44.454 "num_base_bdevs": 3, 00:25:44.454 "num_base_bdevs_discovered": 3, 00:25:44.454 "num_base_bdevs_operational": 3, 00:25:44.454 "base_bdevs_list": [ 00:25:44.454 { 00:25:44.454 "name": "BaseBdev1", 00:25:44.454 "uuid": "3e0219c3-e172-451a-bc82-75a43464a0d2", 00:25:44.454 "is_configured": true, 00:25:44.454 "data_offset": 0, 00:25:44.454 "data_size": 65536 00:25:44.454 }, 00:25:44.454 { 00:25:44.454 "name": "BaseBdev2", 00:25:44.454 "uuid": "becb9cba-7a1b-4140-8713-0f88a68ce3db", 00:25:44.454 "is_configured": true, 00:25:44.454 "data_offset": 0, 00:25:44.454 "data_size": 65536 00:25:44.454 }, 00:25:44.454 { 00:25:44.454 "name": "BaseBdev3", 00:25:44.454 "uuid": "0735902e-9c45-4965-9625-491ef98b1055", 00:25:44.454 "is_configured": true, 00:25:44.454 "data_offset": 0, 00:25:44.454 "data_size": 65536 00:25:44.454 } 00:25:44.454 ] 00:25:44.454 } 00:25:44.454 } 00:25:44.454 }' 00:25:44.454 12:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:44.454 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:44.454 BaseBdev2 00:25:44.454 BaseBdev3' 00:25:44.454 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:44.454 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:44.454 12:28:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:44.761 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:44.761 "name": "BaseBdev1", 00:25:44.761 "aliases": [ 00:25:44.761 "3e0219c3-e172-451a-bc82-75a43464a0d2" 00:25:44.761 ], 00:25:44.761 "product_name": "Malloc disk", 00:25:44.761 "block_size": 512, 00:25:44.761 "num_blocks": 65536, 00:25:44.761 "uuid": "3e0219c3-e172-451a-bc82-75a43464a0d2", 00:25:44.761 "assigned_rate_limits": { 00:25:44.761 "rw_ios_per_sec": 0, 00:25:44.761 "rw_mbytes_per_sec": 0, 00:25:44.761 "r_mbytes_per_sec": 0, 00:25:44.761 "w_mbytes_per_sec": 0 00:25:44.761 }, 00:25:44.761 "claimed": true, 00:25:44.761 "claim_type": "exclusive_write", 00:25:44.761 "zoned": false, 00:25:44.761 "supported_io_types": { 00:25:44.761 "read": true, 00:25:44.761 "write": true, 00:25:44.761 "unmap": true, 00:25:44.761 "write_zeroes": true, 00:25:44.761 "flush": true, 00:25:44.761 "reset": true, 00:25:44.761 "compare": false, 00:25:44.761 "compare_and_write": false, 00:25:44.761 "abort": true, 00:25:44.761 "nvme_admin": false, 00:25:44.761 "nvme_io": false 00:25:44.761 }, 00:25:44.761 "memory_domains": [ 00:25:44.761 { 00:25:44.761 "dma_device_id": "system", 00:25:44.761 "dma_device_type": 1 00:25:44.761 }, 00:25:44.761 { 00:25:44.761 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:44.761 "dma_device_type": 2 00:25:44.761 } 00:25:44.761 ], 00:25:44.761 "driver_specific": {} 00:25:44.761 }' 00:25:44.761 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:44.761 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:45.020 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:45.278 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:45.278 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:45.278 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:45.278 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:45.536 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:45.536 "name": "BaseBdev2", 00:25:45.536 "aliases": [ 00:25:45.536 "becb9cba-7a1b-4140-8713-0f88a68ce3db" 00:25:45.536 ], 00:25:45.536 "product_name": "Malloc disk", 00:25:45.536 "block_size": 512, 00:25:45.536 "num_blocks": 65536, 00:25:45.536 "uuid": "becb9cba-7a1b-4140-8713-0f88a68ce3db", 00:25:45.536 "assigned_rate_limits": { 00:25:45.536 
"rw_ios_per_sec": 0, 00:25:45.536 "rw_mbytes_per_sec": 0, 00:25:45.536 "r_mbytes_per_sec": 0, 00:25:45.536 "w_mbytes_per_sec": 0 00:25:45.536 }, 00:25:45.536 "claimed": true, 00:25:45.536 "claim_type": "exclusive_write", 00:25:45.536 "zoned": false, 00:25:45.536 "supported_io_types": { 00:25:45.536 "read": true, 00:25:45.536 "write": true, 00:25:45.536 "unmap": true, 00:25:45.536 "write_zeroes": true, 00:25:45.536 "flush": true, 00:25:45.536 "reset": true, 00:25:45.536 "compare": false, 00:25:45.536 "compare_and_write": false, 00:25:45.536 "abort": true, 00:25:45.536 "nvme_admin": false, 00:25:45.536 "nvme_io": false 00:25:45.536 }, 00:25:45.536 "memory_domains": [ 00:25:45.536 { 00:25:45.536 "dma_device_id": "system", 00:25:45.536 "dma_device_type": 1 00:25:45.536 }, 00:25:45.536 { 00:25:45.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:45.536 "dma_device_type": 2 00:25:45.536 } 00:25:45.536 ], 00:25:45.536 "driver_specific": {} 00:25:45.536 }' 00:25:45.536 12:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:45.536 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:45.536 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:45.536 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:45.536 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:45.536 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:45.536 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:25:45.794 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:46.052 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:46.052 "name": "BaseBdev3", 00:25:46.052 "aliases": [ 00:25:46.052 "0735902e-9c45-4965-9625-491ef98b1055" 00:25:46.052 ], 00:25:46.052 "product_name": "Malloc disk", 00:25:46.052 "block_size": 512, 00:25:46.052 "num_blocks": 65536, 00:25:46.052 "uuid": "0735902e-9c45-4965-9625-491ef98b1055", 00:25:46.052 "assigned_rate_limits": { 00:25:46.052 "rw_ios_per_sec": 0, 00:25:46.052 "rw_mbytes_per_sec": 0, 00:25:46.052 "r_mbytes_per_sec": 0, 00:25:46.052 "w_mbytes_per_sec": 0 00:25:46.052 }, 00:25:46.052 "claimed": true, 00:25:46.052 "claim_type": "exclusive_write", 00:25:46.052 "zoned": false, 00:25:46.052 "supported_io_types": { 00:25:46.052 "read": true, 00:25:46.052 "write": true, 00:25:46.052 "unmap": true, 00:25:46.052 "write_zeroes": true, 00:25:46.052 "flush": true, 00:25:46.052 "reset": true, 00:25:46.052 "compare": false, 
00:25:46.052 "compare_and_write": false, 00:25:46.052 "abort": true, 00:25:46.052 "nvme_admin": false, 00:25:46.052 "nvme_io": false 00:25:46.052 }, 00:25:46.052 "memory_domains": [ 00:25:46.052 { 00:25:46.052 "dma_device_id": "system", 00:25:46.052 "dma_device_type": 1 00:25:46.052 }, 00:25:46.052 { 00:25:46.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:46.052 "dma_device_type": 2 00:25:46.052 } 00:25:46.052 ], 00:25:46.052 "driver_specific": {} 00:25:46.052 }' 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:46.310 12:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:46.568 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:46.568 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:46.568 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:46.568 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:46.568 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:46.826 [2024-06-07 12:28:10.306241] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:46.826 [2024-06-07 12:28:10.306498] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:46.826 [2024-06-07 12:28:10.306682] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:46.826 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.083 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:47.083 "name": "Existed_Raid", 00:25:47.083 "uuid": "6aa90498-0ab6-48b1-ad05-1bb4e42b17ce", 00:25:47.083 "strip_size_kb": 64, 00:25:47.083 "state": "offline", 00:25:47.083 "raid_level": "concat", 00:25:47.083 "superblock": false, 00:25:47.083 "num_base_bdevs": 3, 00:25:47.083 "num_base_bdevs_discovered": 2, 00:25:47.083 "num_base_bdevs_operational": 2, 00:25:47.083 "base_bdevs_list": [ 00:25:47.083 { 00:25:47.083 "name": null, 00:25:47.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:47.083 "is_configured": false, 00:25:47.083 "data_offset": 0, 00:25:47.083 "data_size": 65536 00:25:47.083 }, 00:25:47.083 { 00:25:47.083 "name": "BaseBdev2", 00:25:47.083 "uuid": "becb9cba-7a1b-4140-8713-0f88a68ce3db", 00:25:47.083 "is_configured": true, 00:25:47.083 "data_offset": 0, 00:25:47.083 "data_size": 65536 00:25:47.083 }, 00:25:47.083 { 00:25:47.083 "name": "BaseBdev3", 00:25:47.083 "uuid": "0735902e-9c45-4965-9625-491ef98b1055", 00:25:47.083 "is_configured": true, 00:25:47.083 "data_offset": 0, 00:25:47.083 "data_size": 65536 00:25:47.083 } 00:25:47.083 ] 00:25:47.083 }' 00:25:47.083 12:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:47.083 12:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:47.650 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:47.650 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:47.650 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.650 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:47.907 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:47.907 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:47.907 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:48.165 [2024-06-07 12:28:11.800507] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:48.422 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:48.422 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:48.422 12:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.422 12:28:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:48.680 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:48.680 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:48.680 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:25:48.680 [2024-06-07 12:28:12.286709] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:25:48.680 [2024-06-07 12:28:12.286996] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:25:48.938 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:25:49.196 BaseBdev2 00:25:49.196 12:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:25:49.196 12:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:25:49.196 12:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:49.196 12:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:25:49.196 12:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:49.196 12:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:49.196 12:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:49.824 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:50.084 [ 00:25:50.084 { 00:25:50.084 "name": "BaseBdev2", 00:25:50.084 "aliases": [ 00:25:50.084 "880025ab-a0ad-4c09-bbcd-72dc699c22a9" 00:25:50.084 ], 00:25:50.084 "product_name": "Malloc disk", 00:25:50.084 "block_size": 512, 00:25:50.084 "num_blocks": 65536, 00:25:50.084 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:50.084 "assigned_rate_limits": { 00:25:50.084 "rw_ios_per_sec": 0, 00:25:50.084 "rw_mbytes_per_sec": 0, 00:25:50.084 "r_mbytes_per_sec": 0, 
00:25:50.084 "w_mbytes_per_sec": 0 00:25:50.084 }, 00:25:50.084 "claimed": false, 00:25:50.084 "zoned": false, 00:25:50.084 "supported_io_types": { 00:25:50.084 "read": true, 00:25:50.084 "write": true, 00:25:50.084 "unmap": true, 00:25:50.084 "write_zeroes": true, 00:25:50.084 "flush": true, 00:25:50.084 "reset": true, 00:25:50.084 "compare": false, 00:25:50.084 "compare_and_write": false, 00:25:50.084 "abort": true, 00:25:50.084 "nvme_admin": false, 00:25:50.084 "nvme_io": false 00:25:50.084 }, 00:25:50.084 "memory_domains": [ 00:25:50.084 { 00:25:50.084 "dma_device_id": "system", 00:25:50.084 "dma_device_type": 1 00:25:50.084 }, 00:25:50.084 { 00:25:50.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:50.084 "dma_device_type": 2 00:25:50.084 } 00:25:50.084 ], 00:25:50.084 "driver_specific": {} 00:25:50.084 } 00:25:50.084 ] 00:25:50.084 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:25:50.084 12:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:25:50.084 12:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:25:50.084 12:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:25:50.342 BaseBdev3 00:25:50.342 12:28:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:25:50.342 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:25:50.342 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:50.342 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:25:50.342 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:50.342 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:50.342 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:50.599 12:28:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:25:50.857 [ 00:25:50.857 { 00:25:50.857 "name": "BaseBdev3", 00:25:50.857 "aliases": [ 00:25:50.857 "6cd09d25-3336-4113-ab3a-f41e7e38b795" 00:25:50.857 ], 00:25:50.857 "product_name": "Malloc disk", 00:25:50.857 "block_size": 512, 00:25:50.857 "num_blocks": 65536, 00:25:50.857 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:50.857 "assigned_rate_limits": { 00:25:50.857 "rw_ios_per_sec": 0, 00:25:50.857 "rw_mbytes_per_sec": 0, 00:25:50.857 "r_mbytes_per_sec": 0, 00:25:50.857 "w_mbytes_per_sec": 0 00:25:50.857 }, 00:25:50.857 "claimed": false, 00:25:50.857 "zoned": false, 00:25:50.857 "supported_io_types": { 00:25:50.857 "read": true, 00:25:50.857 "write": true, 00:25:50.857 "unmap": true, 00:25:50.857 "write_zeroes": true, 00:25:50.857 "flush": true, 00:25:50.857 "reset": true, 00:25:50.857 "compare": false, 00:25:50.857 "compare_and_write": false, 00:25:50.857 "abort": true, 00:25:50.857 "nvme_admin": false, 00:25:50.857 "nvme_io": false 00:25:50.857 }, 00:25:50.857 "memory_domains": [ 00:25:50.857 { 00:25:50.857 "dma_device_id": "system", 00:25:50.857 "dma_device_type": 1 00:25:50.857 }, 
00:25:50.857 { 00:25:50.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:50.857 "dma_device_type": 2 00:25:50.857 } 00:25:50.857 ], 00:25:50.857 "driver_specific": {} 00:25:50.857 } 00:25:50.857 ] 00:25:50.857 12:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:25:50.857 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:25:50.857 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:25:50.857 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:25:50.857 [2024-06-07 12:28:14.496335] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:50.857 [2024-06-07 12:28:14.496686] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:50.857 [2024-06-07 12:28:14.496824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:50.857 [2024-06-07 12:28:14.498877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:51.115 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.372 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:51.372 "name": "Existed_Raid", 00:25:51.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.372 "strip_size_kb": 64, 00:25:51.372 "state": "configuring", 00:25:51.372 "raid_level": "concat", 00:25:51.372 "superblock": false, 00:25:51.372 "num_base_bdevs": 3, 00:25:51.372 "num_base_bdevs_discovered": 2, 00:25:51.372 "num_base_bdevs_operational": 3, 00:25:51.372 "base_bdevs_list": [ 00:25:51.372 { 00:25:51.372 "name": "BaseBdev1", 00:25:51.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.372 "is_configured": false, 00:25:51.372 "data_offset": 0, 00:25:51.372 "data_size": 0 00:25:51.372 }, 00:25:51.372 { 00:25:51.372 "name": "BaseBdev2", 00:25:51.372 "uuid": 
"880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:51.372 "is_configured": true, 00:25:51.372 "data_offset": 0, 00:25:51.372 "data_size": 65536 00:25:51.372 }, 00:25:51.373 { 00:25:51.373 "name": "BaseBdev3", 00:25:51.373 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:51.373 "is_configured": true, 00:25:51.373 "data_offset": 0, 00:25:51.373 "data_size": 65536 00:25:51.373 } 00:25:51.373 ] 00:25:51.373 }' 00:25:51.373 12:28:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:51.373 12:28:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:51.937 [2024-06-07 12:28:15.532438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.937 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:52.195 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:52.195 "name": "Existed_Raid", 00:25:52.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.195 "strip_size_kb": 64, 00:25:52.195 "state": "configuring", 00:25:52.195 "raid_level": "concat", 00:25:52.195 "superblock": false, 00:25:52.195 "num_base_bdevs": 3, 00:25:52.195 "num_base_bdevs_discovered": 1, 00:25:52.195 "num_base_bdevs_operational": 3, 00:25:52.195 "base_bdevs_list": [ 00:25:52.195 { 00:25:52.195 "name": "BaseBdev1", 00:25:52.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.195 "is_configured": false, 00:25:52.195 "data_offset": 0, 00:25:52.195 "data_size": 0 00:25:52.195 }, 00:25:52.195 { 00:25:52.195 "name": null, 00:25:52.195 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:52.195 "is_configured": false, 00:25:52.195 "data_offset": 0, 00:25:52.195 "data_size": 65536 00:25:52.195 }, 00:25:52.195 { 00:25:52.195 "name": "BaseBdev3", 00:25:52.195 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:52.195 "is_configured": true, 00:25:52.195 "data_offset": 0, 00:25:52.195 "data_size": 65536 
00:25:52.195 } 00:25:52.195 ] 00:25:52.195 }' 00:25:52.195 12:28:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:52.195 12:28:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:52.774 12:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.774 12:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:25:53.032 12:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:25:53.032 12:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:25:53.290 [2024-06-07 12:28:16.846343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:53.290 BaseBdev1 00:25:53.290 12:28:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:25:53.290 12:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:25:53.290 12:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:53.290 12:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:25:53.290 12:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:53.290 12:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:53.290 12:28:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:53.548 12:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:53.806 [ 00:25:53.806 { 00:25:53.806 "name": "BaseBdev1", 00:25:53.806 "aliases": [ 00:25:53.806 "2814e2ce-c548-4bbb-af38-06b2b542d087" 00:25:53.806 ], 00:25:53.806 "product_name": "Malloc disk", 00:25:53.806 "block_size": 512, 00:25:53.806 "num_blocks": 65536, 00:25:53.806 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:25:53.806 "assigned_rate_limits": { 00:25:53.806 "rw_ios_per_sec": 0, 00:25:53.806 "rw_mbytes_per_sec": 0, 00:25:53.806 "r_mbytes_per_sec": 0, 00:25:53.806 "w_mbytes_per_sec": 0 00:25:53.806 }, 00:25:53.806 "claimed": true, 00:25:53.806 "claim_type": "exclusive_write", 00:25:53.806 "zoned": false, 00:25:53.806 "supported_io_types": { 00:25:53.806 "read": true, 00:25:53.806 "write": true, 00:25:53.806 "unmap": true, 00:25:53.806 "write_zeroes": true, 00:25:53.806 "flush": true, 00:25:53.806 "reset": true, 00:25:53.806 "compare": false, 00:25:53.806 "compare_and_write": false, 00:25:53.806 "abort": true, 00:25:53.806 "nvme_admin": false, 00:25:53.806 "nvme_io": false 00:25:53.806 }, 00:25:53.806 "memory_domains": [ 00:25:53.806 { 00:25:53.806 "dma_device_id": "system", 00:25:53.806 "dma_device_type": 1 00:25:53.806 }, 00:25:53.806 { 00:25:53.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:53.806 "dma_device_type": 2 00:25:53.806 } 00:25:53.806 ], 00:25:53.806 "driver_specific": {} 00:25:53.806 } 00:25:53.806 ] 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:25:53.806 
12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:53.806 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.065 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:54.065 "name": "Existed_Raid", 00:25:54.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.065 "strip_size_kb": 64, 00:25:54.065 "state": "configuring", 00:25:54.065 "raid_level": "concat", 00:25:54.065 "superblock": false, 00:25:54.065 "num_base_bdevs": 3, 00:25:54.065 "num_base_bdevs_discovered": 2, 00:25:54.065 "num_base_bdevs_operational": 3, 00:25:54.065 "base_bdevs_list": [ 00:25:54.065 { 00:25:54.065 "name": "BaseBdev1", 00:25:54.065 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:25:54.065 "is_configured": true, 00:25:54.065 "data_offset": 0, 00:25:54.065 "data_size": 65536 00:25:54.065 }, 00:25:54.065 { 00:25:54.065 "name": null, 00:25:54.065 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:54.065 "is_configured": false, 00:25:54.065 "data_offset": 0, 00:25:54.065 "data_size": 65536 00:25:54.065 }, 00:25:54.065 { 00:25:54.065 "name": "BaseBdev3", 00:25:54.065 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:54.065 "is_configured": true, 00:25:54.065 "data_offset": 0, 00:25:54.065 "data_size": 65536 00:25:54.065 } 00:25:54.065 ] 00:25:54.065 }' 00:25:54.065 12:28:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:54.065 12:28:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:54.631 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.631 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:25:54.888 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:25:54.888 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:25:55.147 [2024-06-07 12:28:18.650694] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.147 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:55.404 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.404 "name": "Existed_Raid", 00:25:55.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.404 "strip_size_kb": 64, 00:25:55.404 "state": "configuring", 00:25:55.404 "raid_level": "concat", 00:25:55.404 "superblock": false, 00:25:55.404 "num_base_bdevs": 3, 00:25:55.404 "num_base_bdevs_discovered": 1, 00:25:55.404 "num_base_bdevs_operational": 3, 00:25:55.404 "base_bdevs_list": [ 00:25:55.404 { 00:25:55.404 "name": "BaseBdev1", 00:25:55.404 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:25:55.404 "is_configured": true, 00:25:55.404 "data_offset": 0, 00:25:55.404 "data_size": 65536 00:25:55.404 }, 00:25:55.404 { 00:25:55.404 "name": null, 00:25:55.404 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:55.404 "is_configured": false, 00:25:55.404 "data_offset": 0, 00:25:55.404 "data_size": 65536 00:25:55.404 }, 00:25:55.404 { 00:25:55.404 "name": null, 00:25:55.404 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:55.404 "is_configured": false, 00:25:55.404 "data_offset": 0, 00:25:55.404 "data_size": 65536 00:25:55.404 } 00:25:55.404 ] 00:25:55.404 }' 00:25:55.404 12:28:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.404 12:28:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:55.969 12:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.969 12:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:25:56.226 12:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:25:56.227 12:28:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev 
Existed_Raid BaseBdev3 00:25:56.484 [2024-06-07 12:28:20.058870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.484 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:56.741 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.741 "name": "Existed_Raid", 00:25:56.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.741 "strip_size_kb": 64, 00:25:56.741 "state": "configuring", 00:25:56.741 "raid_level": "concat", 00:25:56.741 "superblock": false, 00:25:56.741 "num_base_bdevs": 3, 00:25:56.741 "num_base_bdevs_discovered": 2, 00:25:56.741 "num_base_bdevs_operational": 3, 00:25:56.741 "base_bdevs_list": [ 00:25:56.741 { 00:25:56.741 "name": "BaseBdev1", 00:25:56.741 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:25:56.741 "is_configured": true, 00:25:56.741 "data_offset": 0, 00:25:56.741 "data_size": 65536 00:25:56.741 }, 00:25:56.741 { 00:25:56.741 "name": null, 00:25:56.741 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:56.741 "is_configured": false, 00:25:56.741 "data_offset": 0, 00:25:56.741 "data_size": 65536 00:25:56.741 }, 00:25:56.741 { 00:25:56.741 "name": "BaseBdev3", 00:25:56.741 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:56.741 "is_configured": true, 00:25:56.741 "data_offset": 0, 00:25:56.741 "data_size": 65536 00:25:56.741 } 00:25:56.741 ] 00:25:56.741 }' 00:25:56.741 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.741 12:28:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:57.307 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:25:57.307 12:28:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.565 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:25:57.565 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:57.823 [2024-06-07 12:28:21.343043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.823 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:58.082 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:58.082 "name": "Existed_Raid", 00:25:58.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.082 "strip_size_kb": 64, 00:25:58.082 "state": "configuring", 00:25:58.082 "raid_level": "concat", 00:25:58.082 "superblock": false, 00:25:58.082 "num_base_bdevs": 3, 00:25:58.082 "num_base_bdevs_discovered": 1, 00:25:58.082 "num_base_bdevs_operational": 3, 00:25:58.082 "base_bdevs_list": [ 00:25:58.082 { 00:25:58.082 "name": null, 00:25:58.082 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:25:58.082 "is_configured": false, 00:25:58.082 "data_offset": 0, 00:25:58.082 "data_size": 65536 00:25:58.082 }, 00:25:58.082 { 00:25:58.082 "name": null, 00:25:58.082 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:58.082 "is_configured": false, 00:25:58.082 "data_offset": 0, 00:25:58.082 "data_size": 65536 00:25:58.082 }, 00:25:58.082 { 00:25:58.082 "name": "BaseBdev3", 00:25:58.082 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:58.082 "is_configured": true, 00:25:58.082 "data_offset": 0, 00:25:58.082 "data_size": 65536 00:25:58.082 } 00:25:58.082 ] 00:25:58.082 }' 00:25:58.082 12:28:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:58.082 12:28:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:58.648 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:25:58.648 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.908 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:25:58.908 12:28:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:25:59.166 [2024-06-07 12:28:22.655302] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.166 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:59.431 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.431 "name": "Existed_Raid", 00:25:59.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.431 "strip_size_kb": 64, 00:25:59.431 "state": "configuring", 00:25:59.431 "raid_level": "concat", 00:25:59.431 "superblock": false, 00:25:59.431 "num_base_bdevs": 3, 00:25:59.431 "num_base_bdevs_discovered": 2, 00:25:59.431 "num_base_bdevs_operational": 3, 00:25:59.431 "base_bdevs_list": [ 00:25:59.431 { 00:25:59.431 "name": null, 00:25:59.431 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:25:59.431 "is_configured": false, 00:25:59.431 "data_offset": 0, 00:25:59.431 "data_size": 65536 00:25:59.431 }, 00:25:59.431 { 00:25:59.431 "name": "BaseBdev2", 00:25:59.431 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:25:59.431 "is_configured": true, 00:25:59.431 "data_offset": 0, 00:25:59.431 "data_size": 65536 00:25:59.431 }, 00:25:59.431 { 00:25:59.431 "name": "BaseBdev3", 00:25:59.431 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:25:59.431 "is_configured": true, 00:25:59.431 "data_offset": 0, 00:25:59.431 "data_size": 65536 00:25:59.431 } 00:25:59.431 ] 00:25:59.431 }' 00:25:59.431 12:28:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.431 12:28:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:26:00.009 12:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:26:00.009 12:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.267 12:28:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:26:00.267 12:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.267 12:28:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:26:00.525 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2814e2ce-c548-4bbb-af38-06b2b542d087 00:26:00.784 [2024-06-07 12:28:24.301252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:26:00.784 [2024-06-07 12:28:24.301554] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:26:00.784 [2024-06-07 12:28:24.301609] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:26:00.784 [2024-06-07 12:28:24.301870] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:26:00.784 [2024-06-07 12:28:24.302254] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:26:00.784 [2024-06-07 12:28:24.302385] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000007880 00:26:00.784 [2024-06-07 12:28:24.302672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:00.784 NewBaseBdev 00:26:00.784 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:26:00.784 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:26:00.784 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:00.784 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:26:00.784 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:00.784 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:00.784 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:01.041 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:26:01.298 [ 00:26:01.298 { 00:26:01.298 "name": "NewBaseBdev", 00:26:01.298 "aliases": [ 00:26:01.298 "2814e2ce-c548-4bbb-af38-06b2b542d087" 00:26:01.298 ], 00:26:01.298 "product_name": "Malloc disk", 00:26:01.298 "block_size": 512, 00:26:01.298 "num_blocks": 65536, 00:26:01.298 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:26:01.298 "assigned_rate_limits": { 00:26:01.298 "rw_ios_per_sec": 0, 00:26:01.298 "rw_mbytes_per_sec": 0, 00:26:01.298 "r_mbytes_per_sec": 0, 00:26:01.298 "w_mbytes_per_sec": 0 00:26:01.298 }, 00:26:01.298 "claimed": true, 00:26:01.298 "claim_type": "exclusive_write", 00:26:01.298 "zoned": false, 00:26:01.298 "supported_io_types": { 00:26:01.298 "read": true, 00:26:01.298 "write": true, 00:26:01.298 "unmap": true, 00:26:01.298 "write_zeroes": true, 00:26:01.298 "flush": true, 00:26:01.298 "reset": true, 00:26:01.298 "compare": false, 00:26:01.298 "compare_and_write": false, 
00:26:01.298 "abort": true, 00:26:01.298 "nvme_admin": false, 00:26:01.298 "nvme_io": false 00:26:01.298 }, 00:26:01.298 "memory_domains": [ 00:26:01.298 { 00:26:01.298 "dma_device_id": "system", 00:26:01.298 "dma_device_type": 1 00:26:01.298 }, 00:26:01.298 { 00:26:01.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:01.298 "dma_device_type": 2 00:26:01.298 } 00:26:01.298 ], 00:26:01.298 "driver_specific": {} 00:26:01.298 } 00:26:01.298 ] 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:01.298 12:28:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.554 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.554 "name": "Existed_Raid", 00:26:01.554 "uuid": "6bae38a2-650e-45fc-a384-ff80e2e121c7", 00:26:01.554 "strip_size_kb": 64, 00:26:01.554 "state": "online", 00:26:01.554 "raid_level": "concat", 00:26:01.554 "superblock": false, 00:26:01.554 "num_base_bdevs": 3, 00:26:01.554 "num_base_bdevs_discovered": 3, 00:26:01.554 "num_base_bdevs_operational": 3, 00:26:01.554 "base_bdevs_list": [ 00:26:01.554 { 00:26:01.554 "name": "NewBaseBdev", 00:26:01.554 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:26:01.554 "is_configured": true, 00:26:01.554 "data_offset": 0, 00:26:01.554 "data_size": 65536 00:26:01.554 }, 00:26:01.554 { 00:26:01.554 "name": "BaseBdev2", 00:26:01.554 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:26:01.554 "is_configured": true, 00:26:01.554 "data_offset": 0, 00:26:01.554 "data_size": 65536 00:26:01.554 }, 00:26:01.554 { 00:26:01.554 "name": "BaseBdev3", 00:26:01.554 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:26:01.554 "is_configured": true, 00:26:01.554 "data_offset": 0, 00:26:01.554 "data_size": 65536 00:26:01.554 } 00:26:01.554 ] 00:26:01.554 }' 00:26:01.554 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.554 12:28:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 
00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:02.120 12:28:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:02.684 [2024-06-07 12:28:26.093778] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:02.684 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:02.684 "name": "Existed_Raid", 00:26:02.684 "aliases": [ 00:26:02.684 "6bae38a2-650e-45fc-a384-ff80e2e121c7" 00:26:02.684 ], 00:26:02.684 "product_name": "Raid Volume", 00:26:02.684 "block_size": 512, 00:26:02.684 "num_blocks": 196608, 00:26:02.684 "uuid": "6bae38a2-650e-45fc-a384-ff80e2e121c7", 00:26:02.684 "assigned_rate_limits": { 00:26:02.684 "rw_ios_per_sec": 0, 00:26:02.684 "rw_mbytes_per_sec": 0, 00:26:02.684 "r_mbytes_per_sec": 0, 00:26:02.684 "w_mbytes_per_sec": 0 00:26:02.684 }, 00:26:02.684 "claimed": false, 00:26:02.684 "zoned": false, 00:26:02.684 "supported_io_types": { 00:26:02.684 "read": true, 00:26:02.684 "write": true, 00:26:02.684 "unmap": true, 00:26:02.684 "write_zeroes": true, 00:26:02.684 "flush": true, 00:26:02.684 "reset": true, 00:26:02.684 "compare": false, 00:26:02.684 "compare_and_write": false, 00:26:02.684 "abort": false, 00:26:02.684 "nvme_admin": false, 00:26:02.684 "nvme_io": false 00:26:02.684 }, 00:26:02.684 "memory_domains": [ 00:26:02.684 { 00:26:02.684 "dma_device_id": "system", 00:26:02.684 "dma_device_type": 1 00:26:02.684 }, 00:26:02.684 { 00:26:02.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.684 "dma_device_type": 2 00:26:02.684 }, 00:26:02.684 { 00:26:02.684 "dma_device_id": "system", 00:26:02.684 "dma_device_type": 1 00:26:02.684 }, 00:26:02.684 { 00:26:02.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.684 "dma_device_type": 2 00:26:02.684 }, 00:26:02.684 { 00:26:02.684 "dma_device_id": "system", 00:26:02.684 "dma_device_type": 1 00:26:02.684 }, 00:26:02.684 { 00:26:02.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.684 "dma_device_type": 2 00:26:02.684 } 00:26:02.684 ], 00:26:02.684 "driver_specific": { 00:26:02.684 "raid": { 00:26:02.684 "uuid": "6bae38a2-650e-45fc-a384-ff80e2e121c7", 00:26:02.684 "strip_size_kb": 64, 00:26:02.684 "state": "online", 00:26:02.684 "raid_level": "concat", 00:26:02.684 "superblock": false, 00:26:02.684 "num_base_bdevs": 3, 00:26:02.684 "num_base_bdevs_discovered": 3, 00:26:02.684 "num_base_bdevs_operational": 3, 00:26:02.684 "base_bdevs_list": [ 00:26:02.684 { 00:26:02.684 "name": "NewBaseBdev", 00:26:02.684 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:26:02.684 "is_configured": true, 00:26:02.684 "data_offset": 0, 00:26:02.684 "data_size": 65536 00:26:02.684 }, 00:26:02.684 { 00:26:02.684 "name": "BaseBdev2", 00:26:02.684 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:26:02.684 "is_configured": true, 00:26:02.684 "data_offset": 0, 
00:26:02.684 "data_size": 65536 00:26:02.684 }, 00:26:02.684 { 00:26:02.684 "name": "BaseBdev3", 00:26:02.684 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:26:02.684 "is_configured": true, 00:26:02.684 "data_offset": 0, 00:26:02.684 "data_size": 65536 00:26:02.684 } 00:26:02.684 ] 00:26:02.684 } 00:26:02.684 } 00:26:02.684 }' 00:26:02.684 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:02.684 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:26:02.684 BaseBdev2 00:26:02.684 BaseBdev3' 00:26:02.684 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:02.684 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:26:02.684 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:02.942 "name": "NewBaseBdev", 00:26:02.942 "aliases": [ 00:26:02.942 "2814e2ce-c548-4bbb-af38-06b2b542d087" 00:26:02.942 ], 00:26:02.942 "product_name": "Malloc disk", 00:26:02.942 "block_size": 512, 00:26:02.942 "num_blocks": 65536, 00:26:02.942 "uuid": "2814e2ce-c548-4bbb-af38-06b2b542d087", 00:26:02.942 "assigned_rate_limits": { 00:26:02.942 "rw_ios_per_sec": 0, 00:26:02.942 "rw_mbytes_per_sec": 0, 00:26:02.942 "r_mbytes_per_sec": 0, 00:26:02.942 "w_mbytes_per_sec": 0 00:26:02.942 }, 00:26:02.942 "claimed": true, 00:26:02.942 "claim_type": "exclusive_write", 00:26:02.942 "zoned": false, 00:26:02.942 "supported_io_types": { 00:26:02.942 "read": true, 00:26:02.942 "write": true, 00:26:02.942 "unmap": true, 00:26:02.942 "write_zeroes": true, 00:26:02.942 "flush": true, 00:26:02.942 "reset": true, 00:26:02.942 "compare": false, 00:26:02.942 "compare_and_write": false, 00:26:02.942 "abort": true, 00:26:02.942 "nvme_admin": false, 00:26:02.942 "nvme_io": false 00:26:02.942 }, 00:26:02.942 "memory_domains": [ 00:26:02.942 { 00:26:02.942 "dma_device_id": "system", 00:26:02.942 "dma_device_type": 1 00:26:02.942 }, 00:26:02.942 { 00:26:02.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.942 "dma_device_type": 2 00:26:02.942 } 00:26:02.942 ], 00:26:02.942 "driver_specific": {} 00:26:02.942 }' 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:02.942 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:03.201 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:03.201 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.201 12:28:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.201 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:03.201 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:03.201 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:03.201 12:28:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:03.459 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:03.459 "name": "BaseBdev2", 00:26:03.459 "aliases": [ 00:26:03.459 "880025ab-a0ad-4c09-bbcd-72dc699c22a9" 00:26:03.459 ], 00:26:03.459 "product_name": "Malloc disk", 00:26:03.459 "block_size": 512, 00:26:03.459 "num_blocks": 65536, 00:26:03.459 "uuid": "880025ab-a0ad-4c09-bbcd-72dc699c22a9", 00:26:03.459 "assigned_rate_limits": { 00:26:03.459 "rw_ios_per_sec": 0, 00:26:03.459 "rw_mbytes_per_sec": 0, 00:26:03.459 "r_mbytes_per_sec": 0, 00:26:03.459 "w_mbytes_per_sec": 0 00:26:03.459 }, 00:26:03.459 "claimed": true, 00:26:03.459 "claim_type": "exclusive_write", 00:26:03.459 "zoned": false, 00:26:03.459 "supported_io_types": { 00:26:03.459 "read": true, 00:26:03.459 "write": true, 00:26:03.459 "unmap": true, 00:26:03.459 "write_zeroes": true, 00:26:03.459 "flush": true, 00:26:03.459 "reset": true, 00:26:03.459 "compare": false, 00:26:03.459 "compare_and_write": false, 00:26:03.459 "abort": true, 00:26:03.459 "nvme_admin": false, 00:26:03.459 "nvme_io": false 00:26:03.459 }, 00:26:03.459 "memory_domains": [ 00:26:03.459 { 00:26:03.459 "dma_device_id": "system", 00:26:03.459 "dma_device_type": 1 00:26:03.459 }, 00:26:03.459 { 00:26:03.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:03.459 "dma_device_type": 2 00:26:03.459 } 00:26:03.459 ], 00:26:03.459 "driver_specific": {} 00:26:03.459 }' 00:26:03.459 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:03.717 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.974 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.974 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:03.974 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:03.974 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:26:03.974 
12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:04.232 "name": "BaseBdev3", 00:26:04.232 "aliases": [ 00:26:04.232 "6cd09d25-3336-4113-ab3a-f41e7e38b795" 00:26:04.232 ], 00:26:04.232 "product_name": "Malloc disk", 00:26:04.232 "block_size": 512, 00:26:04.232 "num_blocks": 65536, 00:26:04.232 "uuid": "6cd09d25-3336-4113-ab3a-f41e7e38b795", 00:26:04.232 "assigned_rate_limits": { 00:26:04.232 "rw_ios_per_sec": 0, 00:26:04.232 "rw_mbytes_per_sec": 0, 00:26:04.232 "r_mbytes_per_sec": 0, 00:26:04.232 "w_mbytes_per_sec": 0 00:26:04.232 }, 00:26:04.232 "claimed": true, 00:26:04.232 "claim_type": "exclusive_write", 00:26:04.232 "zoned": false, 00:26:04.232 "supported_io_types": { 00:26:04.232 "read": true, 00:26:04.232 "write": true, 00:26:04.232 "unmap": true, 00:26:04.232 "write_zeroes": true, 00:26:04.232 "flush": true, 00:26:04.232 "reset": true, 00:26:04.232 "compare": false, 00:26:04.232 "compare_and_write": false, 00:26:04.232 "abort": true, 00:26:04.232 "nvme_admin": false, 00:26:04.232 "nvme_io": false 00:26:04.232 }, 00:26:04.232 "memory_domains": [ 00:26:04.232 { 00:26:04.232 "dma_device_id": "system", 00:26:04.232 "dma_device_type": 1 00:26:04.232 }, 00:26:04.232 { 00:26:04.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:04.232 "dma_device_type": 2 00:26:04.232 } 00:26:04.232 ], 00:26:04.232 "driver_specific": {} 00:26:04.232 }' 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:04.232 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:04.491 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:04.491 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:04.491 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:04.491 12:28:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:04.749 [2024-06-07 12:28:28.209902] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:04.749 [2024-06-07 12:28:28.210149] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:04.749 [2024-06-07 12:28:28.210376] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:04.749 [2024-06-07 12:28:28.210547] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:04.749 [2024-06-07 12:28:28.210643] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x616000007880 name Existed_Raid, state offline 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 205544 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 205544 ']' 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 205544 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 205544 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 205544' 00:26:04.749 killing process with pid 205544 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 205544 00:26:04.749 [2024-06-07 12:28:28.259594] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:04.749 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 205544 00:26:04.749 [2024-06-07 12:28:28.319512] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:26:05.316 00:26:05.316 real 0m30.005s 00:26:05.316 user 0m55.296s 00:26:05.316 sys 0m4.695s 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:26:05.316 ************************************ 00:26:05.316 END TEST raid_state_function_test 00:26:05.316 ************************************ 00:26:05.316 12:28:28 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:26:05.316 12:28:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:26:05.316 12:28:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:05.316 12:28:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:05.316 ************************************ 00:26:05.316 START TEST raid_state_function_test_sb 00:26:05.316 ************************************ 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 true 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:26:05.316 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=206529 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 206529' 00:26:05.317 Process raid pid: 206529 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 206529 /var/tmp/spdk-raid.sock 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 206529 ']' 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:05.317 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
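For orientation, the startup sequence traced just above (bdev_raid.sh@243-@246) amounts to the sketch below. The binary path, socket, and flags are exactly the ones in the trace; the backgrounding and pid capture are a paraphrase of the script rather than its literal text, and waitforlisten is the helper from common/autotest_common.sh.

# Launch the SPDK app that will host the raid bdevs, then block until its
# RPC socket answers before issuing any bdev_raid RPCs against it.
/home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc \
    -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!
waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock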
00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:05.317 12:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:05.317 [2024-06-07 12:28:28.770179] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:26:05.317 [2024-06-07 12:28:28.770657] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:05.317 [2024-06-07 12:28:28.913051] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:05.575 [2024-06-07 12:28:29.008126] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:05.575 [2024-06-07 12:28:29.091348] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:06.140 12:28:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:06.140 12:28:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:26:06.140 12:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:26:06.398 [2024-06-07 12:28:30.010526] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:06.398 [2024-06-07 12:28:30.010855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:06.398 [2024-06-07 12:28:30.010988] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:06.398 [2024-06-07 12:28:30.011068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:06.398 [2024-06-07 12:28:30.011155] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:26:06.398 [2024-06-07 12:28:30.011289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.398 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:06.398 12:28:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.655 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.655 "name": "Existed_Raid", 00:26:06.655 "uuid": "985bf7eb-edb1-461c-a3a3-7e0ac066701e", 00:26:06.655 "strip_size_kb": 64, 00:26:06.655 "state": "configuring", 00:26:06.655 "raid_level": "concat", 00:26:06.655 "superblock": true, 00:26:06.655 "num_base_bdevs": 3, 00:26:06.655 "num_base_bdevs_discovered": 0, 00:26:06.655 "num_base_bdevs_operational": 3, 00:26:06.655 "base_bdevs_list": [ 00:26:06.655 { 00:26:06.655 "name": "BaseBdev1", 00:26:06.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.655 "is_configured": false, 00:26:06.655 "data_offset": 0, 00:26:06.655 "data_size": 0 00:26:06.655 }, 00:26:06.655 { 00:26:06.655 "name": "BaseBdev2", 00:26:06.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.655 "is_configured": false, 00:26:06.655 "data_offset": 0, 00:26:06.655 "data_size": 0 00:26:06.655 }, 00:26:06.655 { 00:26:06.655 "name": "BaseBdev3", 00:26:06.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.655 "is_configured": false, 00:26:06.655 "data_offset": 0, 00:26:06.655 "data_size": 0 00:26:06.655 } 00:26:06.655 ] 00:26:06.655 }' 00:26:06.655 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.655 12:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:07.587 12:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:07.587 [2024-06-07 12:28:31.130584] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:07.587 [2024-06-07 12:28:31.130857] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:26:07.587 12:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:26:07.845 [2024-06-07 12:28:31.346665] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:07.845 [2024-06-07 12:28:31.346984] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:07.845 [2024-06-07 12:28:31.347081] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:07.845 [2024-06-07 12:28:31.347143] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:07.845 [2024-06-07 12:28:31.347243] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:26:07.845 [2024-06-07 12:28:31.347326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:26:07.845 12:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:26:08.103 [2024-06-07 12:28:31.650506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:08.103 BaseBdev1 00:26:08.103 12:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:08.103 
12:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:26:08.103 12:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:08.103 12:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:26:08.103 12:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:08.103 12:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:08.103 12:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:08.361 12:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:08.619 [ 00:26:08.619 { 00:26:08.619 "name": "BaseBdev1", 00:26:08.619 "aliases": [ 00:26:08.619 "28b1a258-4639-46e4-adf1-16e411c0c3f3" 00:26:08.619 ], 00:26:08.619 "product_name": "Malloc disk", 00:26:08.619 "block_size": 512, 00:26:08.619 "num_blocks": 65536, 00:26:08.619 "uuid": "28b1a258-4639-46e4-adf1-16e411c0c3f3", 00:26:08.619 "assigned_rate_limits": { 00:26:08.619 "rw_ios_per_sec": 0, 00:26:08.619 "rw_mbytes_per_sec": 0, 00:26:08.619 "r_mbytes_per_sec": 0, 00:26:08.619 "w_mbytes_per_sec": 0 00:26:08.619 }, 00:26:08.619 "claimed": true, 00:26:08.619 "claim_type": "exclusive_write", 00:26:08.619 "zoned": false, 00:26:08.619 "supported_io_types": { 00:26:08.619 "read": true, 00:26:08.619 "write": true, 00:26:08.619 "unmap": true, 00:26:08.619 "write_zeroes": true, 00:26:08.619 "flush": true, 00:26:08.619 "reset": true, 00:26:08.619 "compare": false, 00:26:08.619 "compare_and_write": false, 00:26:08.619 "abort": true, 00:26:08.619 "nvme_admin": false, 00:26:08.619 "nvme_io": false 00:26:08.619 }, 00:26:08.619 "memory_domains": [ 00:26:08.619 { 00:26:08.619 "dma_device_id": "system", 00:26:08.619 "dma_device_type": 1 00:26:08.619 }, 00:26:08.619 { 00:26:08.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:08.619 "dma_device_type": 2 00:26:08.619 } 00:26:08.619 ], 00:26:08.619 "driver_specific": {} 00:26:08.619 } 00:26:08.619 ] 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:08.619 12:28:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:08.619 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.878 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:08.878 "name": "Existed_Raid", 00:26:08.878 "uuid": "a7d4eefe-2ab0-4306-b2e6-ec2e8267a5c1", 00:26:08.878 "strip_size_kb": 64, 00:26:08.878 "state": "configuring", 00:26:08.878 "raid_level": "concat", 00:26:08.878 "superblock": true, 00:26:08.878 "num_base_bdevs": 3, 00:26:08.878 "num_base_bdevs_discovered": 1, 00:26:08.878 "num_base_bdevs_operational": 3, 00:26:08.878 "base_bdevs_list": [ 00:26:08.878 { 00:26:08.878 "name": "BaseBdev1", 00:26:08.878 "uuid": "28b1a258-4639-46e4-adf1-16e411c0c3f3", 00:26:08.878 "is_configured": true, 00:26:08.878 "data_offset": 2048, 00:26:08.878 "data_size": 63488 00:26:08.878 }, 00:26:08.878 { 00:26:08.878 "name": "BaseBdev2", 00:26:08.878 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.878 "is_configured": false, 00:26:08.878 "data_offset": 0, 00:26:08.878 "data_size": 0 00:26:08.878 }, 00:26:08.878 { 00:26:08.878 "name": "BaseBdev3", 00:26:08.878 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.878 "is_configured": false, 00:26:08.878 "data_offset": 0, 00:26:08.878 "data_size": 0 00:26:08.878 } 00:26:08.878 ] 00:26:08.878 }' 00:26:08.878 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.878 12:28:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:09.474 12:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:09.731 [2024-06-07 12:28:33.162800] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:09.731 [2024-06-07 12:28:33.163146] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:26:09.731 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:26:09.990 [2024-06-07 12:28:33.454961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:09.990 [2024-06-07 12:28:33.457580] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:09.990 [2024-06-07 12:28:33.457809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:09.990 [2024-06-07 12:28:33.457932] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:26:09.990 [2024-06-07 12:28:33.458061] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 
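The verify_raid_bdev_state call traced next reduces to the assertions sketched here; the RPC and the jq select filter appear verbatim in the trace, while the inline comparisons are a condensed restatement of the helper's checks. With only BaseBdev1 created and claimed so far, Existed_Raid must still report "configuring", with one of three base bdevs discovered.

rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
tmp=$("$rpc" -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid")')
# a concat raid stays "configuring" until every base bdev is discovered
[[ $(jq -r .state <<< "$tmp") == configuring ]]
[[ $(jq -r .num_base_bdevs_discovered <<< "$tmp") -eq 1 ]]
[[ $(jq -r .num_base_bdevs_operational <<< "$tmp") -eq 3 ]]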
00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.990 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:10.249 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:10.249 "name": "Existed_Raid", 00:26:10.249 "uuid": "0091b0af-3471-4342-9ca5-72a9ac96d883", 00:26:10.249 "strip_size_kb": 64, 00:26:10.249 "state": "configuring", 00:26:10.249 "raid_level": "concat", 00:26:10.249 "superblock": true, 00:26:10.249 "num_base_bdevs": 3, 00:26:10.249 "num_base_bdevs_discovered": 1, 00:26:10.249 "num_base_bdevs_operational": 3, 00:26:10.249 "base_bdevs_list": [ 00:26:10.249 { 00:26:10.249 "name": "BaseBdev1", 00:26:10.249 "uuid": "28b1a258-4639-46e4-adf1-16e411c0c3f3", 00:26:10.249 "is_configured": true, 00:26:10.249 "data_offset": 2048, 00:26:10.249 "data_size": 63488 00:26:10.249 }, 00:26:10.249 { 00:26:10.249 "name": "BaseBdev2", 00:26:10.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.249 "is_configured": false, 00:26:10.249 "data_offset": 0, 00:26:10.249 "data_size": 0 00:26:10.249 }, 00:26:10.249 { 00:26:10.249 "name": "BaseBdev3", 00:26:10.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.249 "is_configured": false, 00:26:10.249 "data_offset": 0, 00:26:10.249 "data_size": 0 00:26:10.249 } 00:26:10.249 ] 00:26:10.249 }' 00:26:10.249 12:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:10.249 12:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:10.816 12:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:26:11.074 [2024-06-07 12:28:34.539347] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:11.074 BaseBdev2 00:26:11.074 12:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:11.074 12:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:26:11.074 12:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:11.074 12:28:34 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # local i 00:26:11.074 12:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:11.074 12:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:11.074 12:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:11.333 12:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:11.631 [ 00:26:11.631 { 00:26:11.631 "name": "BaseBdev2", 00:26:11.631 "aliases": [ 00:26:11.631 "4754f79b-dfda-4dff-ad82-d060c6a4daa4" 00:26:11.631 ], 00:26:11.631 "product_name": "Malloc disk", 00:26:11.631 "block_size": 512, 00:26:11.631 "num_blocks": 65536, 00:26:11.631 "uuid": "4754f79b-dfda-4dff-ad82-d060c6a4daa4", 00:26:11.631 "assigned_rate_limits": { 00:26:11.631 "rw_ios_per_sec": 0, 00:26:11.631 "rw_mbytes_per_sec": 0, 00:26:11.631 "r_mbytes_per_sec": 0, 00:26:11.631 "w_mbytes_per_sec": 0 00:26:11.631 }, 00:26:11.631 "claimed": true, 00:26:11.631 "claim_type": "exclusive_write", 00:26:11.631 "zoned": false, 00:26:11.631 "supported_io_types": { 00:26:11.631 "read": true, 00:26:11.631 "write": true, 00:26:11.631 "unmap": true, 00:26:11.631 "write_zeroes": true, 00:26:11.631 "flush": true, 00:26:11.631 "reset": true, 00:26:11.631 "compare": false, 00:26:11.631 "compare_and_write": false, 00:26:11.631 "abort": true, 00:26:11.631 "nvme_admin": false, 00:26:11.631 "nvme_io": false 00:26:11.631 }, 00:26:11.631 "memory_domains": [ 00:26:11.631 { 00:26:11.631 "dma_device_id": "system", 00:26:11.631 "dma_device_type": 1 00:26:11.631 }, 00:26:11.631 { 00:26:11.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:11.631 "dma_device_type": 2 00:26:11.631 } 00:26:11.631 ], 00:26:11.631 "driver_specific": {} 00:26:11.631 } 00:26:11.631 ] 00:26:11.631 12:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:11.632 
12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.632 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:11.912 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:11.912 "name": "Existed_Raid", 00:26:11.912 "uuid": "0091b0af-3471-4342-9ca5-72a9ac96d883", 00:26:11.912 "strip_size_kb": 64, 00:26:11.912 "state": "configuring", 00:26:11.912 "raid_level": "concat", 00:26:11.912 "superblock": true, 00:26:11.912 "num_base_bdevs": 3, 00:26:11.912 "num_base_bdevs_discovered": 2, 00:26:11.912 "num_base_bdevs_operational": 3, 00:26:11.912 "base_bdevs_list": [ 00:26:11.912 { 00:26:11.912 "name": "BaseBdev1", 00:26:11.912 "uuid": "28b1a258-4639-46e4-adf1-16e411c0c3f3", 00:26:11.912 "is_configured": true, 00:26:11.912 "data_offset": 2048, 00:26:11.912 "data_size": 63488 00:26:11.912 }, 00:26:11.912 { 00:26:11.912 "name": "BaseBdev2", 00:26:11.912 "uuid": "4754f79b-dfda-4dff-ad82-d060c6a4daa4", 00:26:11.912 "is_configured": true, 00:26:11.912 "data_offset": 2048, 00:26:11.912 "data_size": 63488 00:26:11.912 }, 00:26:11.912 { 00:26:11.912 "name": "BaseBdev3", 00:26:11.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.912 "is_configured": false, 00:26:11.912 "data_offset": 0, 00:26:11.912 "data_size": 0 00:26:11.912 } 00:26:11.912 ] 00:26:11.912 }' 00:26:11.912 12:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:11.912 12:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:12.478 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:26:12.737 [2024-06-07 12:28:36.233191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:12.737 [2024-06-07 12:28:36.233690] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:26:12.737 [2024-06-07 12:28:36.233801] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:26:12.737 [2024-06-07 12:28:36.233980] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000021f0 00:26:12.737 [2024-06-07 12:28:36.234347] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:26:12.737 [2024-06-07 12:28:36.234467] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:26:12.737 [2024-06-07 12:28:36.234678] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:12.737 BaseBdev3 00:26:12.737 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:26:12.737 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:26:12.737 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:12.737 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:26:12.737 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:12.737 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 
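The waitforbdev helper being traced here (common/autotest_common.sh@898-@906) is, in essence, the function below. This is a condensed reconstruction from the trace, not the helper's full source: the real one also declares a retry counter i, and the socket path is hard-coded here to match this run.

# Condensed reconstruction of waitforbdev as exercised in this trace:
# wait for bdev examine to finish, then poll for the named bdev with a
# 2000 ms timeout; the RPC returns nonzero if the bdev never appears.
waitforbdev() {
    local bdev_name=$1
    local bdev_timeout=${2:-2000}
    local rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    "$rpc" -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    "$rpc" -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout"
}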
00:26:12.737 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:12.995 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:26:13.252 [ 00:26:13.252 { 00:26:13.252 "name": "BaseBdev3", 00:26:13.252 "aliases": [ 00:26:13.252 "cd679e3e-876a-46fc-a3b5-657b3160572b" 00:26:13.252 ], 00:26:13.252 "product_name": "Malloc disk", 00:26:13.252 "block_size": 512, 00:26:13.252 "num_blocks": 65536, 00:26:13.252 "uuid": "cd679e3e-876a-46fc-a3b5-657b3160572b", 00:26:13.252 "assigned_rate_limits": { 00:26:13.252 "rw_ios_per_sec": 0, 00:26:13.252 "rw_mbytes_per_sec": 0, 00:26:13.252 "r_mbytes_per_sec": 0, 00:26:13.252 "w_mbytes_per_sec": 0 00:26:13.252 }, 00:26:13.252 "claimed": true, 00:26:13.252 "claim_type": "exclusive_write", 00:26:13.252 "zoned": false, 00:26:13.252 "supported_io_types": { 00:26:13.252 "read": true, 00:26:13.252 "write": true, 00:26:13.252 "unmap": true, 00:26:13.252 "write_zeroes": true, 00:26:13.252 "flush": true, 00:26:13.252 "reset": true, 00:26:13.252 "compare": false, 00:26:13.252 "compare_and_write": false, 00:26:13.252 "abort": true, 00:26:13.252 "nvme_admin": false, 00:26:13.252 "nvme_io": false 00:26:13.252 }, 00:26:13.252 "memory_domains": [ 00:26:13.252 { 00:26:13.252 "dma_device_id": "system", 00:26:13.252 "dma_device_type": 1 00:26:13.253 }, 00:26:13.253 { 00:26:13.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:13.253 "dma_device_type": 2 00:26:13.253 } 00:26:13.253 ], 00:26:13.253 "driver_specific": {} 00:26:13.253 } 00:26:13.253 ] 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.253 12:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:26:13.510 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.510 "name": "Existed_Raid", 00:26:13.510 "uuid": "0091b0af-3471-4342-9ca5-72a9ac96d883", 00:26:13.510 "strip_size_kb": 64, 00:26:13.510 "state": "online", 00:26:13.510 "raid_level": "concat", 00:26:13.510 "superblock": true, 00:26:13.510 "num_base_bdevs": 3, 00:26:13.510 "num_base_bdevs_discovered": 3, 00:26:13.510 "num_base_bdevs_operational": 3, 00:26:13.510 "base_bdevs_list": [ 00:26:13.510 { 00:26:13.510 "name": "BaseBdev1", 00:26:13.510 "uuid": "28b1a258-4639-46e4-adf1-16e411c0c3f3", 00:26:13.510 "is_configured": true, 00:26:13.510 "data_offset": 2048, 00:26:13.510 "data_size": 63488 00:26:13.510 }, 00:26:13.510 { 00:26:13.510 "name": "BaseBdev2", 00:26:13.510 "uuid": "4754f79b-dfda-4dff-ad82-d060c6a4daa4", 00:26:13.510 "is_configured": true, 00:26:13.510 "data_offset": 2048, 00:26:13.510 "data_size": 63488 00:26:13.510 }, 00:26:13.510 { 00:26:13.510 "name": "BaseBdev3", 00:26:13.510 "uuid": "cd679e3e-876a-46fc-a3b5-657b3160572b", 00:26:13.510 "is_configured": true, 00:26:13.510 "data_offset": 2048, 00:26:13.510 "data_size": 63488 00:26:13.510 } 00:26:13.510 ] 00:26:13.510 }' 00:26:13.510 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.510 12:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:14.075 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:14.075 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:14.075 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:14.076 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:14.076 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:14.076 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:26:14.076 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:14.076 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:14.335 [2024-06-07 12:28:37.949767] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:14.335 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:14.335 "name": "Existed_Raid", 00:26:14.335 "aliases": [ 00:26:14.335 "0091b0af-3471-4342-9ca5-72a9ac96d883" 00:26:14.335 ], 00:26:14.335 "product_name": "Raid Volume", 00:26:14.335 "block_size": 512, 00:26:14.335 "num_blocks": 190464, 00:26:14.335 "uuid": "0091b0af-3471-4342-9ca5-72a9ac96d883", 00:26:14.335 "assigned_rate_limits": { 00:26:14.335 "rw_ios_per_sec": 0, 00:26:14.335 "rw_mbytes_per_sec": 0, 00:26:14.335 "r_mbytes_per_sec": 0, 00:26:14.335 "w_mbytes_per_sec": 0 00:26:14.335 }, 00:26:14.335 "claimed": false, 00:26:14.335 "zoned": false, 00:26:14.335 "supported_io_types": { 00:26:14.335 "read": true, 00:26:14.335 "write": true, 00:26:14.335 "unmap": true, 00:26:14.335 "write_zeroes": true, 00:26:14.335 "flush": true, 00:26:14.335 "reset": true, 00:26:14.335 "compare": false, 00:26:14.335 "compare_and_write": false, 00:26:14.335 "abort": false, 00:26:14.335 
"nvme_admin": false, 00:26:14.335 "nvme_io": false 00:26:14.335 }, 00:26:14.335 "memory_domains": [ 00:26:14.335 { 00:26:14.335 "dma_device_id": "system", 00:26:14.335 "dma_device_type": 1 00:26:14.335 }, 00:26:14.335 { 00:26:14.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:14.335 "dma_device_type": 2 00:26:14.335 }, 00:26:14.335 { 00:26:14.335 "dma_device_id": "system", 00:26:14.335 "dma_device_type": 1 00:26:14.335 }, 00:26:14.335 { 00:26:14.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:14.335 "dma_device_type": 2 00:26:14.335 }, 00:26:14.335 { 00:26:14.335 "dma_device_id": "system", 00:26:14.335 "dma_device_type": 1 00:26:14.335 }, 00:26:14.335 { 00:26:14.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:14.335 "dma_device_type": 2 00:26:14.335 } 00:26:14.335 ], 00:26:14.335 "driver_specific": { 00:26:14.335 "raid": { 00:26:14.335 "uuid": "0091b0af-3471-4342-9ca5-72a9ac96d883", 00:26:14.335 "strip_size_kb": 64, 00:26:14.335 "state": "online", 00:26:14.335 "raid_level": "concat", 00:26:14.335 "superblock": true, 00:26:14.335 "num_base_bdevs": 3, 00:26:14.335 "num_base_bdevs_discovered": 3, 00:26:14.335 "num_base_bdevs_operational": 3, 00:26:14.335 "base_bdevs_list": [ 00:26:14.335 { 00:26:14.335 "name": "BaseBdev1", 00:26:14.335 "uuid": "28b1a258-4639-46e4-adf1-16e411c0c3f3", 00:26:14.335 "is_configured": true, 00:26:14.335 "data_offset": 2048, 00:26:14.335 "data_size": 63488 00:26:14.335 }, 00:26:14.335 { 00:26:14.335 "name": "BaseBdev2", 00:26:14.335 "uuid": "4754f79b-dfda-4dff-ad82-d060c6a4daa4", 00:26:14.335 "is_configured": true, 00:26:14.335 "data_offset": 2048, 00:26:14.335 "data_size": 63488 00:26:14.335 }, 00:26:14.335 { 00:26:14.335 "name": "BaseBdev3", 00:26:14.335 "uuid": "cd679e3e-876a-46fc-a3b5-657b3160572b", 00:26:14.335 "is_configured": true, 00:26:14.335 "data_offset": 2048, 00:26:14.335 "data_size": 63488 00:26:14.335 } 00:26:14.335 ] 00:26:14.335 } 00:26:14.335 } 00:26:14.335 }' 00:26:14.335 12:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:14.605 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:14.605 BaseBdev2 00:26:14.605 BaseBdev3' 00:26:14.605 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:14.605 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:14.605 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:14.862 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:14.862 "name": "BaseBdev1", 00:26:14.862 "aliases": [ 00:26:14.862 "28b1a258-4639-46e4-adf1-16e411c0c3f3" 00:26:14.862 ], 00:26:14.862 "product_name": "Malloc disk", 00:26:14.862 "block_size": 512, 00:26:14.862 "num_blocks": 65536, 00:26:14.862 "uuid": "28b1a258-4639-46e4-adf1-16e411c0c3f3", 00:26:14.862 "assigned_rate_limits": { 00:26:14.862 "rw_ios_per_sec": 0, 00:26:14.863 "rw_mbytes_per_sec": 0, 00:26:14.863 "r_mbytes_per_sec": 0, 00:26:14.863 "w_mbytes_per_sec": 0 00:26:14.863 }, 00:26:14.863 "claimed": true, 00:26:14.863 "claim_type": "exclusive_write", 00:26:14.863 "zoned": false, 00:26:14.863 "supported_io_types": { 00:26:14.863 "read": true, 00:26:14.863 "write": true, 00:26:14.863 "unmap": true, 00:26:14.863 "write_zeroes": 
true, 00:26:14.863 "flush": true, 00:26:14.863 "reset": true, 00:26:14.863 "compare": false, 00:26:14.863 "compare_and_write": false, 00:26:14.863 "abort": true, 00:26:14.863 "nvme_admin": false, 00:26:14.863 "nvme_io": false 00:26:14.863 }, 00:26:14.863 "memory_domains": [ 00:26:14.863 { 00:26:14.863 "dma_device_id": "system", 00:26:14.863 "dma_device_type": 1 00:26:14.863 }, 00:26:14.863 { 00:26:14.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:14.863 "dma_device_type": 2 00:26:14.863 } 00:26:14.863 ], 00:26:14.863 "driver_specific": {} 00:26:14.863 }' 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:14.863 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:15.120 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:15.120 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:15.120 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:15.121 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:15.121 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:15.378 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:15.378 "name": "BaseBdev2", 00:26:15.378 "aliases": [ 00:26:15.378 "4754f79b-dfda-4dff-ad82-d060c6a4daa4" 00:26:15.378 ], 00:26:15.378 "product_name": "Malloc disk", 00:26:15.378 "block_size": 512, 00:26:15.378 "num_blocks": 65536, 00:26:15.378 "uuid": "4754f79b-dfda-4dff-ad82-d060c6a4daa4", 00:26:15.378 "assigned_rate_limits": { 00:26:15.378 "rw_ios_per_sec": 0, 00:26:15.378 "rw_mbytes_per_sec": 0, 00:26:15.378 "r_mbytes_per_sec": 0, 00:26:15.378 "w_mbytes_per_sec": 0 00:26:15.378 }, 00:26:15.378 "claimed": true, 00:26:15.378 "claim_type": "exclusive_write", 00:26:15.378 "zoned": false, 00:26:15.378 "supported_io_types": { 00:26:15.378 "read": true, 00:26:15.378 "write": true, 00:26:15.378 "unmap": true, 00:26:15.378 "write_zeroes": true, 00:26:15.378 "flush": true, 00:26:15.378 "reset": true, 00:26:15.378 "compare": false, 00:26:15.378 "compare_and_write": false, 00:26:15.378 "abort": true, 00:26:15.378 "nvme_admin": false, 00:26:15.378 "nvme_io": false 00:26:15.378 }, 00:26:15.378 "memory_domains": [ 00:26:15.378 { 00:26:15.378 "dma_device_id": "system", 00:26:15.378 "dma_device_type": 1 00:26:15.378 }, 00:26:15.378 { 00:26:15.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:26:15.378 "dma_device_type": 2 00:26:15.378 } 00:26:15.378 ], 00:26:15.378 "driver_specific": {} 00:26:15.378 }' 00:26:15.378 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:15.378 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:15.378 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:15.378 12:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:15.378 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:15.637 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:26:15.896 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:15.896 "name": "BaseBdev3", 00:26:15.896 "aliases": [ 00:26:15.896 "cd679e3e-876a-46fc-a3b5-657b3160572b" 00:26:15.896 ], 00:26:15.896 "product_name": "Malloc disk", 00:26:15.896 "block_size": 512, 00:26:15.896 "num_blocks": 65536, 00:26:15.896 "uuid": "cd679e3e-876a-46fc-a3b5-657b3160572b", 00:26:15.896 "assigned_rate_limits": { 00:26:15.896 "rw_ios_per_sec": 0, 00:26:15.896 "rw_mbytes_per_sec": 0, 00:26:15.896 "r_mbytes_per_sec": 0, 00:26:15.896 "w_mbytes_per_sec": 0 00:26:15.896 }, 00:26:15.896 "claimed": true, 00:26:15.896 "claim_type": "exclusive_write", 00:26:15.896 "zoned": false, 00:26:15.896 "supported_io_types": { 00:26:15.896 "read": true, 00:26:15.896 "write": true, 00:26:15.896 "unmap": true, 00:26:15.896 "write_zeroes": true, 00:26:15.896 "flush": true, 00:26:15.896 "reset": true, 00:26:15.896 "compare": false, 00:26:15.896 "compare_and_write": false, 00:26:15.896 "abort": true, 00:26:15.896 "nvme_admin": false, 00:26:15.896 "nvme_io": false 00:26:15.896 }, 00:26:15.896 "memory_domains": [ 00:26:15.896 { 00:26:15.896 "dma_device_id": "system", 00:26:15.896 "dma_device_type": 1 00:26:15.896 }, 00:26:15.896 { 00:26:15.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:15.896 "dma_device_type": 2 00:26:15.896 } 00:26:15.896 ], 00:26:15.896 "driver_specific": {} 00:26:15.896 }' 00:26:15.896 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:15.896 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:15.896 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:15.896 
12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:16.155 12:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:16.414 [2024-06-07 12:28:39.973942] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:16.414 [2024-06-07 12:28:39.974186] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:16.414 [2024-06-07 12:28:39.974437] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.414 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:16.672 12:28:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.672 "name": "Existed_Raid", 00:26:16.672 "uuid": "0091b0af-3471-4342-9ca5-72a9ac96d883", 00:26:16.672 "strip_size_kb": 64, 00:26:16.672 "state": "offline", 00:26:16.672 "raid_level": "concat", 00:26:16.672 "superblock": true, 00:26:16.672 "num_base_bdevs": 3, 00:26:16.672 "num_base_bdevs_discovered": 2, 00:26:16.672 "num_base_bdevs_operational": 2, 00:26:16.672 "base_bdevs_list": [ 00:26:16.672 { 00:26:16.672 "name": null, 00:26:16.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.672 "is_configured": false, 00:26:16.672 "data_offset": 2048, 00:26:16.672 "data_size": 63488 00:26:16.672 }, 00:26:16.672 { 00:26:16.672 "name": "BaseBdev2", 00:26:16.672 "uuid": "4754f79b-dfda-4dff-ad82-d060c6a4daa4", 00:26:16.672 "is_configured": true, 00:26:16.672 "data_offset": 2048, 00:26:16.672 "data_size": 63488 00:26:16.672 }, 00:26:16.672 { 00:26:16.672 "name": "BaseBdev3", 00:26:16.672 "uuid": "cd679e3e-876a-46fc-a3b5-657b3160572b", 00:26:16.672 "is_configured": true, 00:26:16.672 "data_offset": 2048, 00:26:16.672 "data_size": 63488 00:26:16.672 } 00:26:16.672 ] 00:26:16.672 }' 00:26:16.672 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.672 12:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:17.606 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:17.606 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:17.606 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.606 12:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:17.606 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:17.606 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:17.606 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:18.172 [2024-06-07 12:28:41.533063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:18.172 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:18.172 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:18.172 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.172 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:18.430 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:18.430 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:18.430 12:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:26:18.689 [2024-06-07 12:28:42.132118] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:26:18.689 [2024-06-07 12:28:42.132467] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:26:18.689 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:18.689 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:18.689 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:18.689 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.947 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:18.947 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:18.947 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:26:18.947 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:26:18.947 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:18.948 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:26:19.206 BaseBdev2 00:26:19.206 12:28:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:26:19.206 12:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:26:19.206 12:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:19.206 12:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:26:19.206 12:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:19.206 12:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:19.206 12:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:19.464 12:28:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:19.722 [ 00:26:19.722 { 00:26:19.722 "name": "BaseBdev2", 00:26:19.722 "aliases": [ 00:26:19.722 "f3835ce6-9583-4f03-ab7d-ee42f34ba00a" 00:26:19.722 ], 00:26:19.722 "product_name": "Malloc disk", 00:26:19.722 "block_size": 512, 00:26:19.722 "num_blocks": 65536, 00:26:19.722 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:19.722 "assigned_rate_limits": { 00:26:19.722 "rw_ios_per_sec": 0, 00:26:19.722 "rw_mbytes_per_sec": 0, 00:26:19.722 "r_mbytes_per_sec": 0, 00:26:19.722 "w_mbytes_per_sec": 0 00:26:19.722 }, 00:26:19.722 "claimed": false, 00:26:19.722 "zoned": false, 00:26:19.722 "supported_io_types": { 00:26:19.722 "read": true, 00:26:19.722 "write": true, 00:26:19.722 "unmap": true, 00:26:19.722 "write_zeroes": true, 00:26:19.722 "flush": true, 00:26:19.722 "reset": true, 00:26:19.722 "compare": false, 00:26:19.722 "compare_and_write": false, 00:26:19.722 "abort": true, 00:26:19.722 "nvme_admin": false, 00:26:19.722 "nvme_io": false 00:26:19.722 }, 00:26:19.722 "memory_domains": [ 00:26:19.722 { 00:26:19.722 "dma_device_id": "system", 00:26:19.722 "dma_device_type": 1 
00:26:19.722 }, 00:26:19.722 { 00:26:19.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:19.722 "dma_device_type": 2 00:26:19.722 } 00:26:19.722 ], 00:26:19.722 "driver_specific": {} 00:26:19.722 } 00:26:19.722 ] 00:26:19.722 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:26:19.722 12:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:26:19.722 12:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:19.722 12:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:26:19.980 BaseBdev3 00:26:19.980 12:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:26:19.980 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:26:19.980 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:19.980 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:26:19.980 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:19.980 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:19.980 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:20.238 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:26:20.497 [ 00:26:20.497 { 00:26:20.497 "name": "BaseBdev3", 00:26:20.497 "aliases": [ 00:26:20.497 "2f18c56f-a5db-443c-bd9a-a354c18ab4e5" 00:26:20.497 ], 00:26:20.497 "product_name": "Malloc disk", 00:26:20.497 "block_size": 512, 00:26:20.497 "num_blocks": 65536, 00:26:20.497 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:20.497 "assigned_rate_limits": { 00:26:20.497 "rw_ios_per_sec": 0, 00:26:20.497 "rw_mbytes_per_sec": 0, 00:26:20.497 "r_mbytes_per_sec": 0, 00:26:20.497 "w_mbytes_per_sec": 0 00:26:20.497 }, 00:26:20.497 "claimed": false, 00:26:20.497 "zoned": false, 00:26:20.497 "supported_io_types": { 00:26:20.497 "read": true, 00:26:20.497 "write": true, 00:26:20.497 "unmap": true, 00:26:20.498 "write_zeroes": true, 00:26:20.498 "flush": true, 00:26:20.498 "reset": true, 00:26:20.498 "compare": false, 00:26:20.498 "compare_and_write": false, 00:26:20.498 "abort": true, 00:26:20.498 "nvme_admin": false, 00:26:20.498 "nvme_io": false 00:26:20.498 }, 00:26:20.498 "memory_domains": [ 00:26:20.498 { 00:26:20.498 "dma_device_id": "system", 00:26:20.498 "dma_device_type": 1 00:26:20.498 }, 00:26:20.498 { 00:26:20.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:20.498 "dma_device_type": 2 00:26:20.498 } 00:26:20.498 ], 00:26:20.498 "driver_specific": {} 00:26:20.498 } 00:26:20.498 ] 00:26:20.498 12:28:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:26:20.498 12:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:26:20.498 12:28:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:20.498 12:28:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:26:20.757 [2024-06-07 12:28:44.143120] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:20.757 [2024-06-07 12:28:44.143555] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:20.757 [2024-06-07 12:28:44.143700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:20.757 [2024-06-07 12:28:44.145858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:20.757 "name": "Existed_Raid", 00:26:20.757 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:20.757 "strip_size_kb": 64, 00:26:20.757 "state": "configuring", 00:26:20.757 "raid_level": "concat", 00:26:20.757 "superblock": true, 00:26:20.757 "num_base_bdevs": 3, 00:26:20.757 "num_base_bdevs_discovered": 2, 00:26:20.757 "num_base_bdevs_operational": 3, 00:26:20.757 "base_bdevs_list": [ 00:26:20.757 { 00:26:20.757 "name": "BaseBdev1", 00:26:20.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.757 "is_configured": false, 00:26:20.757 "data_offset": 0, 00:26:20.757 "data_size": 0 00:26:20.757 }, 00:26:20.757 { 00:26:20.757 "name": "BaseBdev2", 00:26:20.757 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:20.757 "is_configured": true, 00:26:20.757 "data_offset": 2048, 00:26:20.757 "data_size": 63488 00:26:20.757 }, 00:26:20.757 { 00:26:20.757 "name": "BaseBdev3", 00:26:20.757 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:20.757 "is_configured": true, 00:26:20.757 "data_offset": 2048, 00:26:20.757 "data_size": 63488 00:26:20.757 } 00:26:20.757 ] 00:26:20.757 }' 00:26:20.757 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:20.757 12:28:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:21.324 12:28:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:21.582 [2024-06-07 12:28:45.195158] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.582 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.840 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:21.840 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:21.840 "name": "Existed_Raid", 00:26:21.840 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:21.840 "strip_size_kb": 64, 00:26:21.840 "state": "configuring", 00:26:21.840 "raid_level": "concat", 00:26:21.840 "superblock": true, 00:26:21.841 "num_base_bdevs": 3, 00:26:21.841 "num_base_bdevs_discovered": 1, 00:26:21.841 "num_base_bdevs_operational": 3, 00:26:21.841 "base_bdevs_list": [ 00:26:21.841 { 00:26:21.841 "name": "BaseBdev1", 00:26:21.841 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.841 "is_configured": false, 00:26:21.841 "data_offset": 0, 00:26:21.841 "data_size": 0 00:26:21.841 }, 00:26:21.841 { 00:26:21.841 "name": null, 00:26:21.841 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:21.841 "is_configured": false, 00:26:21.841 "data_offset": 2048, 00:26:21.841 "data_size": 63488 00:26:21.841 }, 00:26:21.841 { 00:26:21.841 "name": "BaseBdev3", 00:26:21.841 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:21.841 "is_configured": true, 00:26:21.841 "data_offset": 2048, 00:26:21.841 "data_size": 63488 00:26:21.841 } 00:26:21.841 ] 00:26:21.841 }' 00:26:21.841 12:28:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:21.841 12:28:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:22.466 12:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.466 12:28:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:26:23.033 [2024-06-07 12:28:46.600944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:23.033 BaseBdev1 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:23.033 12:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:23.291 12:28:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:23.549 [ 00:26:23.549 { 00:26:23.549 "name": "BaseBdev1", 00:26:23.549 "aliases": [ 00:26:23.549 "587ef4ca-fca0-49bf-9472-1f588dd1d631" 00:26:23.549 ], 00:26:23.549 "product_name": "Malloc disk", 00:26:23.549 "block_size": 512, 00:26:23.549 "num_blocks": 65536, 00:26:23.549 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:23.549 "assigned_rate_limits": { 00:26:23.549 "rw_ios_per_sec": 0, 00:26:23.549 "rw_mbytes_per_sec": 0, 00:26:23.549 "r_mbytes_per_sec": 0, 00:26:23.549 "w_mbytes_per_sec": 0 00:26:23.549 }, 00:26:23.549 "claimed": true, 00:26:23.549 "claim_type": "exclusive_write", 00:26:23.549 "zoned": false, 00:26:23.549 "supported_io_types": { 00:26:23.549 "read": true, 00:26:23.549 "write": true, 00:26:23.549 "unmap": true, 00:26:23.549 "write_zeroes": true, 00:26:23.549 "flush": true, 00:26:23.549 "reset": true, 00:26:23.549 "compare": false, 00:26:23.549 "compare_and_write": false, 00:26:23.549 "abort": true, 00:26:23.549 "nvme_admin": false, 00:26:23.549 "nvme_io": false 00:26:23.549 }, 00:26:23.549 "memory_domains": [ 00:26:23.549 { 00:26:23.549 "dma_device_id": "system", 00:26:23.549 "dma_device_type": 1 00:26:23.549 }, 00:26:23.549 { 00:26:23.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.549 "dma_device_type": 2 00:26:23.549 } 00:26:23.549 ], 00:26:23.549 "driver_specific": {} 00:26:23.549 } 00:26:23.549 ] 00:26:23.549 12:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:26:23.549 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:23.549 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:23.550 12:28:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.550 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:23.808 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:23.808 "name": "Existed_Raid", 00:26:23.808 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:23.808 "strip_size_kb": 64, 00:26:23.808 "state": "configuring", 00:26:23.808 "raid_level": "concat", 00:26:23.808 "superblock": true, 00:26:23.808 "num_base_bdevs": 3, 00:26:23.808 "num_base_bdevs_discovered": 2, 00:26:23.808 "num_base_bdevs_operational": 3, 00:26:23.808 "base_bdevs_list": [ 00:26:23.808 { 00:26:23.808 "name": "BaseBdev1", 00:26:23.808 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:23.808 "is_configured": true, 00:26:23.808 "data_offset": 2048, 00:26:23.808 "data_size": 63488 00:26:23.808 }, 00:26:23.808 { 00:26:23.808 "name": null, 00:26:23.808 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:23.808 "is_configured": false, 00:26:23.808 "data_offset": 2048, 00:26:23.808 "data_size": 63488 00:26:23.808 }, 00:26:23.808 { 00:26:23.808 "name": "BaseBdev3", 00:26:23.808 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:23.808 "is_configured": true, 00:26:23.808 "data_offset": 2048, 00:26:23.808 "data_size": 63488 00:26:23.808 } 00:26:23.808 ] 00:26:23.808 }' 00:26:23.808 12:28:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:23.808 12:28:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:24.743 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.743 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:26:25.002 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:26:25.002 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:26:25.260 [2024-06-07 12:28:48.749409] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:25.260 12:28:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:25.260 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:25.261 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:25.261 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.261 12:28:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:25.519 12:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.519 "name": "Existed_Raid", 00:26:25.519 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:25.519 "strip_size_kb": 64, 00:26:25.519 "state": "configuring", 00:26:25.519 "raid_level": "concat", 00:26:25.519 "superblock": true, 00:26:25.519 "num_base_bdevs": 3, 00:26:25.519 "num_base_bdevs_discovered": 1, 00:26:25.519 "num_base_bdevs_operational": 3, 00:26:25.519 "base_bdevs_list": [ 00:26:25.519 { 00:26:25.519 "name": "BaseBdev1", 00:26:25.519 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:25.519 "is_configured": true, 00:26:25.519 "data_offset": 2048, 00:26:25.519 "data_size": 63488 00:26:25.519 }, 00:26:25.519 { 00:26:25.519 "name": null, 00:26:25.519 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:25.519 "is_configured": false, 00:26:25.519 "data_offset": 2048, 00:26:25.519 "data_size": 63488 00:26:25.519 }, 00:26:25.519 { 00:26:25.519 "name": null, 00:26:25.519 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:25.519 "is_configured": false, 00:26:25.519 "data_offset": 2048, 00:26:25.519 "data_size": 63488 00:26:25.519 } 00:26:25.519 ] 00:26:25.519 }' 00:26:25.519 12:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.519 12:28:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:26.085 12:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:26:26.085 12:28:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.653 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:26:26.653 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:26:26.653 [2024-06-07 12:28:50.281635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 
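The verify_raid_bdev_state call echoed above expands into the trace that follows: the helper fetches the raid bdev's properties once and asserts each expectation against them. Only the locals and the bdev_raid_get_bdevs fetch are visible in the xtrace; the comparisons run after xtrace_disable, so the field checks in this sketch are a best-guess reconstruction of the hidden part, not the literal function body:

    rpc_py="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    verify_raid_bdev_state() {
        local raid_bdev_name=$1 expected_state=$2 raid_level=$3
        local strip_size=$4 num_base_bdevs_operational=$5
        local raid_bdev_info
        # Fetch every raid bdev once, then select the one under test by name
        # (this is the jq filter echoed at bdev_raid.sh@126 above).
        raid_bdev_info=$($rpc_py bdev_raid_get_bdevs all |
            jq -r ".[] | select(.name == \"$raid_bdev_name\")")
        # Assumed assertions (they run behind xtrace_disable in this log);
        # any mismatch fails the run under the suite's errexit mode.
        [[ $(jq -r '.state' <<< "$raid_bdev_info") == "$expected_state" ]]
        [[ $(jq -r '.raid_level' <<< "$raid_bdev_info") == "$raid_level" ]]
        [[ $(jq -r '.strip_size_kb' <<< "$raid_bdev_info") == "$strip_size" ]]
        [[ $(jq -r '.num_base_bdevs_operational' <<< "$raid_bdev_info") == "$num_base_bdevs_operational" ]]
    }
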
00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.991 "name": "Existed_Raid", 00:26:26.991 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:26.991 "strip_size_kb": 64, 00:26:26.991 "state": "configuring", 00:26:26.991 "raid_level": "concat", 00:26:26.991 "superblock": true, 00:26:26.991 "num_base_bdevs": 3, 00:26:26.991 "num_base_bdevs_discovered": 2, 00:26:26.991 "num_base_bdevs_operational": 3, 00:26:26.991 "base_bdevs_list": [ 00:26:26.991 { 00:26:26.991 "name": "BaseBdev1", 00:26:26.991 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:26.991 "is_configured": true, 00:26:26.991 "data_offset": 2048, 00:26:26.991 "data_size": 63488 00:26:26.991 }, 00:26:26.991 { 00:26:26.991 "name": null, 00:26:26.991 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:26.991 "is_configured": false, 00:26:26.991 "data_offset": 2048, 00:26:26.991 "data_size": 63488 00:26:26.991 }, 00:26:26.991 { 00:26:26.991 "name": "BaseBdev3", 00:26:26.991 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:26.991 "is_configured": true, 00:26:26.991 "data_offset": 2048, 00:26:26.991 "data_size": 63488 00:26:26.991 } 00:26:26.991 ] 00:26:26.991 }' 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.991 12:28:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:27.557 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.557 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:26:27.815 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:26:27.815 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:28.073 [2024-06-07 12:28:51.673789] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:28.073 12:28:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:28.073 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:28.073 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:28.073 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:28.073 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:28.073 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:28.332 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:28.332 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:28.332 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:28.332 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:28.332 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:28.332 12:28:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.628 12:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:28.628 "name": "Existed_Raid", 00:26:28.628 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:28.628 "strip_size_kb": 64, 00:26:28.628 "state": "configuring", 00:26:28.628 "raid_level": "concat", 00:26:28.628 "superblock": true, 00:26:28.628 "num_base_bdevs": 3, 00:26:28.628 "num_base_bdevs_discovered": 1, 00:26:28.628 "num_base_bdevs_operational": 3, 00:26:28.628 "base_bdevs_list": [ 00:26:28.628 { 00:26:28.628 "name": null, 00:26:28.628 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:28.628 "is_configured": false, 00:26:28.628 "data_offset": 2048, 00:26:28.628 "data_size": 63488 00:26:28.628 }, 00:26:28.628 { 00:26:28.628 "name": null, 00:26:28.628 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:28.628 "is_configured": false, 00:26:28.628 "data_offset": 2048, 00:26:28.628 "data_size": 63488 00:26:28.628 }, 00:26:28.628 { 00:26:28.628 "name": "BaseBdev3", 00:26:28.628 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:28.628 "is_configured": true, 00:26:28.628 "data_offset": 2048, 00:26:28.628 "data_size": 63488 00:26:28.628 } 00:26:28.628 ] 00:26:28.628 }' 00:26:28.628 12:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:28.628 12:28:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:29.199 12:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.199 12:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:26:29.461 12:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:26:29.461 12:28:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 
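The bdev_raid_add_base_bdev call above re-claims BaseBdev2, which was removed earlier at bdev_raid.sh@308; the *DEBUG* claim line that follows confirms the slot is filled again. Condensed, the remove/re-add round trip the suite is exercising looks like this — the RPC commands and jq filters are taken verbatim from the trace, and the expected values are inferred from the raid_bdev_info dumps around this point:

    rpc_py="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Removing a base bdev of a superblock raid leaves an unconfigured slot
    # and keeps the array in the "configuring" state.
    $rpc_py bdev_raid_remove_base_bdev BaseBdev2
    $rpc_py bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # false

    # Re-adding a bdev with matching superblock metadata fills the slot and
    # bumps the discovered count back up.
    $rpc_py bdev_raid_add_base_bdev Existed_Raid BaseBdev2
    $rpc_py bdev_raid_get_bdevs all | jq '.[0].num_base_bdevs_discovered'          # 2
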
00:26:29.461 [2024-06-07 12:28:53.086698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.719 "name": "Existed_Raid", 00:26:29.719 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:29.719 "strip_size_kb": 64, 00:26:29.719 "state": "configuring", 00:26:29.719 "raid_level": "concat", 00:26:29.719 "superblock": true, 00:26:29.719 "num_base_bdevs": 3, 00:26:29.719 "num_base_bdevs_discovered": 2, 00:26:29.719 "num_base_bdevs_operational": 3, 00:26:29.719 "base_bdevs_list": [ 00:26:29.719 { 00:26:29.719 "name": null, 00:26:29.719 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:29.719 "is_configured": false, 00:26:29.719 "data_offset": 2048, 00:26:29.719 "data_size": 63488 00:26:29.719 }, 00:26:29.719 { 00:26:29.719 "name": "BaseBdev2", 00:26:29.719 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:29.719 "is_configured": true, 00:26:29.719 "data_offset": 2048, 00:26:29.719 "data_size": 63488 00:26:29.719 }, 00:26:29.719 { 00:26:29.719 "name": "BaseBdev3", 00:26:29.719 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:29.719 "is_configured": true, 00:26:29.719 "data_offset": 2048, 00:26:29.719 "data_size": 63488 00:26:29.719 } 00:26:29.719 ] 00:26:29.719 }' 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.719 12:28:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:30.653 12:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.653 12:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:26:30.653 12:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:26:30.653 12:28:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.653 12:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:26:31.221 12:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 587ef4ca-fca0-49bf-9472-1f588dd1d631 00:26:31.221 [2024-06-07 12:28:54.844435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:26:31.221 [2024-06-07 12:28:54.844878] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:26:31.221 [2024-06-07 12:28:54.845004] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:26:31.221 [2024-06-07 12:28:54.845125] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:26:31.221 [2024-06-07 12:28:54.845498] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:26:31.221 [2024-06-07 12:28:54.845546] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000007880 00:26:31.221 [2024-06-07 12:28:54.845716] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:31.221 NewBaseBdev 00:26:31.221 12:28:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:26:31.221 12:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:26:31.221 12:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:31.221 12:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:26:31.221 12:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:31.221 12:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:31.479 12:28:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:31.479 12:28:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:26:31.738 [ 00:26:31.738 { 00:26:31.738 "name": "NewBaseBdev", 00:26:31.738 "aliases": [ 00:26:31.738 "587ef4ca-fca0-49bf-9472-1f588dd1d631" 00:26:31.738 ], 00:26:31.738 "product_name": "Malloc disk", 00:26:31.738 "block_size": 512, 00:26:31.738 "num_blocks": 65536, 00:26:31.738 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:31.738 "assigned_rate_limits": { 00:26:31.738 "rw_ios_per_sec": 0, 00:26:31.738 "rw_mbytes_per_sec": 0, 00:26:31.738 "r_mbytes_per_sec": 0, 00:26:31.738 "w_mbytes_per_sec": 0 00:26:31.738 }, 00:26:31.738 "claimed": true, 00:26:31.738 "claim_type": "exclusive_write", 00:26:31.738 "zoned": false, 00:26:31.738 "supported_io_types": { 00:26:31.738 "read": true, 00:26:31.738 "write": true, 00:26:31.738 "unmap": true, 00:26:31.738 "write_zeroes": true, 00:26:31.738 "flush": true, 00:26:31.738 "reset": true, 00:26:31.738 "compare": false, 00:26:31.738 "compare_and_write": false, 00:26:31.738 "abort": true, 00:26:31.738 "nvme_admin": false, 00:26:31.738 "nvme_io": false 00:26:31.738 }, 00:26:31.738 
"memory_domains": [ 00:26:31.738 { 00:26:31.738 "dma_device_id": "system", 00:26:31.738 "dma_device_type": 1 00:26:31.738 }, 00:26:31.738 { 00:26:31.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:31.738 "dma_device_type": 2 00:26:31.738 } 00:26:31.738 ], 00:26:31.738 "driver_specific": {} 00:26:31.738 } 00:26:31.739 ] 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.739 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:31.998 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.998 "name": "Existed_Raid", 00:26:31.998 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:31.998 "strip_size_kb": 64, 00:26:31.998 "state": "online", 00:26:31.998 "raid_level": "concat", 00:26:31.998 "superblock": true, 00:26:31.998 "num_base_bdevs": 3, 00:26:31.998 "num_base_bdevs_discovered": 3, 00:26:31.998 "num_base_bdevs_operational": 3, 00:26:31.998 "base_bdevs_list": [ 00:26:31.998 { 00:26:31.998 "name": "NewBaseBdev", 00:26:31.998 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:31.998 "is_configured": true, 00:26:31.998 "data_offset": 2048, 00:26:31.998 "data_size": 63488 00:26:31.998 }, 00:26:31.998 { 00:26:31.998 "name": "BaseBdev2", 00:26:31.998 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:31.998 "is_configured": true, 00:26:31.998 "data_offset": 2048, 00:26:31.998 "data_size": 63488 00:26:31.998 }, 00:26:31.998 { 00:26:31.998 "name": "BaseBdev3", 00:26:31.998 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:31.998 "is_configured": true, 00:26:31.998 "data_offset": 2048, 00:26:31.998 "data_size": 63488 00:26:31.998 } 00:26:31.998 ] 00:26:31.998 }' 00:26:31.998 12:28:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.998 12:28:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:32.978 [2024-06-07 12:28:56.572878] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:32.978 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:32.979 "name": "Existed_Raid", 00:26:32.979 "aliases": [ 00:26:32.979 "09ffd6cf-b70d-4661-8a49-621a9a402a47" 00:26:32.979 ], 00:26:32.979 "product_name": "Raid Volume", 00:26:32.979 "block_size": 512, 00:26:32.979 "num_blocks": 190464, 00:26:32.979 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:32.979 "assigned_rate_limits": { 00:26:32.979 "rw_ios_per_sec": 0, 00:26:32.979 "rw_mbytes_per_sec": 0, 00:26:32.979 "r_mbytes_per_sec": 0, 00:26:32.979 "w_mbytes_per_sec": 0 00:26:32.979 }, 00:26:32.979 "claimed": false, 00:26:32.979 "zoned": false, 00:26:32.979 "supported_io_types": { 00:26:32.979 "read": true, 00:26:32.979 "write": true, 00:26:32.979 "unmap": true, 00:26:32.979 "write_zeroes": true, 00:26:32.979 "flush": true, 00:26:32.979 "reset": true, 00:26:32.979 "compare": false, 00:26:32.979 "compare_and_write": false, 00:26:32.979 "abort": false, 00:26:32.979 "nvme_admin": false, 00:26:32.979 "nvme_io": false 00:26:32.979 }, 00:26:32.979 "memory_domains": [ 00:26:32.979 { 00:26:32.979 "dma_device_id": "system", 00:26:32.979 "dma_device_type": 1 00:26:32.979 }, 00:26:32.979 { 00:26:32.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:32.979 "dma_device_type": 2 00:26:32.979 }, 00:26:32.979 { 00:26:32.979 "dma_device_id": "system", 00:26:32.979 "dma_device_type": 1 00:26:32.979 }, 00:26:32.979 { 00:26:32.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:32.979 "dma_device_type": 2 00:26:32.979 }, 00:26:32.979 { 00:26:32.979 "dma_device_id": "system", 00:26:32.979 "dma_device_type": 1 00:26:32.979 }, 00:26:32.979 { 00:26:32.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:32.979 "dma_device_type": 2 00:26:32.979 } 00:26:32.979 ], 00:26:32.979 "driver_specific": { 00:26:32.979 "raid": { 00:26:32.979 "uuid": "09ffd6cf-b70d-4661-8a49-621a9a402a47", 00:26:32.979 "strip_size_kb": 64, 00:26:32.979 "state": "online", 00:26:32.979 "raid_level": "concat", 00:26:32.979 "superblock": true, 00:26:32.979 "num_base_bdevs": 3, 00:26:32.979 "num_base_bdevs_discovered": 3, 00:26:32.979 "num_base_bdevs_operational": 3, 00:26:32.979 "base_bdevs_list": [ 00:26:32.979 { 00:26:32.979 "name": "NewBaseBdev", 00:26:32.979 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:32.979 "is_configured": true, 00:26:32.979 "data_offset": 2048, 00:26:32.979 "data_size": 63488 00:26:32.979 }, 00:26:32.979 { 00:26:32.979 "name": "BaseBdev2", 00:26:32.979 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:32.979 "is_configured": true, 00:26:32.979 "data_offset": 2048, 00:26:32.979 "data_size": 63488 
00:26:32.979 }, 00:26:32.979 { 00:26:32.979 "name": "BaseBdev3", 00:26:32.979 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:32.979 "is_configured": true, 00:26:32.979 "data_offset": 2048, 00:26:32.979 "data_size": 63488 00:26:32.979 } 00:26:32.979 ] 00:26:32.979 } 00:26:32.979 } 00:26:32.979 }' 00:26:32.979 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:33.237 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:26:33.237 BaseBdev2 00:26:33.237 BaseBdev3' 00:26:33.237 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:33.237 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:33.237 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:26:33.496 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:33.496 "name": "NewBaseBdev", 00:26:33.496 "aliases": [ 00:26:33.496 "587ef4ca-fca0-49bf-9472-1f588dd1d631" 00:26:33.496 ], 00:26:33.496 "product_name": "Malloc disk", 00:26:33.496 "block_size": 512, 00:26:33.496 "num_blocks": 65536, 00:26:33.496 "uuid": "587ef4ca-fca0-49bf-9472-1f588dd1d631", 00:26:33.496 "assigned_rate_limits": { 00:26:33.496 "rw_ios_per_sec": 0, 00:26:33.496 "rw_mbytes_per_sec": 0, 00:26:33.496 "r_mbytes_per_sec": 0, 00:26:33.496 "w_mbytes_per_sec": 0 00:26:33.496 }, 00:26:33.496 "claimed": true, 00:26:33.496 "claim_type": "exclusive_write", 00:26:33.496 "zoned": false, 00:26:33.496 "supported_io_types": { 00:26:33.496 "read": true, 00:26:33.496 "write": true, 00:26:33.496 "unmap": true, 00:26:33.496 "write_zeroes": true, 00:26:33.496 "flush": true, 00:26:33.496 "reset": true, 00:26:33.496 "compare": false, 00:26:33.496 "compare_and_write": false, 00:26:33.496 "abort": true, 00:26:33.496 "nvme_admin": false, 00:26:33.496 "nvme_io": false 00:26:33.496 }, 00:26:33.496 "memory_domains": [ 00:26:33.496 { 00:26:33.496 "dma_device_id": "system", 00:26:33.496 "dma_device_type": 1 00:26:33.496 }, 00:26:33.496 { 00:26:33.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:33.496 "dma_device_type": 2 00:26:33.496 } 00:26:33.496 ], 00:26:33.496 "driver_specific": {} 00:26:33.496 }' 00:26:33.496 12:28:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.496 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.496 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:33.496 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.496 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.755 
12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:33.755 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:34.013 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:34.013 "name": "BaseBdev2", 00:26:34.013 "aliases": [ 00:26:34.013 "f3835ce6-9583-4f03-ab7d-ee42f34ba00a" 00:26:34.013 ], 00:26:34.013 "product_name": "Malloc disk", 00:26:34.013 "block_size": 512, 00:26:34.013 "num_blocks": 65536, 00:26:34.013 "uuid": "f3835ce6-9583-4f03-ab7d-ee42f34ba00a", 00:26:34.013 "assigned_rate_limits": { 00:26:34.013 "rw_ios_per_sec": 0, 00:26:34.013 "rw_mbytes_per_sec": 0, 00:26:34.013 "r_mbytes_per_sec": 0, 00:26:34.013 "w_mbytes_per_sec": 0 00:26:34.013 }, 00:26:34.013 "claimed": true, 00:26:34.013 "claim_type": "exclusive_write", 00:26:34.013 "zoned": false, 00:26:34.013 "supported_io_types": { 00:26:34.013 "read": true, 00:26:34.013 "write": true, 00:26:34.013 "unmap": true, 00:26:34.013 "write_zeroes": true, 00:26:34.013 "flush": true, 00:26:34.013 "reset": true, 00:26:34.013 "compare": false, 00:26:34.013 "compare_and_write": false, 00:26:34.013 "abort": true, 00:26:34.013 "nvme_admin": false, 00:26:34.013 "nvme_io": false 00:26:34.013 }, 00:26:34.013 "memory_domains": [ 00:26:34.013 { 00:26:34.013 "dma_device_id": "system", 00:26:34.013 "dma_device_type": 1 00:26:34.013 }, 00:26:34.013 { 00:26:34.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.013 "dma_device_type": 2 00:26:34.013 } 00:26:34.013 ], 00:26:34.013 "driver_specific": {} 00:26:34.013 }' 00:26:34.013 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:34.013 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:34.013 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:34.013 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:34.270 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:34.270 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:34.270 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:34.270 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:34.270 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:34.271 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:34.271 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:34.271 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:34.271 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:34.271 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:26:34.271 12:28:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:34.529 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:34.529 "name": "BaseBdev3", 00:26:34.529 "aliases": [ 00:26:34.529 "2f18c56f-a5db-443c-bd9a-a354c18ab4e5" 00:26:34.529 ], 00:26:34.529 "product_name": "Malloc disk", 00:26:34.529 "block_size": 512, 00:26:34.529 "num_blocks": 65536, 00:26:34.529 "uuid": "2f18c56f-a5db-443c-bd9a-a354c18ab4e5", 00:26:34.529 "assigned_rate_limits": { 00:26:34.529 "rw_ios_per_sec": 0, 00:26:34.529 "rw_mbytes_per_sec": 0, 00:26:34.529 "r_mbytes_per_sec": 0, 00:26:34.529 "w_mbytes_per_sec": 0 00:26:34.529 }, 00:26:34.529 "claimed": true, 00:26:34.529 "claim_type": "exclusive_write", 00:26:34.529 "zoned": false, 00:26:34.529 "supported_io_types": { 00:26:34.529 "read": true, 00:26:34.529 "write": true, 00:26:34.529 "unmap": true, 00:26:34.529 "write_zeroes": true, 00:26:34.529 "flush": true, 00:26:34.529 "reset": true, 00:26:34.529 "compare": false, 00:26:34.529 "compare_and_write": false, 00:26:34.529 "abort": true, 00:26:34.529 "nvme_admin": false, 00:26:34.529 "nvme_io": false 00:26:34.529 }, 00:26:34.529 "memory_domains": [ 00:26:34.529 { 00:26:34.529 "dma_device_id": "system", 00:26:34.529 "dma_device_type": 1 00:26:34.529 }, 00:26:34.529 { 00:26:34.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.529 "dma_device_type": 2 00:26:34.529 } 00:26:34.529 ], 00:26:34.529 "driver_specific": {} 00:26:34.529 }' 00:26:34.530 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:34.823 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:35.081 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:35.081 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:35.081 [2024-06-07 12:28:58.677041] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:35.081 [2024-06-07 12:28:58.677366] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:35.082 [2024-06-07 12:28:58.677641] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:35.082 [2024-06-07 12:28:58.677767] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:26:35.082 [2024-06-07 12:28:58.677841] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name Existed_Raid, state offline 00:26:35.082 12:28:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 206529 00:26:35.082 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 206529 ']' 00:26:35.082 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 206529 00:26:35.082 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:26:35.082 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:35.082 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 206529 00:26:35.340 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:35.340 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:35.340 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 206529' 00:26:35.340 killing process with pid 206529 00:26:35.340 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 206529 00:26:35.340 [2024-06-07 12:28:58.732091] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:35.340 12:28:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 206529 00:26:35.340 [2024-06-07 12:28:58.793344] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:35.598 12:28:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:26:35.598 00:26:35.598 real 0m30.421s 00:26:35.598 user 0m56.069s 00:26:35.598 sys 0m4.833s 00:26:35.598 12:28:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:35.598 12:28:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:35.598 ************************************ 00:26:35.598 END TEST raid_state_function_test_sb 00:26:35.598 ************************************ 00:26:35.598 12:28:59 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:26:35.598 12:28:59 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:26:35.598 12:28:59 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:35.598 12:28:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:35.598 ************************************ 00:26:35.598 START TEST raid_superblock_test 00:26:35.598 ************************************ 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 3 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:35.598 
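The locals traced here set up three parallel bash arrays that raid_superblock_test fills in lockstep, one slot per base bdev: the malloc backing bdev, the passthru bdev layered on top of it, and the passthru bdev's UUID. A minimal sketch of that pattern, with the slot values the trace uses later shown purely for illustration:

# parallel arrays: slot i of each describes base bdev i (illustrative values)
base_bdevs_malloc=(malloc1 malloc2 malloc3)
base_bdevs_pt=(pt1 pt2 pt3)
base_bdevs_pt_uuid=(00000000-0000-0000-0000-000000000001
                    00000000-0000-0000-0000-000000000002
                    00000000-0000-0000-0000-000000000003)
for i in "${!base_bdevs_pt[@]}"; do
    echo "base $i: ${base_bdevs_malloc[i]} -> ${base_bdevs_pt[i]} (${base_bdevs_pt_uuid[i]})"
done

The third array, base_bdevs_pt_uuid, is declared on the very next trace line.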
12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=207513 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 207513 /var/tmp/spdk-raid.sock 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 207513 ']' 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:35.598 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:35.598 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:35.599 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:26:35.857 [2024-06-07 12:28:59.266369] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
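The startup just traced follows the suite's standard pattern: launch the bdev_svc app with raid debug logging on a private RPC socket, record its pid, then wait until the app answers JSON-RPC. A condensed, stand-alone sketch of that step, assuming the same repo layout and socket path as the trace (polling spdk_get_version as a liveness probe is an assumption about how to wait, not a claim about what waitforlisten literally does):

# start the SPDK bdev service with raid debug logging on a private RPC socket
/home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
raid_pid=$!
# poll until the app is up and serving JSON-RPC on that socket
until /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock spdk_get_version >/dev/null 2>&1; do
    sleep 0.1
done

Once the reactor reports it has started (the EAL and reactor lines that follow), the test can begin issuing bdev RPCs against the socket.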
00:26:35.857 [2024-06-07 12:28:59.266898] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207513 ] 00:26:35.857 [2024-06-07 12:28:59.411520] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:36.114 [2024-06-07 12:28:59.509124] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:36.114 [2024-06-07 12:28:59.592527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:36.114 12:28:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:26:36.372 malloc1 00:26:36.630 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:36.889 [2024-06-07 12:29:00.309045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:36.889 [2024-06-07 12:29:00.309462] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:36.889 [2024-06-07 12:29:00.309611] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:26:36.889 [2024-06-07 12:29:00.309781] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:36.889 [2024-06-07 12:29:00.312338] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:36.889 [2024-06-07 12:29:00.312530] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:36.889 pt1 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:36.889 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:26:37.148 malloc2 00:26:37.148 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:37.148 [2024-06-07 12:29:00.792752] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:37.148 [2024-06-07 12:29:00.793108] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.148 [2024-06-07 12:29:00.793276] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:26:37.148 [2024-06-07 12:29:00.793410] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.405 [2024-06-07 12:29:00.795770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.405 [2024-06-07 12:29:00.795949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:37.405 pt2 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:37.405 12:29:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:26:37.663 malloc3 00:26:37.663 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:26:37.922 [2024-06-07 12:29:01.349687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:26:37.922 [2024-06-07 12:29:01.349981] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.922 [2024-06-07 12:29:01.350149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:26:37.922 [2024-06-07 12:29:01.350331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.922 [2024-06-07 12:29:01.352806] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.922 [2024-06-07 12:29:01.353009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:26:37.922 pt3 00:26:37.922 12:29:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:37.922 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:37.922 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:26:38.181 [2024-06-07 12:29:01.589912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:38.181 [2024-06-07 12:29:01.592282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:38.181 [2024-06-07 12:29:01.592477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:26:38.181 [2024-06-07 12:29:01.592738] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:26:38.181 [2024-06-07 12:29:01.592850] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:26:38.181 [2024-06-07 12:29:01.593056] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:26:38.181 [2024-06-07 12:29:01.593431] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:26:38.181 [2024-06-07 12:29:01.593548] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007880 00:26:38.181 [2024-06-07 12:29:01.593829] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.181 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.439 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.439 "name": "raid_bdev1", 00:26:38.439 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:38.439 "strip_size_kb": 64, 00:26:38.439 "state": "online", 00:26:38.439 "raid_level": "concat", 00:26:38.439 "superblock": true, 00:26:38.439 "num_base_bdevs": 3, 00:26:38.439 "num_base_bdevs_discovered": 3, 00:26:38.439 "num_base_bdevs_operational": 3, 00:26:38.439 "base_bdevs_list": [ 00:26:38.439 { 00:26:38.439 "name": "pt1", 00:26:38.439 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:38.439 
"is_configured": true, 00:26:38.439 "data_offset": 2048, 00:26:38.439 "data_size": 63488 00:26:38.439 }, 00:26:38.439 { 00:26:38.439 "name": "pt2", 00:26:38.439 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:38.439 "is_configured": true, 00:26:38.439 "data_offset": 2048, 00:26:38.439 "data_size": 63488 00:26:38.439 }, 00:26:38.439 { 00:26:38.439 "name": "pt3", 00:26:38.439 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:38.439 "is_configured": true, 00:26:38.439 "data_offset": 2048, 00:26:38.439 "data_size": 63488 00:26:38.439 } 00:26:38.439 ] 00:26:38.439 }' 00:26:38.439 12:29:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.439 12:29:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:39.015 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:39.278 [2024-06-07 12:29:02.802185] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:39.278 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:39.278 "name": "raid_bdev1", 00:26:39.278 "aliases": [ 00:26:39.278 "84de772c-4a6a-4890-a70a-d4f2f8c276b5" 00:26:39.278 ], 00:26:39.278 "product_name": "Raid Volume", 00:26:39.278 "block_size": 512, 00:26:39.278 "num_blocks": 190464, 00:26:39.278 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:39.278 "assigned_rate_limits": { 00:26:39.278 "rw_ios_per_sec": 0, 00:26:39.278 "rw_mbytes_per_sec": 0, 00:26:39.278 "r_mbytes_per_sec": 0, 00:26:39.278 "w_mbytes_per_sec": 0 00:26:39.278 }, 00:26:39.278 "claimed": false, 00:26:39.278 "zoned": false, 00:26:39.278 "supported_io_types": { 00:26:39.278 "read": true, 00:26:39.278 "write": true, 00:26:39.278 "unmap": true, 00:26:39.278 "write_zeroes": true, 00:26:39.278 "flush": true, 00:26:39.278 "reset": true, 00:26:39.278 "compare": false, 00:26:39.278 "compare_and_write": false, 00:26:39.278 "abort": false, 00:26:39.278 "nvme_admin": false, 00:26:39.278 "nvme_io": false 00:26:39.278 }, 00:26:39.278 "memory_domains": [ 00:26:39.278 { 00:26:39.278 "dma_device_id": "system", 00:26:39.278 "dma_device_type": 1 00:26:39.278 }, 00:26:39.278 { 00:26:39.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:39.278 "dma_device_type": 2 00:26:39.278 }, 00:26:39.278 { 00:26:39.278 "dma_device_id": "system", 00:26:39.278 "dma_device_type": 1 00:26:39.278 }, 00:26:39.278 { 00:26:39.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:39.278 "dma_device_type": 2 00:26:39.278 }, 00:26:39.278 { 00:26:39.278 "dma_device_id": "system", 00:26:39.278 "dma_device_type": 1 00:26:39.278 }, 00:26:39.278 { 00:26:39.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:39.278 "dma_device_type": 
2 00:26:39.278 } 00:26:39.278 ], 00:26:39.278 "driver_specific": { 00:26:39.278 "raid": { 00:26:39.278 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:39.278 "strip_size_kb": 64, 00:26:39.278 "state": "online", 00:26:39.278 "raid_level": "concat", 00:26:39.278 "superblock": true, 00:26:39.278 "num_base_bdevs": 3, 00:26:39.278 "num_base_bdevs_discovered": 3, 00:26:39.278 "num_base_bdevs_operational": 3, 00:26:39.278 "base_bdevs_list": [ 00:26:39.278 { 00:26:39.278 "name": "pt1", 00:26:39.278 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:39.278 "is_configured": true, 00:26:39.278 "data_offset": 2048, 00:26:39.278 "data_size": 63488 00:26:39.278 }, 00:26:39.278 { 00:26:39.278 "name": "pt2", 00:26:39.278 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:39.278 "is_configured": true, 00:26:39.278 "data_offset": 2048, 00:26:39.278 "data_size": 63488 00:26:39.278 }, 00:26:39.278 { 00:26:39.278 "name": "pt3", 00:26:39.278 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:39.278 "is_configured": true, 00:26:39.278 "data_offset": 2048, 00:26:39.278 "data_size": 63488 00:26:39.278 } 00:26:39.278 ] 00:26:39.278 } 00:26:39.278 } 00:26:39.278 }' 00:26:39.278 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:39.278 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:39.278 pt2 00:26:39.278 pt3' 00:26:39.278 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:39.278 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:39.278 12:29:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:39.845 "name": "pt1", 00:26:39.845 "aliases": [ 00:26:39.845 "00000000-0000-0000-0000-000000000001" 00:26:39.845 ], 00:26:39.845 "product_name": "passthru", 00:26:39.845 "block_size": 512, 00:26:39.845 "num_blocks": 65536, 00:26:39.845 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:39.845 "assigned_rate_limits": { 00:26:39.845 "rw_ios_per_sec": 0, 00:26:39.845 "rw_mbytes_per_sec": 0, 00:26:39.845 "r_mbytes_per_sec": 0, 00:26:39.845 "w_mbytes_per_sec": 0 00:26:39.845 }, 00:26:39.845 "claimed": true, 00:26:39.845 "claim_type": "exclusive_write", 00:26:39.845 "zoned": false, 00:26:39.845 "supported_io_types": { 00:26:39.845 "read": true, 00:26:39.845 "write": true, 00:26:39.845 "unmap": true, 00:26:39.845 "write_zeroes": true, 00:26:39.845 "flush": true, 00:26:39.845 "reset": true, 00:26:39.845 "compare": false, 00:26:39.845 "compare_and_write": false, 00:26:39.845 "abort": true, 00:26:39.845 "nvme_admin": false, 00:26:39.845 "nvme_io": false 00:26:39.845 }, 00:26:39.845 "memory_domains": [ 00:26:39.845 { 00:26:39.845 "dma_device_id": "system", 00:26:39.845 "dma_device_type": 1 00:26:39.845 }, 00:26:39.845 { 00:26:39.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:39.845 "dma_device_type": 2 00:26:39.845 } 00:26:39.845 ], 00:26:39.845 "driver_specific": { 00:26:39.845 "passthru": { 00:26:39.845 "name": "pt1", 00:26:39.845 "base_bdev_name": "malloc1" 00:26:39.845 } 00:26:39.845 } 00:26:39.845 }' 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:39.845 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:40.105 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:40.105 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:40.105 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:40.106 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:40.106 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:40.106 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:40.106 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:40.383 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:40.383 "name": "pt2", 00:26:40.383 "aliases": [ 00:26:40.383 "00000000-0000-0000-0000-000000000002" 00:26:40.383 ], 00:26:40.383 "product_name": "passthru", 00:26:40.383 "block_size": 512, 00:26:40.383 "num_blocks": 65536, 00:26:40.383 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:40.383 "assigned_rate_limits": { 00:26:40.383 "rw_ios_per_sec": 0, 00:26:40.383 "rw_mbytes_per_sec": 0, 00:26:40.383 "r_mbytes_per_sec": 0, 00:26:40.383 "w_mbytes_per_sec": 0 00:26:40.383 }, 00:26:40.383 "claimed": true, 00:26:40.383 "claim_type": "exclusive_write", 00:26:40.383 "zoned": false, 00:26:40.383 "supported_io_types": { 00:26:40.383 "read": true, 00:26:40.383 "write": true, 00:26:40.383 "unmap": true, 00:26:40.383 "write_zeroes": true, 00:26:40.383 "flush": true, 00:26:40.383 "reset": true, 00:26:40.383 "compare": false, 00:26:40.383 "compare_and_write": false, 00:26:40.383 "abort": true, 00:26:40.383 "nvme_admin": false, 00:26:40.383 "nvme_io": false 00:26:40.383 }, 00:26:40.383 "memory_domains": [ 00:26:40.383 { 00:26:40.383 "dma_device_id": "system", 00:26:40.383 "dma_device_type": 1 00:26:40.383 }, 00:26:40.383 { 00:26:40.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:40.383 "dma_device_type": 2 00:26:40.383 } 00:26:40.383 ], 00:26:40.383 "driver_specific": { 00:26:40.383 "passthru": { 00:26:40.383 "name": "pt2", 00:26:40.383 "base_bdev_name": "malloc2" 00:26:40.383 } 00:26:40.383 } 00:26:40.383 }' 00:26:40.383 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:40.383 12:29:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:40.642 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:40.642 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:40.642 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:40.642 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:40.642 12:29:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:40.642 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:40.642 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:40.642 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:40.901 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:40.901 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:40.901 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:40.901 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:26:40.901 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:41.160 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:41.160 "name": "pt3", 00:26:41.160 "aliases": [ 00:26:41.160 "00000000-0000-0000-0000-000000000003" 00:26:41.160 ], 00:26:41.160 "product_name": "passthru", 00:26:41.160 "block_size": 512, 00:26:41.160 "num_blocks": 65536, 00:26:41.160 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:41.160 "assigned_rate_limits": { 00:26:41.160 "rw_ios_per_sec": 0, 00:26:41.160 "rw_mbytes_per_sec": 0, 00:26:41.160 "r_mbytes_per_sec": 0, 00:26:41.160 "w_mbytes_per_sec": 0 00:26:41.160 }, 00:26:41.160 "claimed": true, 00:26:41.160 "claim_type": "exclusive_write", 00:26:41.160 "zoned": false, 00:26:41.160 "supported_io_types": { 00:26:41.160 "read": true, 00:26:41.160 "write": true, 00:26:41.160 "unmap": true, 00:26:41.160 "write_zeroes": true, 00:26:41.160 "flush": true, 00:26:41.160 "reset": true, 00:26:41.160 "compare": false, 00:26:41.160 "compare_and_write": false, 00:26:41.160 "abort": true, 00:26:41.160 "nvme_admin": false, 00:26:41.160 "nvme_io": false 00:26:41.160 }, 00:26:41.160 "memory_domains": [ 00:26:41.160 { 00:26:41.160 "dma_device_id": "system", 00:26:41.160 "dma_device_type": 1 00:26:41.160 }, 00:26:41.160 { 00:26:41.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:41.160 "dma_device_type": 2 00:26:41.160 } 00:26:41.160 ], 00:26:41.160 "driver_specific": { 00:26:41.160 "passthru": { 00:26:41.160 "name": "pt3", 00:26:41.160 "base_bdev_name": "malloc3" 00:26:41.160 } 00:26:41.160 } 00:26:41.160 }' 00:26:41.160 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:41.160 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:41.160 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:41.160 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:41.418 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:41.418 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:41.418 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:41.418 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:41.418 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:41.418 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:41.418 12:29:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:26:41.418 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:41.418 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:41.418 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:41.985 [2024-06-07 12:29:05.338578] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:41.985 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=84de772c-4a6a-4890-a70a-d4f2f8c276b5 00:26:41.985 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 84de772c-4a6a-4890-a70a-d4f2f8c276b5 ']' 00:26:41.985 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:42.244 [2024-06-07 12:29:05.662484] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:42.244 [2024-06-07 12:29:05.662770] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:42.244 [2024-06-07 12:29:05.662982] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:42.244 [2024-06-07 12:29:05.663150] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:42.244 [2024-06-07 12:29:05.663252] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name raid_bdev1, state offline 00:26:42.244 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.244 12:29:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:42.502 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:42.502 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:42.502 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:42.502 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:42.760 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:42.760 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:43.326 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:43.326 12:29:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:26:43.584 12:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:43.584 12:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:26:43.869 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:26:43.870 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:26:44.128 [2024-06-07 12:29:07.674787] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:44.128 [2024-06-07 12:29:07.677457] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:44.128 [2024-06-07 12:29:07.677726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:26:44.128 [2024-06-07 12:29:07.677828] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:44.128 [2024-06-07 12:29:07.678206] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:44.128 [2024-06-07 12:29:07.678388] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:26:44.128 [2024-06-07 12:29:07.678576] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:44.128 [2024-06-07 12:29:07.678676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state configuring 00:26:44.128 request: 00:26:44.128 { 00:26:44.128 "name": "raid_bdev1", 00:26:44.128 "raid_level": "concat", 00:26:44.128 "base_bdevs": [ 00:26:44.128 "malloc1", 00:26:44.128 "malloc2", 00:26:44.128 "malloc3" 00:26:44.128 ], 00:26:44.128 "superblock": false, 00:26:44.128 "strip_size_kb": 64, 00:26:44.128 "method": "bdev_raid_create", 00:26:44.128 "req_id": 1 00:26:44.128 } 00:26:44.128 Got JSON-RPC error response 00:26:44.128 response: 00:26:44.128 { 00:26:44.128 "code": -17, 00:26:44.128 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:44.128 } 00:26:44.128 12:29:07 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@652 -- # es=1 00:26:44.128 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:26:44.128 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:26:44.128 12:29:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:26:44.128 12:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:44.128 12:29:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.386 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:44.386 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:44.386 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:44.644 [2024-06-07 12:29:08.239052] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:44.644 [2024-06-07 12:29:08.239388] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:44.644 [2024-06-07 12:29:08.239496] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:26:44.644 [2024-06-07 12:29:08.239652] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:44.644 [2024-06-07 12:29:08.242007] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:44.645 [2024-06-07 12:29:08.242194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:44.645 [2024-06-07 12:29:08.242489] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:44.645 [2024-06-07 12:29:08.242670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:44.645 pt1 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.645 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.210 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:26:45.210 "name": "raid_bdev1", 00:26:45.210 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:45.210 "strip_size_kb": 64, 00:26:45.210 "state": "configuring", 00:26:45.210 "raid_level": "concat", 00:26:45.210 "superblock": true, 00:26:45.210 "num_base_bdevs": 3, 00:26:45.210 "num_base_bdevs_discovered": 1, 00:26:45.211 "num_base_bdevs_operational": 3, 00:26:45.211 "base_bdevs_list": [ 00:26:45.211 { 00:26:45.211 "name": "pt1", 00:26:45.211 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:45.211 "is_configured": true, 00:26:45.211 "data_offset": 2048, 00:26:45.211 "data_size": 63488 00:26:45.211 }, 00:26:45.211 { 00:26:45.211 "name": null, 00:26:45.211 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:45.211 "is_configured": false, 00:26:45.211 "data_offset": 2048, 00:26:45.211 "data_size": 63488 00:26:45.211 }, 00:26:45.211 { 00:26:45.211 "name": null, 00:26:45.211 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:45.211 "is_configured": false, 00:26:45.211 "data_offset": 2048, 00:26:45.211 "data_size": 63488 00:26:45.211 } 00:26:45.211 ] 00:26:45.211 }' 00:26:45.211 12:29:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.211 12:29:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:26:45.778 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:26:45.778 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:45.778 [2024-06-07 12:29:09.407249] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:45.778 [2024-06-07 12:29:09.408990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:45.778 [2024-06-07 12:29:09.409110] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:26:45.778 [2024-06-07 12:29:09.409388] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:45.778 [2024-06-07 12:29:09.409863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:45.778 [2024-06-07 12:29:09.410037] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:45.778 [2024-06-07 12:29:09.410292] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:45.778 [2024-06-07 12:29:09.410439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:45.778 pt2 00:26:46.038 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:46.296 [2024-06-07 12:29:09.719416] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:26:46.296 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:26:46.296 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.297 12:29:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.555 12:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:46.555 "name": "raid_bdev1", 00:26:46.555 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:46.555 "strip_size_kb": 64, 00:26:46.555 "state": "configuring", 00:26:46.555 "raid_level": "concat", 00:26:46.555 "superblock": true, 00:26:46.555 "num_base_bdevs": 3, 00:26:46.555 "num_base_bdevs_discovered": 1, 00:26:46.555 "num_base_bdevs_operational": 3, 00:26:46.555 "base_bdevs_list": [ 00:26:46.555 { 00:26:46.555 "name": "pt1", 00:26:46.555 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:46.555 "is_configured": true, 00:26:46.555 "data_offset": 2048, 00:26:46.555 "data_size": 63488 00:26:46.555 }, 00:26:46.555 { 00:26:46.555 "name": null, 00:26:46.555 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:46.555 "is_configured": false, 00:26:46.555 "data_offset": 2048, 00:26:46.555 "data_size": 63488 00:26:46.555 }, 00:26:46.555 { 00:26:46.555 "name": null, 00:26:46.555 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:46.555 "is_configured": false, 00:26:46.555 "data_offset": 2048, 00:26:46.555 "data_size": 63488 00:26:46.555 } 00:26:46.555 ] 00:26:46.555 }' 00:26:46.555 12:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:46.555 12:29:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:26:47.491 12:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:47.491 12:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:47.491 12:29:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:47.491 [2024-06-07 12:29:11.095689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:47.491 [2024-06-07 12:29:11.096064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:47.491 [2024-06-07 12:29:11.096250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:26:47.491 [2024-06-07 12:29:11.096405] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:47.491 [2024-06-07 12:29:11.096867] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:47.491 [2024-06-07 12:29:11.097083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:47.491 [2024-06-07 12:29:11.097322] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:47.491 [2024-06-07 12:29:11.097394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt2 is claimed 00:26:47.491 pt2 00:26:47.491 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:47.491 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:47.491 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:26:48.059 [2024-06-07 12:29:11.403712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:26:48.059 [2024-06-07 12:29:11.404086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:48.059 [2024-06-07 12:29:11.404252] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009380 00:26:48.059 [2024-06-07 12:29:11.404385] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:48.059 [2024-06-07 12:29:11.404920] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:48.059 [2024-06-07 12:29:11.405097] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:26:48.059 [2024-06-07 12:29:11.405326] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:26:48.059 [2024-06-07 12:29:11.405451] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:26:48.059 [2024-06-07 12:29:11.405655] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008a80 00:26:48.059 [2024-06-07 12:29:11.405758] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:26:48.059 [2024-06-07 12:29:11.405883] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002870 00:26:48.059 [2024-06-07 12:29:11.406205] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008a80 00:26:48.059 [2024-06-07 12:29:11.406363] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008a80 00:26:48.059 [2024-06-07 12:29:11.406567] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:48.059 pt3 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 
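Note what just happened: no explicit bdev_raid_create was issued this time. Re-creating pt3 let the examine path find the on-disk raid superblock on all three passthru bdevs and bring raid_bdev1 back online on its own, which is the behavior the superblock (-s) flag in the earlier create was persisted to exercise. The verification that follows reduces to one RPC plus a jq filter over its output; a minimal sketch using the same RPC method and essentially the same jq filter as the trace:

rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# fetch all raid bdevs and pull out the state of raid_bdev1
state=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state')
[[ $state == online ]] || echo "unexpected state: $state" >&2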
00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.059 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.317 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.317 "name": "raid_bdev1", 00:26:48.317 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:48.317 "strip_size_kb": 64, 00:26:48.317 "state": "online", 00:26:48.317 "raid_level": "concat", 00:26:48.317 "superblock": true, 00:26:48.317 "num_base_bdevs": 3, 00:26:48.317 "num_base_bdevs_discovered": 3, 00:26:48.317 "num_base_bdevs_operational": 3, 00:26:48.317 "base_bdevs_list": [ 00:26:48.317 { 00:26:48.317 "name": "pt1", 00:26:48.317 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:48.317 "is_configured": true, 00:26:48.317 "data_offset": 2048, 00:26:48.317 "data_size": 63488 00:26:48.317 }, 00:26:48.317 { 00:26:48.317 "name": "pt2", 00:26:48.317 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:48.317 "is_configured": true, 00:26:48.317 "data_offset": 2048, 00:26:48.317 "data_size": 63488 00:26:48.317 }, 00:26:48.317 { 00:26:48.317 "name": "pt3", 00:26:48.317 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:48.317 "is_configured": true, 00:26:48.317 "data_offset": 2048, 00:26:48.317 "data_size": 63488 00:26:48.317 } 00:26:48.317 ] 00:26:48.317 }' 00:26:48.317 12:29:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.317 12:29:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:48.883 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:49.142 [2024-06-07 12:29:12.724072] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:49.142 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:49.142 "name": "raid_bdev1", 00:26:49.142 "aliases": [ 00:26:49.142 "84de772c-4a6a-4890-a70a-d4f2f8c276b5" 00:26:49.142 ], 00:26:49.142 "product_name": "Raid Volume", 00:26:49.142 "block_size": 512, 00:26:49.142 "num_blocks": 190464, 00:26:49.142 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:49.142 "assigned_rate_limits": { 00:26:49.142 "rw_ios_per_sec": 0, 00:26:49.142 "rw_mbytes_per_sec": 0, 00:26:49.142 "r_mbytes_per_sec": 0, 00:26:49.142 "w_mbytes_per_sec": 0 00:26:49.142 }, 00:26:49.142 "claimed": false, 00:26:49.142 "zoned": false, 00:26:49.142 "supported_io_types": { 00:26:49.142 "read": true, 00:26:49.142 "write": true, 00:26:49.142 "unmap": true, 00:26:49.142 "write_zeroes": true, 00:26:49.142 
"flush": true, 00:26:49.142 "reset": true, 00:26:49.142 "compare": false, 00:26:49.142 "compare_and_write": false, 00:26:49.142 "abort": false, 00:26:49.142 "nvme_admin": false, 00:26:49.142 "nvme_io": false 00:26:49.142 }, 00:26:49.142 "memory_domains": [ 00:26:49.142 { 00:26:49.142 "dma_device_id": "system", 00:26:49.142 "dma_device_type": 1 00:26:49.142 }, 00:26:49.142 { 00:26:49.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:49.142 "dma_device_type": 2 00:26:49.142 }, 00:26:49.142 { 00:26:49.142 "dma_device_id": "system", 00:26:49.142 "dma_device_type": 1 00:26:49.142 }, 00:26:49.142 { 00:26:49.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:49.142 "dma_device_type": 2 00:26:49.142 }, 00:26:49.142 { 00:26:49.142 "dma_device_id": "system", 00:26:49.142 "dma_device_type": 1 00:26:49.142 }, 00:26:49.142 { 00:26:49.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:49.142 "dma_device_type": 2 00:26:49.142 } 00:26:49.142 ], 00:26:49.142 "driver_specific": { 00:26:49.142 "raid": { 00:26:49.142 "uuid": "84de772c-4a6a-4890-a70a-d4f2f8c276b5", 00:26:49.142 "strip_size_kb": 64, 00:26:49.142 "state": "online", 00:26:49.142 "raid_level": "concat", 00:26:49.142 "superblock": true, 00:26:49.142 "num_base_bdevs": 3, 00:26:49.142 "num_base_bdevs_discovered": 3, 00:26:49.142 "num_base_bdevs_operational": 3, 00:26:49.142 "base_bdevs_list": [ 00:26:49.142 { 00:26:49.142 "name": "pt1", 00:26:49.142 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:49.142 "is_configured": true, 00:26:49.142 "data_offset": 2048, 00:26:49.142 "data_size": 63488 00:26:49.142 }, 00:26:49.142 { 00:26:49.142 "name": "pt2", 00:26:49.142 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:49.142 "is_configured": true, 00:26:49.142 "data_offset": 2048, 00:26:49.142 "data_size": 63488 00:26:49.142 }, 00:26:49.142 { 00:26:49.142 "name": "pt3", 00:26:49.142 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:49.142 "is_configured": true, 00:26:49.142 "data_offset": 2048, 00:26:49.142 "data_size": 63488 00:26:49.142 } 00:26:49.142 ] 00:26:49.142 } 00:26:49.142 } 00:26:49.142 }' 00:26:49.142 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:49.401 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:49.401 pt2 00:26:49.401 pt3' 00:26:49.401 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:49.401 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:49.401 12:29:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:49.661 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:49.661 "name": "pt1", 00:26:49.661 "aliases": [ 00:26:49.661 "00000000-0000-0000-0000-000000000001" 00:26:49.661 ], 00:26:49.661 "product_name": "passthru", 00:26:49.661 "block_size": 512, 00:26:49.661 "num_blocks": 65536, 00:26:49.661 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:49.661 "assigned_rate_limits": { 00:26:49.661 "rw_ios_per_sec": 0, 00:26:49.661 "rw_mbytes_per_sec": 0, 00:26:49.661 "r_mbytes_per_sec": 0, 00:26:49.661 "w_mbytes_per_sec": 0 00:26:49.661 }, 00:26:49.661 "claimed": true, 00:26:49.661 "claim_type": "exclusive_write", 00:26:49.661 "zoned": false, 00:26:49.661 "supported_io_types": { 00:26:49.661 "read": true, 00:26:49.661 "write": 
true, 00:26:49.661 "unmap": true, 00:26:49.661 "write_zeroes": true, 00:26:49.661 "flush": true, 00:26:49.661 "reset": true, 00:26:49.661 "compare": false, 00:26:49.661 "compare_and_write": false, 00:26:49.661 "abort": true, 00:26:49.661 "nvme_admin": false, 00:26:49.661 "nvme_io": false 00:26:49.661 }, 00:26:49.661 "memory_domains": [ 00:26:49.661 { 00:26:49.661 "dma_device_id": "system", 00:26:49.661 "dma_device_type": 1 00:26:49.661 }, 00:26:49.661 { 00:26:49.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:49.661 "dma_device_type": 2 00:26:49.661 } 00:26:49.661 ], 00:26:49.661 "driver_specific": { 00:26:49.661 "passthru": { 00:26:49.661 "name": "pt1", 00:26:49.661 "base_bdev_name": "malloc1" 00:26:49.661 } 00:26:49.661 } 00:26:49.661 }' 00:26:49.661 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:49.661 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:49.661 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:49.661 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:49.917 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:49.917 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:49.917 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:49.917 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:49.917 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:49.917 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:49.917 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:50.176 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:50.176 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:50.176 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:50.176 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:50.434 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:50.434 "name": "pt2", 00:26:50.435 "aliases": [ 00:26:50.435 "00000000-0000-0000-0000-000000000002" 00:26:50.435 ], 00:26:50.435 "product_name": "passthru", 00:26:50.435 "block_size": 512, 00:26:50.435 "num_blocks": 65536, 00:26:50.435 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:50.435 "assigned_rate_limits": { 00:26:50.435 "rw_ios_per_sec": 0, 00:26:50.435 "rw_mbytes_per_sec": 0, 00:26:50.435 "r_mbytes_per_sec": 0, 00:26:50.435 "w_mbytes_per_sec": 0 00:26:50.435 }, 00:26:50.435 "claimed": true, 00:26:50.435 "claim_type": "exclusive_write", 00:26:50.435 "zoned": false, 00:26:50.435 "supported_io_types": { 00:26:50.435 "read": true, 00:26:50.435 "write": true, 00:26:50.435 "unmap": true, 00:26:50.435 "write_zeroes": true, 00:26:50.435 "flush": true, 00:26:50.435 "reset": true, 00:26:50.435 "compare": false, 00:26:50.435 "compare_and_write": false, 00:26:50.435 "abort": true, 00:26:50.435 "nvme_admin": false, 00:26:50.435 "nvme_io": false 00:26:50.435 }, 00:26:50.435 "memory_domains": [ 00:26:50.435 { 00:26:50.435 "dma_device_id": "system", 00:26:50.435 "dma_device_type": 1 00:26:50.435 }, 00:26:50.435 
{ 00:26:50.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:50.435 "dma_device_type": 2 00:26:50.435 } 00:26:50.435 ], 00:26:50.435 "driver_specific": { 00:26:50.435 "passthru": { 00:26:50.435 "name": "pt2", 00:26:50.435 "base_bdev_name": "malloc2" 00:26:50.435 } 00:26:50.435 } 00:26:50.435 }' 00:26:50.435 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:50.435 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:50.435 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:50.435 12:29:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:50.435 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:26:50.694 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:51.344 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:51.344 "name": "pt3", 00:26:51.344 "aliases": [ 00:26:51.344 "00000000-0000-0000-0000-000000000003" 00:26:51.344 ], 00:26:51.344 "product_name": "passthru", 00:26:51.344 "block_size": 512, 00:26:51.344 "num_blocks": 65536, 00:26:51.344 "uuid": "00000000-0000-0000-0000-000000000003", 00:26:51.344 "assigned_rate_limits": { 00:26:51.344 "rw_ios_per_sec": 0, 00:26:51.344 "rw_mbytes_per_sec": 0, 00:26:51.344 "r_mbytes_per_sec": 0, 00:26:51.344 "w_mbytes_per_sec": 0 00:26:51.344 }, 00:26:51.344 "claimed": true, 00:26:51.344 "claim_type": "exclusive_write", 00:26:51.344 "zoned": false, 00:26:51.344 "supported_io_types": { 00:26:51.344 "read": true, 00:26:51.344 "write": true, 00:26:51.344 "unmap": true, 00:26:51.344 "write_zeroes": true, 00:26:51.344 "flush": true, 00:26:51.344 "reset": true, 00:26:51.344 "compare": false, 00:26:51.344 "compare_and_write": false, 00:26:51.344 "abort": true, 00:26:51.344 "nvme_admin": false, 00:26:51.345 "nvme_io": false 00:26:51.345 }, 00:26:51.345 "memory_domains": [ 00:26:51.345 { 00:26:51.345 "dma_device_id": "system", 00:26:51.345 "dma_device_type": 1 00:26:51.345 }, 00:26:51.345 { 00:26:51.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:51.345 "dma_device_type": 2 00:26:51.345 } 00:26:51.345 ], 00:26:51.345 "driver_specific": { 00:26:51.345 "passthru": { 00:26:51.345 "name": "pt3", 00:26:51.345 "base_bdev_name": "malloc3" 00:26:51.345 } 00:26:51.345 } 00:26:51.345 }' 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:51.345 12:29:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:51.912 [2024-06-07 12:29:15.260490] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 84de772c-4a6a-4890-a70a-d4f2f8c276b5 '!=' 84de772c-4a6a-4890-a70a-d4f2f8c276b5 ']' 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 207513 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 207513 ']' 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 207513 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 207513 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 207513' 00:26:51.912 killing process with pid 207513 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 207513 00:26:51.912 [2024-06-07 12:29:15.319251] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:51.912 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 207513 00:26:51.912 [2024-06-07 12:29:15.319540] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:51.912 [2024-06-07 12:29:15.319733] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:51.912 
[2024-06-07 12:29:15.319826] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state offline 00:26:51.912 [2024-06-07 12:29:15.386807] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:52.170 12:29:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:26:52.170 00:26:52.170 real 0m16.544s 00:26:52.170 user 0m30.228s 00:26:52.170 sys 0m2.660s 00:26:52.170 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:52.170 12:29:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:26:52.170 ************************************ 00:26:52.170 END TEST raid_superblock_test 00:26:52.170 ************************************ 00:26:52.429 12:29:15 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:26:52.429 12:29:15 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:26:52.429 12:29:15 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:52.429 12:29:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:52.429 ************************************ 00:26:52.429 START TEST raid_read_error_test 00:26:52.429 ************************************ 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 read 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.U1UaAGu8Ij 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=208007 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 208007 /var/tmp/spdk-raid.sock 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 208007 ']' 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:52.429 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:52.429 12:29:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:26:52.429 [2024-06-07 12:29:15.899415] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
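[Annotation, not part of the captured log] The @805-@809 lines above launch bdevperf against the shared RPC socket and block until it answers. A condensed sketch of that wiring; the binary path, flags, and mktemp log file are taken verbatim from the trace, while the redirect/background plumbing and the waitforlisten helper (from autotest_common.sh) are assumptions about the script's internals:

    # Start bdevperf in application mode (-z) on the raid socket and capture
    # its output where the failure rate will be parsed after the run:
    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 128k -q 1 -z -f -L bdev_raid > /raidtest/tmp.U1UaAGu8Ij &
    raid_pid=$!
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock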
00:26:52.429 [2024-06-07 12:29:15.900694] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208007 ] 00:26:52.429 [2024-06-07 12:29:16.050162] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:52.688 [2024-06-07 12:29:16.143866] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:52.688 [2024-06-07 12:29:16.225457] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:52.688 12:29:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:52.688 12:29:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:26:52.688 12:29:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:26:52.688 12:29:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:53.253 BaseBdev1_malloc 00:26:53.253 12:29:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:26:53.512 true 00:26:53.512 12:29:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:26:53.771 [2024-06-07 12:29:17.315555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:26:53.771 [2024-06-07 12:29:17.315938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:53.771 [2024-06-07 12:29:17.316152] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:26:53.771 [2024-06-07 12:29:17.316381] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:53.771 [2024-06-07 12:29:17.319607] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:53.771 [2024-06-07 12:29:17.319850] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:53.771 BaseBdev1 00:26:53.771 12:29:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:26:53.771 12:29:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:54.030 BaseBdev2_malloc 00:26:54.288 12:29:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:26:54.546 true 00:26:54.546 12:29:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:26:54.804 [2024-06-07 12:29:18.304743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:26:54.804 [2024-06-07 12:29:18.305170] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:54.804 [2024-06-07 12:29:18.305399] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:26:54.804 [2024-06-07 12:29:18.305582] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:54.804 [2024-06-07 12:29:18.308688] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:54.804 [2024-06-07 12:29:18.308980] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:54.804 BaseBdev2 00:26:54.804 12:29:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:26:54.804 12:29:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:55.061 BaseBdev3_malloc 00:26:55.061 12:29:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:26:55.626 true 00:26:55.626 12:29:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:26:55.885 [2024-06-07 12:29:19.283343] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:26:55.885 [2024-06-07 12:29:19.283693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:55.885 [2024-06-07 12:29:19.283875] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:26:55.885 [2024-06-07 12:29:19.284033] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:55.885 [2024-06-07 12:29:19.286521] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:55.885 [2024-06-07 12:29:19.286715] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:55.885 BaseBdev3 00:26:55.885 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:26:56.143 [2024-06-07 12:29:19.731691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:56.143 [2024-06-07 12:29:19.734215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:56.143 [2024-06-07 12:29:19.734441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:56.143 [2024-06-07 12:29:19.734742] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:26:56.143 [2024-06-07 12:29:19.734868] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:26:56.143 [2024-06-07 12:29:19.735151] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:26:56.143 [2024-06-07 12:29:19.735733] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:26:56.143 [2024-06-07 12:29:19.735858] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008180 00:26:56.143 [2024-06-07 12:29:19.736151] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:56.143 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:26:56.143 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:56.143 12:29:19 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:56.143 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:56.143 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:56.143 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:56.143 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:56.144 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:56.144 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:56.144 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:56.144 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.144 12:29:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.710 12:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:56.710 "name": "raid_bdev1", 00:26:56.710 "uuid": "f7d0706a-8b57-477d-a4c5-fd37df2b9082", 00:26:56.710 "strip_size_kb": 64, 00:26:56.710 "state": "online", 00:26:56.710 "raid_level": "concat", 00:26:56.710 "superblock": true, 00:26:56.710 "num_base_bdevs": 3, 00:26:56.710 "num_base_bdevs_discovered": 3, 00:26:56.710 "num_base_bdevs_operational": 3, 00:26:56.710 "base_bdevs_list": [ 00:26:56.710 { 00:26:56.710 "name": "BaseBdev1", 00:26:56.710 "uuid": "5632bd86-8c23-53ad-b0d8-33216f8d516f", 00:26:56.710 "is_configured": true, 00:26:56.710 "data_offset": 2048, 00:26:56.710 "data_size": 63488 00:26:56.710 }, 00:26:56.710 { 00:26:56.710 "name": "BaseBdev2", 00:26:56.710 "uuid": "af827691-59f2-5da9-a8dd-27757857d1e1", 00:26:56.710 "is_configured": true, 00:26:56.710 "data_offset": 2048, 00:26:56.710 "data_size": 63488 00:26:56.710 }, 00:26:56.710 { 00:26:56.710 "name": "BaseBdev3", 00:26:56.710 "uuid": "9d390e70-c003-5cab-8432-b92a9133bca7", 00:26:56.710 "is_configured": true, 00:26:56.710 "data_offset": 2048, 00:26:56.710 "data_size": 63488 00:26:56.710 } 00:26:56.710 ] 00:26:56.710 }' 00:26:56.710 12:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:56.710 12:29:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:26:57.276 12:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:26:57.276 12:29:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:57.276 [2024-06-07 12:29:20.828664] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:26:58.212 12:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:26:58.470 12:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:26:58.470 12:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:26:58.470 12:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:26:58.470 12:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # 
verify_raid_bdev_state raid_bdev1 online concat 64 3 00:26:58.470 12:29:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.470 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.729 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.729 "name": "raid_bdev1", 00:26:58.729 "uuid": "f7d0706a-8b57-477d-a4c5-fd37df2b9082", 00:26:58.729 "strip_size_kb": 64, 00:26:58.729 "state": "online", 00:26:58.729 "raid_level": "concat", 00:26:58.729 "superblock": true, 00:26:58.729 "num_base_bdevs": 3, 00:26:58.729 "num_base_bdevs_discovered": 3, 00:26:58.729 "num_base_bdevs_operational": 3, 00:26:58.729 "base_bdevs_list": [ 00:26:58.729 { 00:26:58.729 "name": "BaseBdev1", 00:26:58.729 "uuid": "5632bd86-8c23-53ad-b0d8-33216f8d516f", 00:26:58.729 "is_configured": true, 00:26:58.729 "data_offset": 2048, 00:26:58.729 "data_size": 63488 00:26:58.729 }, 00:26:58.729 { 00:26:58.729 "name": "BaseBdev2", 00:26:58.729 "uuid": "af827691-59f2-5da9-a8dd-27757857d1e1", 00:26:58.729 "is_configured": true, 00:26:58.729 "data_offset": 2048, 00:26:58.729 "data_size": 63488 00:26:58.729 }, 00:26:58.729 { 00:26:58.729 "name": "BaseBdev3", 00:26:58.729 "uuid": "9d390e70-c003-5cab-8432-b92a9133bca7", 00:26:58.729 "is_configured": true, 00:26:58.729 "data_offset": 2048, 00:26:58.729 "data_size": 63488 00:26:58.729 } 00:26:58.729 ] 00:26:58.729 }' 00:26:58.729 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.729 12:29:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:26:59.296 12:29:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:59.565 [2024-06-07 12:29:23.079557] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:59.566 [2024-06-07 12:29:23.079860] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:59.566 [2024-06-07 12:29:23.081334] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:59.566 [2024-06-07 12:29:23.081490] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:59.566 [2024-06-07 12:29:23.081549] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:26:59.566 [2024-06-07 12:29:23.081658] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name raid_bdev1, state offline 00:26:59.566 0 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 208007 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 208007 ']' 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 208007 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 208007 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 208007' 00:26:59.566 killing process with pid 208007 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 208007 00:26:59.566 [2024-06-07 12:29:23.131986] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:59.566 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 208007 00:26:59.566 [2024-06-07 12:29:23.180044] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.U1UaAGu8Ij 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:27:00.138 00:27:00.138 real 0m7.727s 00:27:00.138 user 0m12.529s 00:27:00.138 sys 0m1.282s 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:00.138 12:29:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:27:00.138 ************************************ 00:27:00.138 END TEST raid_read_error_test 00:27:00.138 ************************************ 00:27:00.138 12:29:23 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:27:00.138 12:29:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:00.138 12:29:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:00.138 12:29:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:00.138 ************************************ 00:27:00.138 START TEST raid_write_error_test 00:27:00.138 ************************************ 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 write 
00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.IheHaNbNIB 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=208198 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 208198 /var/tmp/spdk-raid.sock 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 208198 ']' 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 
00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:00.138 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:00.138 12:29:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:27:00.138 [2024-06-07 12:29:23.705560] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:27:00.138 [2024-06-07 12:29:23.706694] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208198 ] 00:27:00.396 [2024-06-07 12:29:23.857544] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:00.396 [2024-06-07 12:29:23.953201] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:00.396 [2024-06-07 12:29:24.040845] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:01.330 12:29:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:01.330 12:29:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:27:01.330 12:29:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:27:01.330 12:29:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:01.330 BaseBdev1_malloc 00:27:01.331 12:29:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:27:01.589 true 00:27:01.589 12:29:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:27:01.847 [2024-06-07 12:29:25.429155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:27:01.847 [2024-06-07 12:29:25.429500] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:01.847 [2024-06-07 12:29:25.429611] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:27:01.847 [2024-06-07 12:29:25.429908] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:01.847 [2024-06-07 12:29:25.432854] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:01.847 [2024-06-07 12:29:25.433063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:01.847 BaseBdev1 00:27:01.847 12:29:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:27:01.847 12:29:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:02.105 BaseBdev2_malloc 00:27:02.106 12:29:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:27:02.364 true 00:27:02.364 12:29:25 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:27:02.981 [2024-06-07 12:29:26.269788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:27:02.981 [2024-06-07 12:29:26.270145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.981 [2024-06-07 12:29:26.270361] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:27:02.981 [2024-06-07 12:29:26.270547] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.981 [2024-06-07 12:29:26.273312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.981 [2024-06-07 12:29:26.273515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:02.981 BaseBdev2 00:27:02.981 12:29:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:27:02.981 12:29:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:02.981 BaseBdev3_malloc 00:27:02.981 12:29:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:27:03.242 true 00:27:03.242 12:29:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:27:03.502 [2024-06-07 12:29:27.001037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:27:03.502 [2024-06-07 12:29:27.001393] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.502 [2024-06-07 12:29:27.001560] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:27:03.502 [2024-06-07 12:29:27.001706] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.502 [2024-06-07 12:29:27.004150] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.502 [2024-06-07 12:29:27.004349] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:03.502 BaseBdev3 00:27:03.502 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:27:03.761 [2024-06-07 12:29:27.245214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:03.761 [2024-06-07 12:29:27.247393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:03.761 [2024-06-07 12:29:27.247599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:03.761 [2024-06-07 12:29:27.247851] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:27:03.761 [2024-06-07 12:29:27.247955] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:27:03.761 [2024-06-07 12:29:27.248133] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:27:03.761 [2024-06-07 12:29:27.248513] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:27:03.761 [2024-06-07 12:29:27.248624] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008180 00:27:03.761 [2024-06-07 12:29:27.248849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.761 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.020 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.020 "name": "raid_bdev1", 00:27:04.020 "uuid": "64fb89b1-6600-4742-8fb7-9dcb97f7e8b8", 00:27:04.020 "strip_size_kb": 64, 00:27:04.020 "state": "online", 00:27:04.020 "raid_level": "concat", 00:27:04.020 "superblock": true, 00:27:04.020 "num_base_bdevs": 3, 00:27:04.020 "num_base_bdevs_discovered": 3, 00:27:04.020 "num_base_bdevs_operational": 3, 00:27:04.020 "base_bdevs_list": [ 00:27:04.020 { 00:27:04.020 "name": "BaseBdev1", 00:27:04.020 "uuid": "37473175-9619-533c-a8a4-6952be32eb9c", 00:27:04.020 "is_configured": true, 00:27:04.020 "data_offset": 2048, 00:27:04.020 "data_size": 63488 00:27:04.020 }, 00:27:04.020 { 00:27:04.020 "name": "BaseBdev2", 00:27:04.020 "uuid": "8e166c89-ea2f-5f3e-b48e-3168d6fc5e45", 00:27:04.020 "is_configured": true, 00:27:04.020 "data_offset": 2048, 00:27:04.020 "data_size": 63488 00:27:04.020 }, 00:27:04.020 { 00:27:04.020 "name": "BaseBdev3", 00:27:04.020 "uuid": "f62944b4-f695-55d9-bfac-13b18597a830", 00:27:04.020 "is_configured": true, 00:27:04.020 "data_offset": 2048, 00:27:04.020 "data_size": 63488 00:27:04.020 } 00:27:04.020 ] 00:27:04.020 }' 00:27:04.020 12:29:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.020 12:29:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:27:04.587 12:29:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:27:04.587 12:29:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:27:04.587 [2024-06-07 12:29:28.191300] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x60d000002600 00:27:05.524 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.784 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.042 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:06.042 "name": "raid_bdev1", 00:27:06.042 "uuid": "64fb89b1-6600-4742-8fb7-9dcb97f7e8b8", 00:27:06.042 "strip_size_kb": 64, 00:27:06.042 "state": "online", 00:27:06.042 "raid_level": "concat", 00:27:06.042 "superblock": true, 00:27:06.042 "num_base_bdevs": 3, 00:27:06.042 "num_base_bdevs_discovered": 3, 00:27:06.042 "num_base_bdevs_operational": 3, 00:27:06.042 "base_bdevs_list": [ 00:27:06.042 { 00:27:06.042 "name": "BaseBdev1", 00:27:06.042 "uuid": "37473175-9619-533c-a8a4-6952be32eb9c", 00:27:06.042 "is_configured": true, 00:27:06.042 "data_offset": 2048, 00:27:06.042 "data_size": 63488 00:27:06.042 }, 00:27:06.042 { 00:27:06.042 "name": "BaseBdev2", 00:27:06.042 "uuid": "8e166c89-ea2f-5f3e-b48e-3168d6fc5e45", 00:27:06.042 "is_configured": true, 00:27:06.042 "data_offset": 2048, 00:27:06.042 "data_size": 63488 00:27:06.042 }, 00:27:06.042 { 00:27:06.042 "name": "BaseBdev3", 00:27:06.042 "uuid": "f62944b4-f695-55d9-bfac-13b18597a830", 00:27:06.042 "is_configured": true, 00:27:06.042 "data_offset": 2048, 00:27:06.042 "data_size": 63488 00:27:06.042 } 00:27:06.042 ] 00:27:06.042 }' 00:27:06.042 12:29:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:06.042 12:29:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:27:06.607 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:06.865 [2024-06-07 12:29:30.335544] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:06.865 [2024-06-07 12:29:30.335847] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:06.865 [2024-06-07 12:29:30.337211] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:06.865 [2024-06-07 12:29:30.337380] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:06.865 [2024-06-07 12:29:30.337458] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:06.865 [2024-06-07 12:29:30.337566] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name raid_bdev1, state offline 00:27:06.865 0 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 208198 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 208198 ']' 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 208198 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 208198 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 208198' 00:27:06.865 killing process with pid 208198 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 208198 00:27:06.865 [2024-06-07 12:29:30.396301] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:06.865 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 208198 00:27:06.865 [2024-06-07 12:29:30.445860] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.IheHaNbNIB 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:27:07.434 00:27:07.434 real 0m7.183s 00:27:07.434 user 0m11.153s 00:27:07.434 sys 0m1.166s 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:07.434 12:29:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:27:07.434 ************************************ 00:27:07.434 END TEST raid_write_error_test 
00:27:07.434 ************************************ 00:27:07.434 12:29:30 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:27:07.434 12:29:30 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:27:07.434 12:29:30 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:07.434 12:29:30 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:07.434 12:29:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:07.434 ************************************ 00:27:07.434 START TEST raid_state_function_test 00:27:07.434 ************************************ 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 false 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:27:07.434 12:29:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=208389 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 208389' 00:27:07.434 Process raid pid: 208389 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 208389 /var/tmp/spdk-raid.sock 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 208389 ']' 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:07.434 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:07.434 12:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:07.434 [2024-06-07 12:29:30.949665] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:27:07.434 [2024-06-07 12:29:30.950178] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:07.693 [2024-06-07 12:29:31.093961] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:07.693 [2024-06-07 12:29:31.186523] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:07.693 [2024-06-07 12:29:31.267124] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:27:08.630 [2024-06-07 12:29:32.245824] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:08.630 [2024-06-07 12:29:32.246151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:08.630 [2024-06-07 12:29:32.246290] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:08.630 [2024-06-07 12:29:32.246363] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:08.630 [2024-06-07 12:29:32.246569] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:27:08.630 [2024-06-07 12:29:32.246661] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:08.630 12:29:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.630 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.631 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.631 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.631 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.631 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:08.890 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.890 "name": "Existed_Raid", 00:27:08.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.890 "strip_size_kb": 0, 00:27:08.890 "state": "configuring", 00:27:08.890 "raid_level": "raid1", 00:27:08.890 "superblock": false, 00:27:08.890 "num_base_bdevs": 3, 00:27:08.890 "num_base_bdevs_discovered": 0, 00:27:08.890 "num_base_bdevs_operational": 3, 00:27:08.890 "base_bdevs_list": [ 00:27:08.890 { 00:27:08.890 "name": "BaseBdev1", 00:27:08.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.890 "is_configured": false, 00:27:08.890 "data_offset": 0, 00:27:08.890 "data_size": 0 00:27:08.890 }, 00:27:08.890 { 00:27:08.890 "name": "BaseBdev2", 00:27:08.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.890 "is_configured": false, 00:27:08.890 "data_offset": 0, 00:27:08.890 "data_size": 0 00:27:08.890 }, 00:27:08.890 { 00:27:08.890 "name": "BaseBdev3", 00:27:08.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.890 "is_configured": false, 00:27:08.890 "data_offset": 0, 00:27:08.890 "data_size": 0 00:27:08.890 } 00:27:08.890 ] 00:27:08.890 }' 00:27:08.890 12:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.890 12:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:09.459 12:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:09.717 [2024-06-07 12:29:33.349876] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:09.717 [2024-06-07 12:29:33.350152] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:27:09.976 12:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:27:10.234 [2024-06-07 12:29:33.649941] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:10.234 
[2024-06-07 12:29:33.650258] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:10.234 [2024-06-07 12:29:33.650389] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:10.234 [2024-06-07 12:29:33.650454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:10.234 [2024-06-07 12:29:33.650599] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:27:10.234 [2024-06-07 12:29:33.650669] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:27:10.234 12:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:27:10.234 [2024-06-07 12:29:33.873714] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:10.234 BaseBdev1 00:27:10.491 12:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:10.492 12:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:27:10.492 12:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:10.492 12:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:10.492 12:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:10.492 12:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:10.492 12:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:10.751 [ 00:27:10.751 { 00:27:10.751 "name": "BaseBdev1", 00:27:10.751 "aliases": [ 00:27:10.751 "c270cd8d-6842-4081-9bfb-0dee44d321ab" 00:27:10.751 ], 00:27:10.751 "product_name": "Malloc disk", 00:27:10.751 "block_size": 512, 00:27:10.751 "num_blocks": 65536, 00:27:10.751 "uuid": "c270cd8d-6842-4081-9bfb-0dee44d321ab", 00:27:10.751 "assigned_rate_limits": { 00:27:10.751 "rw_ios_per_sec": 0, 00:27:10.751 "rw_mbytes_per_sec": 0, 00:27:10.751 "r_mbytes_per_sec": 0, 00:27:10.751 "w_mbytes_per_sec": 0 00:27:10.751 }, 00:27:10.751 "claimed": true, 00:27:10.751 "claim_type": "exclusive_write", 00:27:10.751 "zoned": false, 00:27:10.751 "supported_io_types": { 00:27:10.751 "read": true, 00:27:10.751 "write": true, 00:27:10.751 "unmap": true, 00:27:10.751 "write_zeroes": true, 00:27:10.751 "flush": true, 00:27:10.751 "reset": true, 00:27:10.751 "compare": false, 00:27:10.751 "compare_and_write": false, 00:27:10.751 "abort": true, 00:27:10.751 "nvme_admin": false, 00:27:10.751 "nvme_io": false 00:27:10.751 }, 00:27:10.751 "memory_domains": [ 00:27:10.751 { 00:27:10.751 "dma_device_id": "system", 00:27:10.751 "dma_device_type": 1 00:27:10.751 }, 00:27:10.751 { 00:27:10.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.751 "dma_device_type": 2 00:27:10.751 } 00:27:10.751 ], 00:27:10.751 "driver_specific": {} 00:27:10.751 } 00:27:10.751 ] 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:10.751 12:29:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.751 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:11.009 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:11.009 "name": "Existed_Raid", 00:27:11.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.009 "strip_size_kb": 0, 00:27:11.009 "state": "configuring", 00:27:11.009 "raid_level": "raid1", 00:27:11.009 "superblock": false, 00:27:11.009 "num_base_bdevs": 3, 00:27:11.009 "num_base_bdevs_discovered": 1, 00:27:11.009 "num_base_bdevs_operational": 3, 00:27:11.009 "base_bdevs_list": [ 00:27:11.009 { 00:27:11.009 "name": "BaseBdev1", 00:27:11.009 "uuid": "c270cd8d-6842-4081-9bfb-0dee44d321ab", 00:27:11.009 "is_configured": true, 00:27:11.009 "data_offset": 0, 00:27:11.009 "data_size": 65536 00:27:11.009 }, 00:27:11.009 { 00:27:11.009 "name": "BaseBdev2", 00:27:11.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.009 "is_configured": false, 00:27:11.009 "data_offset": 0, 00:27:11.009 "data_size": 0 00:27:11.009 }, 00:27:11.009 { 00:27:11.009 "name": "BaseBdev3", 00:27:11.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.009 "is_configured": false, 00:27:11.009 "data_offset": 0, 00:27:11.009 "data_size": 0 00:27:11.009 } 00:27:11.009 ] 00:27:11.009 }' 00:27:11.009 12:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:11.009 12:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:11.575 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:11.871 [2024-06-07 12:29:35.349963] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:11.871 [2024-06-07 12:29:35.350257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:27:11.871 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 
00:27:12.130 [2024-06-07 12:29:35.658076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:12.130 [2024-06-07 12:29:35.660406] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:12.130 [2024-06-07 12:29:35.660605] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:12.130 [2024-06-07 12:29:35.660694] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:27:12.130 [2024-06-07 12:29:35.660761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.130 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:12.389 12:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.389 "name": "Existed_Raid", 00:27:12.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.389 "strip_size_kb": 0, 00:27:12.389 "state": "configuring", 00:27:12.389 "raid_level": "raid1", 00:27:12.389 "superblock": false, 00:27:12.389 "num_base_bdevs": 3, 00:27:12.389 "num_base_bdevs_discovered": 1, 00:27:12.389 "num_base_bdevs_operational": 3, 00:27:12.389 "base_bdevs_list": [ 00:27:12.389 { 00:27:12.389 "name": "BaseBdev1", 00:27:12.389 "uuid": "c270cd8d-6842-4081-9bfb-0dee44d321ab", 00:27:12.389 "is_configured": true, 00:27:12.389 "data_offset": 0, 00:27:12.389 "data_size": 65536 00:27:12.389 }, 00:27:12.389 { 00:27:12.389 "name": "BaseBdev2", 00:27:12.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.389 "is_configured": false, 00:27:12.389 "data_offset": 0, 00:27:12.389 "data_size": 0 00:27:12.389 }, 00:27:12.389 { 00:27:12.389 "name": "BaseBdev3", 00:27:12.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.389 "is_configured": false, 00:27:12.389 "data_offset": 0, 00:27:12.389 "data_size": 0 00:27:12.389 } 00:27:12.389 ] 00:27:12.389 }' 00:27:12.389 12:29:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.389 12:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:12.956 12:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:27:13.214 [2024-06-07 12:29:36.847442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:13.214 BaseBdev2 00:27:13.473 12:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:13.473 12:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:27:13.473 12:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:13.473 12:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:13.473 12:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:13.473 12:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:13.473 12:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:13.732 12:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:13.990 [ 00:27:13.990 { 00:27:13.990 "name": "BaseBdev2", 00:27:13.990 "aliases": [ 00:27:13.990 "92fe509f-fa9e-4196-afa3-e96c96d65186" 00:27:13.990 ], 00:27:13.990 "product_name": "Malloc disk", 00:27:13.990 "block_size": 512, 00:27:13.990 "num_blocks": 65536, 00:27:13.990 "uuid": "92fe509f-fa9e-4196-afa3-e96c96d65186", 00:27:13.990 "assigned_rate_limits": { 00:27:13.990 "rw_ios_per_sec": 0, 00:27:13.990 "rw_mbytes_per_sec": 0, 00:27:13.990 "r_mbytes_per_sec": 0, 00:27:13.990 "w_mbytes_per_sec": 0 00:27:13.990 }, 00:27:13.990 "claimed": true, 00:27:13.990 "claim_type": "exclusive_write", 00:27:13.990 "zoned": false, 00:27:13.990 "supported_io_types": { 00:27:13.990 "read": true, 00:27:13.990 "write": true, 00:27:13.990 "unmap": true, 00:27:13.990 "write_zeroes": true, 00:27:13.990 "flush": true, 00:27:13.990 "reset": true, 00:27:13.990 "compare": false, 00:27:13.990 "compare_and_write": false, 00:27:13.990 "abort": true, 00:27:13.990 "nvme_admin": false, 00:27:13.990 "nvme_io": false 00:27:13.990 }, 00:27:13.990 "memory_domains": [ 00:27:13.990 { 00:27:13.990 "dma_device_id": "system", 00:27:13.990 "dma_device_type": 1 00:27:13.990 }, 00:27:13.990 { 00:27:13.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.990 "dma_device_type": 2 00:27:13.990 } 00:27:13.990 ], 00:27:13.990 "driver_specific": {} 00:27:13.990 } 00:27:13.990 ] 00:27:13.990 12:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:13.990 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:13.990 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:13.991 12:29:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:13.991 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.249 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:14.249 "name": "Existed_Raid", 00:27:14.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.249 "strip_size_kb": 0, 00:27:14.249 "state": "configuring", 00:27:14.249 "raid_level": "raid1", 00:27:14.249 "superblock": false, 00:27:14.249 "num_base_bdevs": 3, 00:27:14.249 "num_base_bdevs_discovered": 2, 00:27:14.249 "num_base_bdevs_operational": 3, 00:27:14.249 "base_bdevs_list": [ 00:27:14.249 { 00:27:14.249 "name": "BaseBdev1", 00:27:14.249 "uuid": "c270cd8d-6842-4081-9bfb-0dee44d321ab", 00:27:14.249 "is_configured": true, 00:27:14.249 "data_offset": 0, 00:27:14.249 "data_size": 65536 00:27:14.249 }, 00:27:14.249 { 00:27:14.249 "name": "BaseBdev2", 00:27:14.249 "uuid": "92fe509f-fa9e-4196-afa3-e96c96d65186", 00:27:14.249 "is_configured": true, 00:27:14.249 "data_offset": 0, 00:27:14.249 "data_size": 65536 00:27:14.249 }, 00:27:14.249 { 00:27:14.249 "name": "BaseBdev3", 00:27:14.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.249 "is_configured": false, 00:27:14.249 "data_offset": 0, 00:27:14.249 "data_size": 0 00:27:14.249 } 00:27:14.249 ] 00:27:14.249 }' 00:27:14.249 12:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:14.249 12:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:14.815 12:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:27:15.074 [2024-06-07 12:29:38.637148] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:15.074 [2024-06-07 12:29:38.637467] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:27:15.074 [2024-06-07 12:29:38.637517] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:27:15.074 [2024-06-07 12:29:38.637766] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000021f0 00:27:15.074 [2024-06-07 12:29:38.638266] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:27:15.074 [2024-06-07 12:29:38.638392] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, 
raid_bdev 0x616000006080 00:27:15.074 [2024-06-07 12:29:38.638711] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.074 BaseBdev3 00:27:15.074 12:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:27:15.074 12:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:27:15.074 12:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:15.074 12:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:15.074 12:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:15.074 12:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:15.074 12:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:15.332 12:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:27:15.590 [ 00:27:15.590 { 00:27:15.590 "name": "BaseBdev3", 00:27:15.590 "aliases": [ 00:27:15.590 "93b8eae8-7b33-44bd-8f4f-18c694a33edb" 00:27:15.590 ], 00:27:15.590 "product_name": "Malloc disk", 00:27:15.590 "block_size": 512, 00:27:15.590 "num_blocks": 65536, 00:27:15.590 "uuid": "93b8eae8-7b33-44bd-8f4f-18c694a33edb", 00:27:15.590 "assigned_rate_limits": { 00:27:15.590 "rw_ios_per_sec": 0, 00:27:15.590 "rw_mbytes_per_sec": 0, 00:27:15.590 "r_mbytes_per_sec": 0, 00:27:15.590 "w_mbytes_per_sec": 0 00:27:15.590 }, 00:27:15.590 "claimed": true, 00:27:15.590 "claim_type": "exclusive_write", 00:27:15.590 "zoned": false, 00:27:15.590 "supported_io_types": { 00:27:15.590 "read": true, 00:27:15.590 "write": true, 00:27:15.590 "unmap": true, 00:27:15.590 "write_zeroes": true, 00:27:15.590 "flush": true, 00:27:15.590 "reset": true, 00:27:15.590 "compare": false, 00:27:15.590 "compare_and_write": false, 00:27:15.590 "abort": true, 00:27:15.590 "nvme_admin": false, 00:27:15.590 "nvme_io": false 00:27:15.590 }, 00:27:15.590 "memory_domains": [ 00:27:15.590 { 00:27:15.590 "dma_device_id": "system", 00:27:15.590 "dma_device_type": 1 00:27:15.590 }, 00:27:15.590 { 00:27:15.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:15.590 "dma_device_type": 2 00:27:15.590 } 00:27:15.590 ], 00:27:15.590 "driver_specific": {} 00:27:15.590 } 00:27:15.590 ] 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:15.590 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.591 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.591 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.591 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.591 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:15.591 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.157 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.157 "name": "Existed_Raid", 00:27:16.157 "uuid": "14e91a19-54ed-4ecc-8bf3-4b1ecd841b7b", 00:27:16.157 "strip_size_kb": 0, 00:27:16.157 "state": "online", 00:27:16.157 "raid_level": "raid1", 00:27:16.157 "superblock": false, 00:27:16.157 "num_base_bdevs": 3, 00:27:16.157 "num_base_bdevs_discovered": 3, 00:27:16.157 "num_base_bdevs_operational": 3, 00:27:16.157 "base_bdevs_list": [ 00:27:16.157 { 00:27:16.157 "name": "BaseBdev1", 00:27:16.157 "uuid": "c270cd8d-6842-4081-9bfb-0dee44d321ab", 00:27:16.157 "is_configured": true, 00:27:16.157 "data_offset": 0, 00:27:16.157 "data_size": 65536 00:27:16.157 }, 00:27:16.157 { 00:27:16.157 "name": "BaseBdev2", 00:27:16.157 "uuid": "92fe509f-fa9e-4196-afa3-e96c96d65186", 00:27:16.157 "is_configured": true, 00:27:16.157 "data_offset": 0, 00:27:16.157 "data_size": 65536 00:27:16.157 }, 00:27:16.157 { 00:27:16.157 "name": "BaseBdev3", 00:27:16.157 "uuid": "93b8eae8-7b33-44bd-8f4f-18c694a33edb", 00:27:16.157 "is_configured": true, 00:27:16.157 "data_offset": 0, 00:27:16.157 "data_size": 65536 00:27:16.157 } 00:27:16.157 ] 00:27:16.157 }' 00:27:16.157 12:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.157 12:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:16.436 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:16.705 [2024-06-07 12:29:40.341716] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:16.965 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:16.965 "name": "Existed_Raid", 00:27:16.965 "aliases": [ 00:27:16.965 "14e91a19-54ed-4ecc-8bf3-4b1ecd841b7b" 00:27:16.965 ], 00:27:16.965 
"product_name": "Raid Volume", 00:27:16.965 "block_size": 512, 00:27:16.965 "num_blocks": 65536, 00:27:16.965 "uuid": "14e91a19-54ed-4ecc-8bf3-4b1ecd841b7b", 00:27:16.965 "assigned_rate_limits": { 00:27:16.965 "rw_ios_per_sec": 0, 00:27:16.965 "rw_mbytes_per_sec": 0, 00:27:16.965 "r_mbytes_per_sec": 0, 00:27:16.965 "w_mbytes_per_sec": 0 00:27:16.965 }, 00:27:16.965 "claimed": false, 00:27:16.965 "zoned": false, 00:27:16.965 "supported_io_types": { 00:27:16.965 "read": true, 00:27:16.965 "write": true, 00:27:16.965 "unmap": false, 00:27:16.965 "write_zeroes": true, 00:27:16.965 "flush": false, 00:27:16.965 "reset": true, 00:27:16.965 "compare": false, 00:27:16.965 "compare_and_write": false, 00:27:16.965 "abort": false, 00:27:16.965 "nvme_admin": false, 00:27:16.965 "nvme_io": false 00:27:16.965 }, 00:27:16.965 "memory_domains": [ 00:27:16.965 { 00:27:16.965 "dma_device_id": "system", 00:27:16.965 "dma_device_type": 1 00:27:16.965 }, 00:27:16.965 { 00:27:16.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:16.965 "dma_device_type": 2 00:27:16.965 }, 00:27:16.965 { 00:27:16.965 "dma_device_id": "system", 00:27:16.965 "dma_device_type": 1 00:27:16.965 }, 00:27:16.965 { 00:27:16.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:16.965 "dma_device_type": 2 00:27:16.965 }, 00:27:16.965 { 00:27:16.965 "dma_device_id": "system", 00:27:16.965 "dma_device_type": 1 00:27:16.965 }, 00:27:16.965 { 00:27:16.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:16.965 "dma_device_type": 2 00:27:16.965 } 00:27:16.965 ], 00:27:16.965 "driver_specific": { 00:27:16.965 "raid": { 00:27:16.965 "uuid": "14e91a19-54ed-4ecc-8bf3-4b1ecd841b7b", 00:27:16.965 "strip_size_kb": 0, 00:27:16.965 "state": "online", 00:27:16.965 "raid_level": "raid1", 00:27:16.965 "superblock": false, 00:27:16.965 "num_base_bdevs": 3, 00:27:16.965 "num_base_bdevs_discovered": 3, 00:27:16.965 "num_base_bdevs_operational": 3, 00:27:16.965 "base_bdevs_list": [ 00:27:16.965 { 00:27:16.965 "name": "BaseBdev1", 00:27:16.965 "uuid": "c270cd8d-6842-4081-9bfb-0dee44d321ab", 00:27:16.965 "is_configured": true, 00:27:16.965 "data_offset": 0, 00:27:16.965 "data_size": 65536 00:27:16.965 }, 00:27:16.965 { 00:27:16.965 "name": "BaseBdev2", 00:27:16.965 "uuid": "92fe509f-fa9e-4196-afa3-e96c96d65186", 00:27:16.965 "is_configured": true, 00:27:16.965 "data_offset": 0, 00:27:16.965 "data_size": 65536 00:27:16.965 }, 00:27:16.965 { 00:27:16.965 "name": "BaseBdev3", 00:27:16.965 "uuid": "93b8eae8-7b33-44bd-8f4f-18c694a33edb", 00:27:16.965 "is_configured": true, 00:27:16.965 "data_offset": 0, 00:27:16.965 "data_size": 65536 00:27:16.965 } 00:27:16.965 ] 00:27:16.965 } 00:27:16.965 } 00:27:16.965 }' 00:27:16.965 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:16.965 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:16.965 BaseBdev2 00:27:16.965 BaseBdev3' 00:27:16.965 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:16.965 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:16.965 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:17.224 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:17.224 "name": "BaseBdev1", 
00:27:17.224 "aliases": [ 00:27:17.224 "c270cd8d-6842-4081-9bfb-0dee44d321ab" 00:27:17.224 ], 00:27:17.224 "product_name": "Malloc disk", 00:27:17.224 "block_size": 512, 00:27:17.224 "num_blocks": 65536, 00:27:17.224 "uuid": "c270cd8d-6842-4081-9bfb-0dee44d321ab", 00:27:17.224 "assigned_rate_limits": { 00:27:17.224 "rw_ios_per_sec": 0, 00:27:17.224 "rw_mbytes_per_sec": 0, 00:27:17.224 "r_mbytes_per_sec": 0, 00:27:17.224 "w_mbytes_per_sec": 0 00:27:17.224 }, 00:27:17.224 "claimed": true, 00:27:17.224 "claim_type": "exclusive_write", 00:27:17.224 "zoned": false, 00:27:17.224 "supported_io_types": { 00:27:17.224 "read": true, 00:27:17.224 "write": true, 00:27:17.224 "unmap": true, 00:27:17.224 "write_zeroes": true, 00:27:17.224 "flush": true, 00:27:17.224 "reset": true, 00:27:17.224 "compare": false, 00:27:17.224 "compare_and_write": false, 00:27:17.224 "abort": true, 00:27:17.224 "nvme_admin": false, 00:27:17.224 "nvme_io": false 00:27:17.224 }, 00:27:17.224 "memory_domains": [ 00:27:17.224 { 00:27:17.224 "dma_device_id": "system", 00:27:17.224 "dma_device_type": 1 00:27:17.224 }, 00:27:17.224 { 00:27:17.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:17.224 "dma_device_type": 2 00:27:17.224 } 00:27:17.224 ], 00:27:17.224 "driver_specific": {} 00:27:17.224 }' 00:27:17.224 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.224 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.224 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:17.224 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.224 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:17.483 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:17.483 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.483 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:17.483 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:17.483 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:17.483 12:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:17.483 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:17.483 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:17.483 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:17.483 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:17.741 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:17.741 "name": "BaseBdev2", 00:27:17.741 "aliases": [ 00:27:17.741 "92fe509f-fa9e-4196-afa3-e96c96d65186" 00:27:17.741 ], 00:27:17.741 "product_name": "Malloc disk", 00:27:17.741 "block_size": 512, 00:27:17.741 "num_blocks": 65536, 00:27:17.741 "uuid": "92fe509f-fa9e-4196-afa3-e96c96d65186", 00:27:17.741 "assigned_rate_limits": { 00:27:17.741 "rw_ios_per_sec": 0, 00:27:17.741 "rw_mbytes_per_sec": 0, 00:27:17.741 "r_mbytes_per_sec": 0, 00:27:17.741 "w_mbytes_per_sec": 0 00:27:17.741 }, 00:27:17.741 "claimed": true, 
00:27:17.741 "claim_type": "exclusive_write", 00:27:17.741 "zoned": false, 00:27:17.741 "supported_io_types": { 00:27:17.741 "read": true, 00:27:17.741 "write": true, 00:27:17.741 "unmap": true, 00:27:17.741 "write_zeroes": true, 00:27:17.741 "flush": true, 00:27:17.741 "reset": true, 00:27:17.741 "compare": false, 00:27:17.741 "compare_and_write": false, 00:27:17.741 "abort": true, 00:27:17.741 "nvme_admin": false, 00:27:17.741 "nvme_io": false 00:27:17.741 }, 00:27:17.741 "memory_domains": [ 00:27:17.741 { 00:27:17.741 "dma_device_id": "system", 00:27:17.741 "dma_device_type": 1 00:27:17.741 }, 00:27:17.741 { 00:27:17.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:17.741 "dma_device_type": 2 00:27:17.741 } 00:27:17.741 ], 00:27:17.741 "driver_specific": {} 00:27:17.741 }' 00:27:17.741 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:17.741 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.000 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.258 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:18.258 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:18.258 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:27:18.258 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:18.258 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:18.259 "name": "BaseBdev3", 00:27:18.259 "aliases": [ 00:27:18.259 "93b8eae8-7b33-44bd-8f4f-18c694a33edb" 00:27:18.259 ], 00:27:18.259 "product_name": "Malloc disk", 00:27:18.259 "block_size": 512, 00:27:18.259 "num_blocks": 65536, 00:27:18.259 "uuid": "93b8eae8-7b33-44bd-8f4f-18c694a33edb", 00:27:18.259 "assigned_rate_limits": { 00:27:18.259 "rw_ios_per_sec": 0, 00:27:18.259 "rw_mbytes_per_sec": 0, 00:27:18.259 "r_mbytes_per_sec": 0, 00:27:18.259 "w_mbytes_per_sec": 0 00:27:18.259 }, 00:27:18.259 "claimed": true, 00:27:18.259 "claim_type": "exclusive_write", 00:27:18.259 "zoned": false, 00:27:18.259 "supported_io_types": { 00:27:18.259 "read": true, 00:27:18.259 "write": true, 00:27:18.259 "unmap": true, 00:27:18.259 "write_zeroes": true, 00:27:18.259 "flush": true, 00:27:18.259 "reset": true, 00:27:18.259 "compare": false, 00:27:18.259 "compare_and_write": false, 00:27:18.259 "abort": true, 00:27:18.259 "nvme_admin": false, 00:27:18.259 "nvme_io": false 00:27:18.259 }, 00:27:18.259 "memory_domains": [ 
00:27:18.259 { 00:27:18.259 "dma_device_id": "system", 00:27:18.259 "dma_device_type": 1 00:27:18.259 }, 00:27:18.259 { 00:27:18.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:18.259 "dma_device_type": 2 00:27:18.259 } 00:27:18.259 ], 00:27:18.259 "driver_specific": {} 00:27:18.259 }' 00:27:18.259 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:18.517 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:18.517 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:18.517 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.517 12:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.517 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:18.517 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.517 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.517 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:18.517 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.776 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.776 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:18.776 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:19.035 [2024-06-07 12:29:42.541990] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:19.035 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:19.036 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:19.036 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:19.036 12:29:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.036 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:19.294 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:19.294 "name": "Existed_Raid", 00:27:19.294 "uuid": "14e91a19-54ed-4ecc-8bf3-4b1ecd841b7b", 00:27:19.294 "strip_size_kb": 0, 00:27:19.294 "state": "online", 00:27:19.294 "raid_level": "raid1", 00:27:19.294 "superblock": false, 00:27:19.294 "num_base_bdevs": 3, 00:27:19.294 "num_base_bdevs_discovered": 2, 00:27:19.294 "num_base_bdevs_operational": 2, 00:27:19.294 "base_bdevs_list": [ 00:27:19.294 { 00:27:19.294 "name": null, 00:27:19.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.294 "is_configured": false, 00:27:19.294 "data_offset": 0, 00:27:19.294 "data_size": 65536 00:27:19.294 }, 00:27:19.294 { 00:27:19.294 "name": "BaseBdev2", 00:27:19.294 "uuid": "92fe509f-fa9e-4196-afa3-e96c96d65186", 00:27:19.294 "is_configured": true, 00:27:19.294 "data_offset": 0, 00:27:19.294 "data_size": 65536 00:27:19.294 }, 00:27:19.294 { 00:27:19.294 "name": "BaseBdev3", 00:27:19.294 "uuid": "93b8eae8-7b33-44bd-8f4f-18c694a33edb", 00:27:19.294 "is_configured": true, 00:27:19.294 "data_offset": 0, 00:27:19.294 "data_size": 65536 00:27:19.294 } 00:27:19.294 ] 00:27:19.294 }' 00:27:19.294 12:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:19.294 12:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:19.862 12:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:19.862 12:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:19.862 12:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:19.862 12:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.121 12:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:20.121 12:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:20.121 12:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:20.380 [2024-06-07 12:29:43.975972] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:20.380 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:20.380 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:20.380 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:20.639 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.639 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:20.639 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:20.639 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:27:20.897 [2024-06-07 12:29:44.491851] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:27:20.897 [2024-06-07 12:29:44.492196] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:20.897 [2024-06-07 12:29:44.514862] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:20.897 [2024-06-07 12:29:44.515115] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:20.897 [2024-06-07 12:29:44.515240] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:27:20.897 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:20.897 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:20.897 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:20.897 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.465 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:21.465 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:21.465 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:27:21.465 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:27:21.465 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:27:21.465 12:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:27:21.726 BaseBdev2 00:27:21.726 12:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:27:21.726 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:27:21.726 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:21.726 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:21.726 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:21.726 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:21.726 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:21.984 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:22.242 [ 00:27:22.242 { 00:27:22.242 "name": "BaseBdev2", 00:27:22.242 "aliases": [ 00:27:22.242 "7b786bb8-2b25-4098-9e8a-0341adb96bbb" 00:27:22.242 ], 00:27:22.242 "product_name": "Malloc disk", 00:27:22.242 "block_size": 512, 00:27:22.242 "num_blocks": 65536, 00:27:22.242 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:22.242 "assigned_rate_limits": { 00:27:22.242 "rw_ios_per_sec": 0, 00:27:22.242 "rw_mbytes_per_sec": 0, 00:27:22.242 "r_mbytes_per_sec": 0, 00:27:22.242 "w_mbytes_per_sec": 0 00:27:22.242 }, 
00:27:22.242 "claimed": false, 00:27:22.242 "zoned": false, 00:27:22.242 "supported_io_types": { 00:27:22.242 "read": true, 00:27:22.242 "write": true, 00:27:22.242 "unmap": true, 00:27:22.242 "write_zeroes": true, 00:27:22.242 "flush": true, 00:27:22.242 "reset": true, 00:27:22.242 "compare": false, 00:27:22.242 "compare_and_write": false, 00:27:22.242 "abort": true, 00:27:22.242 "nvme_admin": false, 00:27:22.242 "nvme_io": false 00:27:22.242 }, 00:27:22.242 "memory_domains": [ 00:27:22.242 { 00:27:22.242 "dma_device_id": "system", 00:27:22.242 "dma_device_type": 1 00:27:22.242 }, 00:27:22.242 { 00:27:22.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.242 "dma_device_type": 2 00:27:22.242 } 00:27:22.242 ], 00:27:22.242 "driver_specific": {} 00:27:22.242 } 00:27:22.242 ] 00:27:22.501 12:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:22.501 12:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:27:22.501 12:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:27:22.501 12:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:27:22.501 BaseBdev3 00:27:22.501 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:27:22.501 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:27:22.501 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:22.501 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:22.501 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:22.501 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:22.501 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:22.759 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:27:23.017 [ 00:27:23.017 { 00:27:23.017 "name": "BaseBdev3", 00:27:23.017 "aliases": [ 00:27:23.017 "59355547-72a7-4f9f-81b7-4ce11b15fcd9" 00:27:23.017 ], 00:27:23.017 "product_name": "Malloc disk", 00:27:23.017 "block_size": 512, 00:27:23.017 "num_blocks": 65536, 00:27:23.017 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:23.017 "assigned_rate_limits": { 00:27:23.017 "rw_ios_per_sec": 0, 00:27:23.017 "rw_mbytes_per_sec": 0, 00:27:23.017 "r_mbytes_per_sec": 0, 00:27:23.017 "w_mbytes_per_sec": 0 00:27:23.017 }, 00:27:23.017 "claimed": false, 00:27:23.017 "zoned": false, 00:27:23.017 "supported_io_types": { 00:27:23.017 "read": true, 00:27:23.017 "write": true, 00:27:23.017 "unmap": true, 00:27:23.017 "write_zeroes": true, 00:27:23.017 "flush": true, 00:27:23.017 "reset": true, 00:27:23.017 "compare": false, 00:27:23.017 "compare_and_write": false, 00:27:23.017 "abort": true, 00:27:23.017 "nvme_admin": false, 00:27:23.017 "nvme_io": false 00:27:23.017 }, 00:27:23.017 "memory_domains": [ 00:27:23.017 { 00:27:23.017 "dma_device_id": "system", 00:27:23.017 "dma_device_type": 1 00:27:23.017 }, 00:27:23.017 { 00:27:23.017 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:27:23.017 "dma_device_type": 2 00:27:23.017 } 00:27:23.017 ], 00:27:23.017 "driver_specific": {} 00:27:23.017 } 00:27:23.017 ] 00:27:23.017 12:29:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:23.017 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:27:23.017 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:27:23.017 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:27:23.276 [2024-06-07 12:29:46.869753] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:23.276 [2024-06-07 12:29:46.870153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:23.276 [2024-06-07 12:29:46.870367] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:23.276 [2024-06-07 12:29:46.872778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.276 12:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:23.535 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.535 "name": "Existed_Raid", 00:27:23.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.535 "strip_size_kb": 0, 00:27:23.535 "state": "configuring", 00:27:23.535 "raid_level": "raid1", 00:27:23.535 "superblock": false, 00:27:23.535 "num_base_bdevs": 3, 00:27:23.535 "num_base_bdevs_discovered": 2, 00:27:23.535 "num_base_bdevs_operational": 3, 00:27:23.535 "base_bdevs_list": [ 00:27:23.535 { 00:27:23.535 "name": "BaseBdev1", 00:27:23.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.535 "is_configured": false, 00:27:23.535 "data_offset": 0, 00:27:23.535 "data_size": 0 00:27:23.535 }, 00:27:23.535 { 00:27:23.535 "name": "BaseBdev2", 00:27:23.535 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:23.535 "is_configured": 
true, 00:27:23.535 "data_offset": 0, 00:27:23.535 "data_size": 65536 00:27:23.535 }, 00:27:23.535 { 00:27:23.535 "name": "BaseBdev3", 00:27:23.535 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:23.535 "is_configured": true, 00:27:23.535 "data_offset": 0, 00:27:23.535 "data_size": 65536 00:27:23.535 } 00:27:23.535 ] 00:27:23.535 }' 00:27:23.535 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.535 12:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:24.103 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:24.363 [2024-06-07 12:29:47.905838] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:24.363 12:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.645 12:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:24.645 "name": "Existed_Raid", 00:27:24.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.645 "strip_size_kb": 0, 00:27:24.645 "state": "configuring", 00:27:24.645 "raid_level": "raid1", 00:27:24.645 "superblock": false, 00:27:24.645 "num_base_bdevs": 3, 00:27:24.645 "num_base_bdevs_discovered": 1, 00:27:24.645 "num_base_bdevs_operational": 3, 00:27:24.645 "base_bdevs_list": [ 00:27:24.645 { 00:27:24.645 "name": "BaseBdev1", 00:27:24.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.645 "is_configured": false, 00:27:24.645 "data_offset": 0, 00:27:24.645 "data_size": 0 00:27:24.645 }, 00:27:24.645 { 00:27:24.645 "name": null, 00:27:24.645 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:24.645 "is_configured": false, 00:27:24.645 "data_offset": 0, 00:27:24.645 "data_size": 65536 00:27:24.645 }, 00:27:24.645 { 00:27:24.645 "name": "BaseBdev3", 00:27:24.645 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:24.645 "is_configured": true, 00:27:24.645 "data_offset": 0, 00:27:24.645 "data_size": 65536 00:27:24.645 } 00:27:24.645 ] 00:27:24.645 }' 00:27:24.645 12:29:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:24.645 12:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:25.212 12:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:27:25.212 12:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.471 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:27:25.471 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:27:25.729 [2024-06-07 12:29:49.213475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:25.729 BaseBdev1 00:27:25.729 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:27:25.729 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:27:25.729 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:25.729 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:25.729 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:25.729 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:25.729 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:25.988 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:26.246 [ 00:27:26.246 { 00:27:26.246 "name": "BaseBdev1", 00:27:26.246 "aliases": [ 00:27:26.246 "07d94c20-c8c6-44bf-b86a-79868217e272" 00:27:26.246 ], 00:27:26.246 "product_name": "Malloc disk", 00:27:26.246 "block_size": 512, 00:27:26.246 "num_blocks": 65536, 00:27:26.246 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:26.246 "assigned_rate_limits": { 00:27:26.246 "rw_ios_per_sec": 0, 00:27:26.246 "rw_mbytes_per_sec": 0, 00:27:26.246 "r_mbytes_per_sec": 0, 00:27:26.246 "w_mbytes_per_sec": 0 00:27:26.246 }, 00:27:26.246 "claimed": true, 00:27:26.246 "claim_type": "exclusive_write", 00:27:26.246 "zoned": false, 00:27:26.246 "supported_io_types": { 00:27:26.246 "read": true, 00:27:26.246 "write": true, 00:27:26.246 "unmap": true, 00:27:26.246 "write_zeroes": true, 00:27:26.246 "flush": true, 00:27:26.246 "reset": true, 00:27:26.246 "compare": false, 00:27:26.246 "compare_and_write": false, 00:27:26.246 "abort": true, 00:27:26.246 "nvme_admin": false, 00:27:26.246 "nvme_io": false 00:27:26.246 }, 00:27:26.246 "memory_domains": [ 00:27:26.246 { 00:27:26.246 "dma_device_id": "system", 00:27:26.246 "dma_device_type": 1 00:27:26.246 }, 00:27:26.246 { 00:27:26.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:26.246 "dma_device_type": 2 00:27:26.246 } 00:27:26.246 ], 00:27:26.246 "driver_specific": {} 00:27:26.246 } 00:27:26.246 ] 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 
-- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.246 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:26.505 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.505 "name": "Existed_Raid", 00:27:26.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.505 "strip_size_kb": 0, 00:27:26.505 "state": "configuring", 00:27:26.505 "raid_level": "raid1", 00:27:26.505 "superblock": false, 00:27:26.505 "num_base_bdevs": 3, 00:27:26.505 "num_base_bdevs_discovered": 2, 00:27:26.505 "num_base_bdevs_operational": 3, 00:27:26.505 "base_bdevs_list": [ 00:27:26.505 { 00:27:26.505 "name": "BaseBdev1", 00:27:26.505 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:26.505 "is_configured": true, 00:27:26.505 "data_offset": 0, 00:27:26.505 "data_size": 65536 00:27:26.505 }, 00:27:26.505 { 00:27:26.505 "name": null, 00:27:26.505 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:26.505 "is_configured": false, 00:27:26.505 "data_offset": 0, 00:27:26.505 "data_size": 65536 00:27:26.505 }, 00:27:26.505 { 00:27:26.505 "name": "BaseBdev3", 00:27:26.505 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:26.505 "is_configured": true, 00:27:26.505 "data_offset": 0, 00:27:26.505 "data_size": 65536 00:27:26.505 } 00:27:26.505 ] 00:27:26.505 }' 00:27:26.505 12:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.505 12:29:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:27.095 12:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.095 12:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:27:27.354 12:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:27:27.354 12:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:27:27.613 [2024-06-07 12:29:51.085022] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:27:27.613 12:29:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:27.613 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.872 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:27.872 "name": "Existed_Raid", 00:27:27.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.872 "strip_size_kb": 0, 00:27:27.872 "state": "configuring", 00:27:27.872 "raid_level": "raid1", 00:27:27.872 "superblock": false, 00:27:27.872 "num_base_bdevs": 3, 00:27:27.872 "num_base_bdevs_discovered": 1, 00:27:27.872 "num_base_bdevs_operational": 3, 00:27:27.872 "base_bdevs_list": [ 00:27:27.872 { 00:27:27.872 "name": "BaseBdev1", 00:27:27.872 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:27.872 "is_configured": true, 00:27:27.872 "data_offset": 0, 00:27:27.872 "data_size": 65536 00:27:27.872 }, 00:27:27.872 { 00:27:27.872 "name": null, 00:27:27.872 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:27.872 "is_configured": false, 00:27:27.872 "data_offset": 0, 00:27:27.872 "data_size": 65536 00:27:27.872 }, 00:27:27.872 { 00:27:27.872 "name": null, 00:27:27.872 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:27.872 "is_configured": false, 00:27:27.872 "data_offset": 0, 00:27:27.872 "data_size": 65536 00:27:27.872 } 00:27:27.872 ] 00:27:27.872 }' 00:27:27.872 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:27.872 12:29:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:28.438 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.438 12:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:27:28.697 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:27:28.697 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:27:28.955 [2024-06-07 12:29:52.425273] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:28.955 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.214 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.214 "name": "Existed_Raid", 00:27:29.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.214 "strip_size_kb": 0, 00:27:29.214 "state": "configuring", 00:27:29.214 "raid_level": "raid1", 00:27:29.214 "superblock": false, 00:27:29.214 "num_base_bdevs": 3, 00:27:29.214 "num_base_bdevs_discovered": 2, 00:27:29.214 "num_base_bdevs_operational": 3, 00:27:29.214 "base_bdevs_list": [ 00:27:29.214 { 00:27:29.214 "name": "BaseBdev1", 00:27:29.214 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:29.214 "is_configured": true, 00:27:29.214 "data_offset": 0, 00:27:29.214 "data_size": 65536 00:27:29.214 }, 00:27:29.214 { 00:27:29.214 "name": null, 00:27:29.214 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:29.214 "is_configured": false, 00:27:29.214 "data_offset": 0, 00:27:29.214 "data_size": 65536 00:27:29.214 }, 00:27:29.214 { 00:27:29.214 "name": "BaseBdev3", 00:27:29.214 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:29.214 "is_configured": true, 00:27:29.214 "data_offset": 0, 00:27:29.214 "data_size": 65536 00:27:29.214 } 00:27:29.214 ] 00:27:29.214 }' 00:27:29.214 12:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:29.214 12:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:29.780 12:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:27:29.780 12:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.039 12:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:27:30.039 12:29:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:27:30.607 [2024-06-07 12:29:53.969443] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.607 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:30.865 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:30.865 "name": "Existed_Raid", 00:27:30.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.865 "strip_size_kb": 0, 00:27:30.865 "state": "configuring", 00:27:30.865 "raid_level": "raid1", 00:27:30.865 "superblock": false, 00:27:30.865 "num_base_bdevs": 3, 00:27:30.865 "num_base_bdevs_discovered": 1, 00:27:30.865 "num_base_bdevs_operational": 3, 00:27:30.865 "base_bdevs_list": [ 00:27:30.865 { 00:27:30.865 "name": null, 00:27:30.865 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:30.865 "is_configured": false, 00:27:30.865 "data_offset": 0, 00:27:30.865 "data_size": 65536 00:27:30.865 }, 00:27:30.865 { 00:27:30.865 "name": null, 00:27:30.865 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:30.865 "is_configured": false, 00:27:30.865 "data_offset": 0, 00:27:30.865 "data_size": 65536 00:27:30.865 }, 00:27:30.865 { 00:27:30.865 "name": "BaseBdev3", 00:27:30.865 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:30.865 "is_configured": true, 00:27:30.865 "data_offset": 0, 00:27:30.865 "data_size": 65536 00:27:30.865 } 00:27:30.865 ] 00:27:30.865 }' 00:27:30.865 12:29:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:30.865 12:29:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:31.823 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.823 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:27:31.823 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:27:31.823 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:27:32.148 [2024-06-07 12:29:55.596625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.148 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:32.453 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.453 "name": "Existed_Raid", 00:27:32.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.453 "strip_size_kb": 0, 00:27:32.453 "state": "configuring", 00:27:32.453 "raid_level": "raid1", 00:27:32.453 "superblock": false, 00:27:32.453 "num_base_bdevs": 3, 00:27:32.453 "num_base_bdevs_discovered": 2, 00:27:32.453 "num_base_bdevs_operational": 3, 00:27:32.453 "base_bdevs_list": [ 00:27:32.453 { 00:27:32.453 "name": null, 00:27:32.453 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:32.453 "is_configured": false, 00:27:32.453 "data_offset": 0, 00:27:32.453 "data_size": 65536 00:27:32.453 }, 00:27:32.453 { 00:27:32.453 "name": "BaseBdev2", 00:27:32.453 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:32.453 "is_configured": true, 00:27:32.453 "data_offset": 0, 00:27:32.453 "data_size": 65536 00:27:32.453 }, 00:27:32.453 { 00:27:32.453 "name": "BaseBdev3", 00:27:32.453 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:32.453 "is_configured": true, 00:27:32.453 "data_offset": 0, 00:27:32.453 "data_size": 65536 00:27:32.453 } 00:27:32.453 ] 00:27:32.453 }' 00:27:32.453 12:29:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.453 12:29:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:33.021 12:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:27:33.021 12:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.279 12:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:27:33.279 
12:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.279 12:29:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:27:33.537 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 07d94c20-c8c6-44bf-b86a-79868217e272 00:27:33.795 [2024-06-07 12:29:57.219942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:27:33.795 [2024-06-07 12:29:57.220264] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:27:33.795 [2024-06-07 12:29:57.220319] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:27:33.795 [2024-06-07 12:29:57.220491] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:27:33.795 [2024-06-07 12:29:57.220855] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:27:33.795 [2024-06-07 12:29:57.220975] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000007880 00:27:33.795 [2024-06-07 12:29:57.221239] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:33.795 NewBaseBdev 00:27:33.795 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:27:33.795 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:27:33.795 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:33.795 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:27:33.795 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:33.795 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:33.795 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:34.053 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:27:34.312 [ 00:27:34.312 { 00:27:34.312 "name": "NewBaseBdev", 00:27:34.312 "aliases": [ 00:27:34.312 "07d94c20-c8c6-44bf-b86a-79868217e272" 00:27:34.312 ], 00:27:34.312 "product_name": "Malloc disk", 00:27:34.312 "block_size": 512, 00:27:34.312 "num_blocks": 65536, 00:27:34.312 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:34.312 "assigned_rate_limits": { 00:27:34.312 "rw_ios_per_sec": 0, 00:27:34.312 "rw_mbytes_per_sec": 0, 00:27:34.312 "r_mbytes_per_sec": 0, 00:27:34.312 "w_mbytes_per_sec": 0 00:27:34.312 }, 00:27:34.312 "claimed": true, 00:27:34.312 "claim_type": "exclusive_write", 00:27:34.312 "zoned": false, 00:27:34.312 "supported_io_types": { 00:27:34.312 "read": true, 00:27:34.312 "write": true, 00:27:34.312 "unmap": true, 00:27:34.312 "write_zeroes": true, 00:27:34.312 "flush": true, 00:27:34.312 "reset": true, 00:27:34.312 "compare": false, 00:27:34.312 "compare_and_write": false, 00:27:34.312 "abort": true, 00:27:34.312 "nvme_admin": false, 00:27:34.312 "nvme_io": false 00:27:34.312 }, 
00:27:34.312 "memory_domains": [ 00:27:34.312 { 00:27:34.312 "dma_device_id": "system", 00:27:34.312 "dma_device_type": 1 00:27:34.312 }, 00:27:34.312 { 00:27:34.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:34.312 "dma_device_type": 2 00:27:34.312 } 00:27:34.312 ], 00:27:34.312 "driver_specific": {} 00:27:34.312 } 00:27:34.312 ] 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.312 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.313 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.313 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.313 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.313 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:34.571 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.571 "name": "Existed_Raid", 00:27:34.571 "uuid": "62de45fd-e53a-41ca-bfd4-6abf2bf1bc17", 00:27:34.571 "strip_size_kb": 0, 00:27:34.571 "state": "online", 00:27:34.571 "raid_level": "raid1", 00:27:34.571 "superblock": false, 00:27:34.571 "num_base_bdevs": 3, 00:27:34.571 "num_base_bdevs_discovered": 3, 00:27:34.571 "num_base_bdevs_operational": 3, 00:27:34.571 "base_bdevs_list": [ 00:27:34.571 { 00:27:34.571 "name": "NewBaseBdev", 00:27:34.571 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:34.571 "is_configured": true, 00:27:34.571 "data_offset": 0, 00:27:34.571 "data_size": 65536 00:27:34.571 }, 00:27:34.571 { 00:27:34.571 "name": "BaseBdev2", 00:27:34.571 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:34.571 "is_configured": true, 00:27:34.571 "data_offset": 0, 00:27:34.571 "data_size": 65536 00:27:34.571 }, 00:27:34.571 { 00:27:34.571 "name": "BaseBdev3", 00:27:34.571 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:34.571 "is_configured": true, 00:27:34.571 "data_offset": 0, 00:27:34.571 "data_size": 65536 00:27:34.571 } 00:27:34.571 ] 00:27:34.571 }' 00:27:34.571 12:29:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.571 12:29:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:35.137 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:35.395 [2024-06-07 12:29:58.840426] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:35.395 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:35.395 "name": "Existed_Raid", 00:27:35.395 "aliases": [ 00:27:35.395 "62de45fd-e53a-41ca-bfd4-6abf2bf1bc17" 00:27:35.395 ], 00:27:35.395 "product_name": "Raid Volume", 00:27:35.395 "block_size": 512, 00:27:35.395 "num_blocks": 65536, 00:27:35.395 "uuid": "62de45fd-e53a-41ca-bfd4-6abf2bf1bc17", 00:27:35.395 "assigned_rate_limits": { 00:27:35.395 "rw_ios_per_sec": 0, 00:27:35.395 "rw_mbytes_per_sec": 0, 00:27:35.395 "r_mbytes_per_sec": 0, 00:27:35.395 "w_mbytes_per_sec": 0 00:27:35.395 }, 00:27:35.395 "claimed": false, 00:27:35.395 "zoned": false, 00:27:35.395 "supported_io_types": { 00:27:35.395 "read": true, 00:27:35.395 "write": true, 00:27:35.395 "unmap": false, 00:27:35.395 "write_zeroes": true, 00:27:35.395 "flush": false, 00:27:35.395 "reset": true, 00:27:35.395 "compare": false, 00:27:35.395 "compare_and_write": false, 00:27:35.395 "abort": false, 00:27:35.395 "nvme_admin": false, 00:27:35.395 "nvme_io": false 00:27:35.395 }, 00:27:35.395 "memory_domains": [ 00:27:35.395 { 00:27:35.395 "dma_device_id": "system", 00:27:35.395 "dma_device_type": 1 00:27:35.395 }, 00:27:35.395 { 00:27:35.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.395 "dma_device_type": 2 00:27:35.395 }, 00:27:35.395 { 00:27:35.395 "dma_device_id": "system", 00:27:35.395 "dma_device_type": 1 00:27:35.395 }, 00:27:35.395 { 00:27:35.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.395 "dma_device_type": 2 00:27:35.395 }, 00:27:35.395 { 00:27:35.395 "dma_device_id": "system", 00:27:35.395 "dma_device_type": 1 00:27:35.395 }, 00:27:35.395 { 00:27:35.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.395 "dma_device_type": 2 00:27:35.395 } 00:27:35.395 ], 00:27:35.395 "driver_specific": { 00:27:35.395 "raid": { 00:27:35.395 "uuid": "62de45fd-e53a-41ca-bfd4-6abf2bf1bc17", 00:27:35.395 "strip_size_kb": 0, 00:27:35.395 "state": "online", 00:27:35.395 "raid_level": "raid1", 00:27:35.395 "superblock": false, 00:27:35.395 "num_base_bdevs": 3, 00:27:35.395 "num_base_bdevs_discovered": 3, 00:27:35.395 "num_base_bdevs_operational": 3, 00:27:35.395 "base_bdevs_list": [ 00:27:35.395 { 00:27:35.395 "name": "NewBaseBdev", 00:27:35.395 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:35.395 "is_configured": true, 00:27:35.395 "data_offset": 0, 00:27:35.395 "data_size": 65536 00:27:35.395 }, 00:27:35.395 { 00:27:35.395 "name": "BaseBdev2", 00:27:35.395 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:35.395 "is_configured": true, 00:27:35.395 "data_offset": 0, 00:27:35.395 "data_size": 65536 00:27:35.395 }, 00:27:35.395 { 00:27:35.395 "name": "BaseBdev3", 
00:27:35.395 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:35.395 "is_configured": true, 00:27:35.395 "data_offset": 0, 00:27:35.395 "data_size": 65536 00:27:35.395 } 00:27:35.395 ] 00:27:35.395 } 00:27:35.395 } 00:27:35.395 }' 00:27:35.395 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:35.395 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:27:35.395 BaseBdev2 00:27:35.395 BaseBdev3' 00:27:35.395 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:35.395 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:27:35.395 12:29:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:35.654 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:35.654 "name": "NewBaseBdev", 00:27:35.654 "aliases": [ 00:27:35.654 "07d94c20-c8c6-44bf-b86a-79868217e272" 00:27:35.654 ], 00:27:35.654 "product_name": "Malloc disk", 00:27:35.654 "block_size": 512, 00:27:35.654 "num_blocks": 65536, 00:27:35.654 "uuid": "07d94c20-c8c6-44bf-b86a-79868217e272", 00:27:35.654 "assigned_rate_limits": { 00:27:35.654 "rw_ios_per_sec": 0, 00:27:35.654 "rw_mbytes_per_sec": 0, 00:27:35.654 "r_mbytes_per_sec": 0, 00:27:35.654 "w_mbytes_per_sec": 0 00:27:35.654 }, 00:27:35.654 "claimed": true, 00:27:35.654 "claim_type": "exclusive_write", 00:27:35.654 "zoned": false, 00:27:35.654 "supported_io_types": { 00:27:35.654 "read": true, 00:27:35.654 "write": true, 00:27:35.654 "unmap": true, 00:27:35.654 "write_zeroes": true, 00:27:35.654 "flush": true, 00:27:35.654 "reset": true, 00:27:35.654 "compare": false, 00:27:35.654 "compare_and_write": false, 00:27:35.654 "abort": true, 00:27:35.654 "nvme_admin": false, 00:27:35.654 "nvme_io": false 00:27:35.654 }, 00:27:35.654 "memory_domains": [ 00:27:35.654 { 00:27:35.654 "dma_device_id": "system", 00:27:35.654 "dma_device_type": 1 00:27:35.654 }, 00:27:35.654 { 00:27:35.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.654 "dma_device_type": 2 00:27:35.654 } 00:27:35.654 ], 00:27:35.654 "driver_specific": {} 00:27:35.654 }' 00:27:35.654 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:35.654 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:35.654 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:35.654 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:35.912 12:29:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:35.912 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:36.170 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:36.170 "name": "BaseBdev2", 00:27:36.170 "aliases": [ 00:27:36.170 "7b786bb8-2b25-4098-9e8a-0341adb96bbb" 00:27:36.170 ], 00:27:36.170 "product_name": "Malloc disk", 00:27:36.170 "block_size": 512, 00:27:36.170 "num_blocks": 65536, 00:27:36.170 "uuid": "7b786bb8-2b25-4098-9e8a-0341adb96bbb", 00:27:36.170 "assigned_rate_limits": { 00:27:36.170 "rw_ios_per_sec": 0, 00:27:36.170 "rw_mbytes_per_sec": 0, 00:27:36.170 "r_mbytes_per_sec": 0, 00:27:36.170 "w_mbytes_per_sec": 0 00:27:36.170 }, 00:27:36.170 "claimed": true, 00:27:36.170 "claim_type": "exclusive_write", 00:27:36.170 "zoned": false, 00:27:36.170 "supported_io_types": { 00:27:36.170 "read": true, 00:27:36.170 "write": true, 00:27:36.170 "unmap": true, 00:27:36.170 "write_zeroes": true, 00:27:36.170 "flush": true, 00:27:36.170 "reset": true, 00:27:36.170 "compare": false, 00:27:36.170 "compare_and_write": false, 00:27:36.170 "abort": true, 00:27:36.170 "nvme_admin": false, 00:27:36.170 "nvme_io": false 00:27:36.170 }, 00:27:36.170 "memory_domains": [ 00:27:36.170 { 00:27:36.170 "dma_device_id": "system", 00:27:36.170 "dma_device_type": 1 00:27:36.170 }, 00:27:36.170 { 00:27:36.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:36.170 "dma_device_type": 2 00:27:36.170 } 00:27:36.170 ], 00:27:36.170 "driver_specific": {} 00:27:36.170 }' 00:27:36.170 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.170 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.428 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:36.428 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.428 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.428 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:36.428 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.428 12:29:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.428 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:36.428 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.428 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.686 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:36.686 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:36.686 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:27:36.686 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:36.946 
12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:36.946 "name": "BaseBdev3", 00:27:36.946 "aliases": [ 00:27:36.946 "59355547-72a7-4f9f-81b7-4ce11b15fcd9" 00:27:36.946 ], 00:27:36.946 "product_name": "Malloc disk", 00:27:36.946 "block_size": 512, 00:27:36.946 "num_blocks": 65536, 00:27:36.946 "uuid": "59355547-72a7-4f9f-81b7-4ce11b15fcd9", 00:27:36.946 "assigned_rate_limits": { 00:27:36.946 "rw_ios_per_sec": 0, 00:27:36.946 "rw_mbytes_per_sec": 0, 00:27:36.946 "r_mbytes_per_sec": 0, 00:27:36.946 "w_mbytes_per_sec": 0 00:27:36.946 }, 00:27:36.946 "claimed": true, 00:27:36.946 "claim_type": "exclusive_write", 00:27:36.946 "zoned": false, 00:27:36.946 "supported_io_types": { 00:27:36.946 "read": true, 00:27:36.946 "write": true, 00:27:36.946 "unmap": true, 00:27:36.946 "write_zeroes": true, 00:27:36.946 "flush": true, 00:27:36.946 "reset": true, 00:27:36.946 "compare": false, 00:27:36.946 "compare_and_write": false, 00:27:36.946 "abort": true, 00:27:36.946 "nvme_admin": false, 00:27:36.946 "nvme_io": false 00:27:36.946 }, 00:27:36.946 "memory_domains": [ 00:27:36.946 { 00:27:36.946 "dma_device_id": "system", 00:27:36.946 "dma_device_type": 1 00:27:36.946 }, 00:27:36.946 { 00:27:36.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:36.946 "dma_device_type": 2 00:27:36.946 } 00:27:36.946 ], 00:27:36.946 "driver_specific": {} 00:27:36.946 }' 00:27:36.946 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.946 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.946 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:36.946 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.946 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.946 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:36.946 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:37.205 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:37.205 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:37.205 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:37.205 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:37.205 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:37.205 12:30:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:37.463 [2024-06-07 12:30:01.008527] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:37.463 [2024-06-07 12:30:01.008807] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:37.463 [2024-06-07 12:30:01.008995] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:37.463 [2024-06-07 12:30:01.009316] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:37.463 [2024-06-07 12:30:01.009417] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name Existed_Raid, state offline 00:27:37.463 12:30:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 208389 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 208389 ']' 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 208389 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 208389 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 208389' 00:27:37.463 killing process with pid 208389 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 208389 00:27:37.463 [2024-06-07 12:30:01.054856] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:37.463 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 208389 00:27:37.722 [2024-06-07 12:30:01.116778] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:27:37.982 00:27:37.982 real 0m30.581s 00:27:37.982 user 0m55.862s 00:27:37.982 sys 0m4.987s 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:27:37.982 ************************************ 00:27:37.982 END TEST raid_state_function_test 00:27:37.982 ************************************ 00:27:37.982 12:30:01 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:27:37.982 12:30:01 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:37.982 12:30:01 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:37.982 12:30:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:37.982 ************************************ 00:27:37.982 START TEST raid_state_function_test_sb 00:27:37.982 ************************************ 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 true 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=209374 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 209374' 00:27:37.982 Process raid pid: 209374 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 209374 /var/tmp/spdk-raid.sock 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 209374 ']' 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:37.982 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:37.982 12:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:37.982 [2024-06-07 12:30:01.598042] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:27:37.982 [2024-06-07 12:30:01.598472] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:38.240 [2024-06-07 12:30:01.745310] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:38.240 [2024-06-07 12:30:01.864545] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:38.498 [2024-06-07 12:30:01.947638] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:39.065 12:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:39.065 12:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:27:39.065 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:27:39.323 [2024-06-07 12:30:02.872555] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:39.323 [2024-06-07 12:30:02.872877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:39.323 [2024-06-07 12:30:02.872988] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:39.323 [2024-06-07 12:30:02.873053] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:39.323 [2024-06-07 12:30:02.873140] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:27:39.323 [2024-06-07 12:30:02.873237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.323 12:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:39.581 12:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:39.581 "name": "Existed_Raid", 00:27:39.581 "uuid": "9e95d995-1563-4ff5-9397-3d78968ed524", 
00:27:39.581 "strip_size_kb": 0, 00:27:39.581 "state": "configuring", 00:27:39.581 "raid_level": "raid1", 00:27:39.581 "superblock": true, 00:27:39.581 "num_base_bdevs": 3, 00:27:39.581 "num_base_bdevs_discovered": 0, 00:27:39.581 "num_base_bdevs_operational": 3, 00:27:39.581 "base_bdevs_list": [ 00:27:39.581 { 00:27:39.581 "name": "BaseBdev1", 00:27:39.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.581 "is_configured": false, 00:27:39.581 "data_offset": 0, 00:27:39.581 "data_size": 0 00:27:39.581 }, 00:27:39.581 { 00:27:39.581 "name": "BaseBdev2", 00:27:39.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.581 "is_configured": false, 00:27:39.581 "data_offset": 0, 00:27:39.581 "data_size": 0 00:27:39.581 }, 00:27:39.581 { 00:27:39.581 "name": "BaseBdev3", 00:27:39.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.581 "is_configured": false, 00:27:39.581 "data_offset": 0, 00:27:39.581 "data_size": 0 00:27:39.581 } 00:27:39.581 ] 00:27:39.581 }' 00:27:39.581 12:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:39.581 12:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:40.147 12:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:40.404 [2024-06-07 12:30:03.992617] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:40.404 [2024-06-07 12:30:03.992909] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:27:40.404 12:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:27:40.662 [2024-06-07 12:30:04.268713] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:40.662 [2024-06-07 12:30:04.269031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:40.662 [2024-06-07 12:30:04.269145] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:40.662 [2024-06-07 12:30:04.269205] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:40.662 [2024-06-07 12:30:04.269329] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:27:40.662 [2024-06-07 12:30:04.269398] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:27:40.662 12:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:27:41.228 [2024-06-07 12:30:04.592643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:41.228 BaseBdev1 00:27:41.228 12:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:41.228 12:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:27:41.228 12:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:41.228 12:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:41.228 12:30:04 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:41.228 12:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:41.228 12:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:41.487 12:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:41.745 [ 00:27:41.745 { 00:27:41.745 "name": "BaseBdev1", 00:27:41.745 "aliases": [ 00:27:41.745 "be694102-b1a4-4c83-88bc-355ebf780ce0" 00:27:41.745 ], 00:27:41.745 "product_name": "Malloc disk", 00:27:41.745 "block_size": 512, 00:27:41.745 "num_blocks": 65536, 00:27:41.745 "uuid": "be694102-b1a4-4c83-88bc-355ebf780ce0", 00:27:41.745 "assigned_rate_limits": { 00:27:41.745 "rw_ios_per_sec": 0, 00:27:41.745 "rw_mbytes_per_sec": 0, 00:27:41.745 "r_mbytes_per_sec": 0, 00:27:41.745 "w_mbytes_per_sec": 0 00:27:41.745 }, 00:27:41.745 "claimed": true, 00:27:41.745 "claim_type": "exclusive_write", 00:27:41.745 "zoned": false, 00:27:41.745 "supported_io_types": { 00:27:41.745 "read": true, 00:27:41.745 "write": true, 00:27:41.745 "unmap": true, 00:27:41.745 "write_zeroes": true, 00:27:41.745 "flush": true, 00:27:41.745 "reset": true, 00:27:41.745 "compare": false, 00:27:41.745 "compare_and_write": false, 00:27:41.745 "abort": true, 00:27:41.745 "nvme_admin": false, 00:27:41.745 "nvme_io": false 00:27:41.745 }, 00:27:41.745 "memory_domains": [ 00:27:41.745 { 00:27:41.745 "dma_device_id": "system", 00:27:41.745 "dma_device_type": 1 00:27:41.745 }, 00:27:41.745 { 00:27:41.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.745 "dma_device_type": 2 00:27:41.745 } 00:27:41.745 ], 00:27:41.745 "driver_specific": {} 00:27:41.745 } 00:27:41.745 ] 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.745 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:27:42.005 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:42.005 "name": "Existed_Raid", 00:27:42.005 "uuid": "10196d38-0da1-402b-9f00-33e100f429c7", 00:27:42.005 "strip_size_kb": 0, 00:27:42.005 "state": "configuring", 00:27:42.005 "raid_level": "raid1", 00:27:42.005 "superblock": true, 00:27:42.005 "num_base_bdevs": 3, 00:27:42.005 "num_base_bdevs_discovered": 1, 00:27:42.005 "num_base_bdevs_operational": 3, 00:27:42.005 "base_bdevs_list": [ 00:27:42.005 { 00:27:42.005 "name": "BaseBdev1", 00:27:42.005 "uuid": "be694102-b1a4-4c83-88bc-355ebf780ce0", 00:27:42.005 "is_configured": true, 00:27:42.005 "data_offset": 2048, 00:27:42.005 "data_size": 63488 00:27:42.005 }, 00:27:42.005 { 00:27:42.005 "name": "BaseBdev2", 00:27:42.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:42.005 "is_configured": false, 00:27:42.005 "data_offset": 0, 00:27:42.005 "data_size": 0 00:27:42.005 }, 00:27:42.005 { 00:27:42.005 "name": "BaseBdev3", 00:27:42.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:42.005 "is_configured": false, 00:27:42.005 "data_offset": 0, 00:27:42.005 "data_size": 0 00:27:42.005 } 00:27:42.005 ] 00:27:42.005 }' 00:27:42.005 12:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:42.005 12:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:42.590 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:42.856 [2024-06-07 12:30:06.417101] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:42.856 [2024-06-07 12:30:06.417529] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:27:42.856 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:27:43.116 [2024-06-07 12:30:06.661303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:43.116 [2024-06-07 12:30:06.663715] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:43.116 [2024-06-07 12:30:06.663967] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:43.116 [2024-06-07 12:30:06.664060] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:27:43.116 [2024-06-07 12:30:06.664129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.116 12:30:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.116 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:43.374 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.374 "name": "Existed_Raid", 00:27:43.374 "uuid": "24282c4f-a965-44c6-978f-8339de308cf6", 00:27:43.374 "strip_size_kb": 0, 00:27:43.374 "state": "configuring", 00:27:43.374 "raid_level": "raid1", 00:27:43.374 "superblock": true, 00:27:43.374 "num_base_bdevs": 3, 00:27:43.374 "num_base_bdevs_discovered": 1, 00:27:43.374 "num_base_bdevs_operational": 3, 00:27:43.374 "base_bdevs_list": [ 00:27:43.374 { 00:27:43.374 "name": "BaseBdev1", 00:27:43.374 "uuid": "be694102-b1a4-4c83-88bc-355ebf780ce0", 00:27:43.374 "is_configured": true, 00:27:43.374 "data_offset": 2048, 00:27:43.374 "data_size": 63488 00:27:43.374 }, 00:27:43.374 { 00:27:43.374 "name": "BaseBdev2", 00:27:43.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.374 "is_configured": false, 00:27:43.374 "data_offset": 0, 00:27:43.374 "data_size": 0 00:27:43.374 }, 00:27:43.374 { 00:27:43.374 "name": "BaseBdev3", 00:27:43.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.374 "is_configured": false, 00:27:43.374 "data_offset": 0, 00:27:43.374 "data_size": 0 00:27:43.374 } 00:27:43.374 ] 00:27:43.374 }' 00:27:43.374 12:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.374 12:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:43.941 12:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:27:44.198 [2024-06-07 12:30:07.785797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:44.199 BaseBdev2 00:27:44.199 12:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:44.199 12:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:27:44.199 12:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:44.199 12:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:44.199 12:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:44.199 12:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:44.199 12:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:44.764 12:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:45.072 [ 00:27:45.072 { 00:27:45.072 "name": "BaseBdev2", 00:27:45.072 "aliases": [ 00:27:45.072 "15521901-5092-4eb0-ab4e-173350223b1e" 00:27:45.072 ], 00:27:45.072 "product_name": "Malloc disk", 00:27:45.072 "block_size": 512, 00:27:45.072 "num_blocks": 65536, 00:27:45.072 "uuid": "15521901-5092-4eb0-ab4e-173350223b1e", 00:27:45.072 "assigned_rate_limits": { 00:27:45.072 "rw_ios_per_sec": 0, 00:27:45.072 "rw_mbytes_per_sec": 0, 00:27:45.072 "r_mbytes_per_sec": 0, 00:27:45.072 "w_mbytes_per_sec": 0 00:27:45.072 }, 00:27:45.072 "claimed": true, 00:27:45.072 "claim_type": "exclusive_write", 00:27:45.072 "zoned": false, 00:27:45.072 "supported_io_types": { 00:27:45.072 "read": true, 00:27:45.072 "write": true, 00:27:45.072 "unmap": true, 00:27:45.072 "write_zeroes": true, 00:27:45.072 "flush": true, 00:27:45.072 "reset": true, 00:27:45.072 "compare": false, 00:27:45.072 "compare_and_write": false, 00:27:45.072 "abort": true, 00:27:45.072 "nvme_admin": false, 00:27:45.072 "nvme_io": false 00:27:45.072 }, 00:27:45.072 "memory_domains": [ 00:27:45.072 { 00:27:45.072 "dma_device_id": "system", 00:27:45.072 "dma_device_type": 1 00:27:45.072 }, 00:27:45.072 { 00:27:45.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:45.072 "dma_device_type": 2 00:27:45.072 } 00:27:45.072 ], 00:27:45.072 "driver_specific": {} 00:27:45.072 } 00:27:45.072 ] 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.072 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:45.332 12:30:08 bdev_raid.raid_state_function_test_sb -- 
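Each waitforbdev in this trace amounts to two RPCs: bdev_wait_for_examine, which returns once pending examine callbacks have drained, followed by bdev_get_bdevs -b <name> -t 2000, where -t is a timeout in milliseconds during which the call waits for the bdev to finish registering. Roughly:

    rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Equivalent of the waitforbdev helper traced above: let examine
    # callbacks drain, then block up to 2000 ms for the bdev to appear.
    $rpc bdev_wait_for_examine
    $rpc bdev_get_bdevs -b BaseBdev2 -t 2000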
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.332 "name": "Existed_Raid", 00:27:45.332 "uuid": "24282c4f-a965-44c6-978f-8339de308cf6", 00:27:45.332 "strip_size_kb": 0, 00:27:45.332 "state": "configuring", 00:27:45.332 "raid_level": "raid1", 00:27:45.332 "superblock": true, 00:27:45.332 "num_base_bdevs": 3, 00:27:45.332 "num_base_bdevs_discovered": 2, 00:27:45.332 "num_base_bdevs_operational": 3, 00:27:45.332 "base_bdevs_list": [ 00:27:45.332 { 00:27:45.332 "name": "BaseBdev1", 00:27:45.332 "uuid": "be694102-b1a4-4c83-88bc-355ebf780ce0", 00:27:45.332 "is_configured": true, 00:27:45.332 "data_offset": 2048, 00:27:45.332 "data_size": 63488 00:27:45.332 }, 00:27:45.332 { 00:27:45.332 "name": "BaseBdev2", 00:27:45.332 "uuid": "15521901-5092-4eb0-ab4e-173350223b1e", 00:27:45.332 "is_configured": true, 00:27:45.332 "data_offset": 2048, 00:27:45.332 "data_size": 63488 00:27:45.332 }, 00:27:45.332 { 00:27:45.332 "name": "BaseBdev3", 00:27:45.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.332 "is_configured": false, 00:27:45.332 "data_offset": 0, 00:27:45.332 "data_size": 0 00:27:45.332 } 00:27:45.332 ] 00:27:45.332 }' 00:27:45.332 12:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.332 12:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:45.903 12:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:27:46.162 [2024-06-07 12:30:09.711593] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:46.162 [2024-06-07 12:30:09.712126] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:27:46.162 [2024-06-07 12:30:09.712266] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:46.162 [2024-06-07 12:30:09.712453] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000021f0 00:27:46.162 [2024-06-07 12:30:09.712956] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:27:46.162 [2024-06-07 12:30:09.713071] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:27:46.162 [2024-06-07 12:30:09.713307] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:46.162 BaseBdev3 00:27:46.162 12:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:27:46.162 12:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:27:46.162 12:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:46.162 12:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:46.162 12:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:46.162 12:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:46.162 12:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:46.421 12:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:27:46.680 [ 00:27:46.680 { 00:27:46.680 "name": "BaseBdev3", 00:27:46.680 "aliases": [ 00:27:46.680 "f9cca75e-c2be-448e-9f2c-e8fb8ac1523b" 00:27:46.680 ], 00:27:46.680 "product_name": "Malloc disk", 00:27:46.680 "block_size": 512, 00:27:46.680 "num_blocks": 65536, 00:27:46.680 "uuid": "f9cca75e-c2be-448e-9f2c-e8fb8ac1523b", 00:27:46.680 "assigned_rate_limits": { 00:27:46.680 "rw_ios_per_sec": 0, 00:27:46.680 "rw_mbytes_per_sec": 0, 00:27:46.680 "r_mbytes_per_sec": 0, 00:27:46.680 "w_mbytes_per_sec": 0 00:27:46.680 }, 00:27:46.680 "claimed": true, 00:27:46.680 "claim_type": "exclusive_write", 00:27:46.680 "zoned": false, 00:27:46.680 "supported_io_types": { 00:27:46.680 "read": true, 00:27:46.680 "write": true, 00:27:46.680 "unmap": true, 00:27:46.680 "write_zeroes": true, 00:27:46.680 "flush": true, 00:27:46.680 "reset": true, 00:27:46.680 "compare": false, 00:27:46.680 "compare_and_write": false, 00:27:46.680 "abort": true, 00:27:46.680 "nvme_admin": false, 00:27:46.680 "nvme_io": false 00:27:46.680 }, 00:27:46.680 "memory_domains": [ 00:27:46.680 { 00:27:46.680 "dma_device_id": "system", 00:27:46.680 "dma_device_type": 1 00:27:46.680 }, 00:27:46.680 { 00:27:46.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:46.680 "dma_device_type": 2 00:27:46.680 } 00:27:46.680 ], 00:27:46.680 "driver_specific": {} 00:27:46.680 } 00:27:46.680 ] 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.680 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:46.939 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:46.939 "name": "Existed_Raid", 00:27:46.939 "uuid": "24282c4f-a965-44c6-978f-8339de308cf6", 00:27:46.939 "strip_size_kb": 0, 00:27:46.939 "state": "online", 00:27:46.939 "raid_level": "raid1", 00:27:46.939 "superblock": true, 00:27:46.939 
"num_base_bdevs": 3, 00:27:46.939 "num_base_bdevs_discovered": 3, 00:27:46.939 "num_base_bdevs_operational": 3, 00:27:46.939 "base_bdevs_list": [ 00:27:46.939 { 00:27:46.939 "name": "BaseBdev1", 00:27:46.939 "uuid": "be694102-b1a4-4c83-88bc-355ebf780ce0", 00:27:46.939 "is_configured": true, 00:27:46.939 "data_offset": 2048, 00:27:46.939 "data_size": 63488 00:27:46.939 }, 00:27:46.939 { 00:27:46.939 "name": "BaseBdev2", 00:27:46.939 "uuid": "15521901-5092-4eb0-ab4e-173350223b1e", 00:27:46.939 "is_configured": true, 00:27:46.939 "data_offset": 2048, 00:27:46.939 "data_size": 63488 00:27:46.939 }, 00:27:46.939 { 00:27:46.939 "name": "BaseBdev3", 00:27:46.939 "uuid": "f9cca75e-c2be-448e-9f2c-e8fb8ac1523b", 00:27:46.939 "is_configured": true, 00:27:46.939 "data_offset": 2048, 00:27:46.939 "data_size": 63488 00:27:46.939 } 00:27:46.939 ] 00:27:46.939 }' 00:27:46.939 12:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:46.939 12:30:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:47.506 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:47.507 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:47.507 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:47.507 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:47.507 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:47.507 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:27:47.507 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:47.507 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:47.766 [2024-06-07 12:30:11.320065] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:47.766 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:47.766 "name": "Existed_Raid", 00:27:47.766 "aliases": [ 00:27:47.766 "24282c4f-a965-44c6-978f-8339de308cf6" 00:27:47.766 ], 00:27:47.766 "product_name": "Raid Volume", 00:27:47.766 "block_size": 512, 00:27:47.766 "num_blocks": 63488, 00:27:47.766 "uuid": "24282c4f-a965-44c6-978f-8339de308cf6", 00:27:47.766 "assigned_rate_limits": { 00:27:47.766 "rw_ios_per_sec": 0, 00:27:47.766 "rw_mbytes_per_sec": 0, 00:27:47.766 "r_mbytes_per_sec": 0, 00:27:47.766 "w_mbytes_per_sec": 0 00:27:47.766 }, 00:27:47.766 "claimed": false, 00:27:47.766 "zoned": false, 00:27:47.766 "supported_io_types": { 00:27:47.766 "read": true, 00:27:47.766 "write": true, 00:27:47.766 "unmap": false, 00:27:47.766 "write_zeroes": true, 00:27:47.766 "flush": false, 00:27:47.766 "reset": true, 00:27:47.766 "compare": false, 00:27:47.766 "compare_and_write": false, 00:27:47.766 "abort": false, 00:27:47.766 "nvme_admin": false, 00:27:47.766 "nvme_io": false 00:27:47.766 }, 00:27:47.766 "memory_domains": [ 00:27:47.766 { 00:27:47.766 "dma_device_id": "system", 00:27:47.766 "dma_device_type": 1 00:27:47.766 }, 00:27:47.766 { 00:27:47.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.766 "dma_device_type": 2 00:27:47.766 }, 00:27:47.766 { 00:27:47.766 "dma_device_id": "system", 
00:27:47.766 "dma_device_type": 1 00:27:47.766 }, 00:27:47.766 { 00:27:47.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.766 "dma_device_type": 2 00:27:47.766 }, 00:27:47.766 { 00:27:47.766 "dma_device_id": "system", 00:27:47.766 "dma_device_type": 1 00:27:47.766 }, 00:27:47.766 { 00:27:47.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.766 "dma_device_type": 2 00:27:47.766 } 00:27:47.766 ], 00:27:47.766 "driver_specific": { 00:27:47.766 "raid": { 00:27:47.766 "uuid": "24282c4f-a965-44c6-978f-8339de308cf6", 00:27:47.766 "strip_size_kb": 0, 00:27:47.766 "state": "online", 00:27:47.766 "raid_level": "raid1", 00:27:47.766 "superblock": true, 00:27:47.766 "num_base_bdevs": 3, 00:27:47.766 "num_base_bdevs_discovered": 3, 00:27:47.766 "num_base_bdevs_operational": 3, 00:27:47.766 "base_bdevs_list": [ 00:27:47.766 { 00:27:47.766 "name": "BaseBdev1", 00:27:47.766 "uuid": "be694102-b1a4-4c83-88bc-355ebf780ce0", 00:27:47.766 "is_configured": true, 00:27:47.766 "data_offset": 2048, 00:27:47.766 "data_size": 63488 00:27:47.766 }, 00:27:47.766 { 00:27:47.766 "name": "BaseBdev2", 00:27:47.766 "uuid": "15521901-5092-4eb0-ab4e-173350223b1e", 00:27:47.766 "is_configured": true, 00:27:47.766 "data_offset": 2048, 00:27:47.766 "data_size": 63488 00:27:47.766 }, 00:27:47.766 { 00:27:47.766 "name": "BaseBdev3", 00:27:47.766 "uuid": "f9cca75e-c2be-448e-9f2c-e8fb8ac1523b", 00:27:47.766 "is_configured": true, 00:27:47.766 "data_offset": 2048, 00:27:47.766 "data_size": 63488 00:27:47.766 } 00:27:47.766 ] 00:27:47.766 } 00:27:47.766 } 00:27:47.766 }' 00:27:47.766 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:47.766 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:47.766 BaseBdev2 00:27:47.766 BaseBdev3' 00:27:47.766 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:47.766 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:47.766 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:48.025 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:48.025 "name": "BaseBdev1", 00:27:48.025 "aliases": [ 00:27:48.025 "be694102-b1a4-4c83-88bc-355ebf780ce0" 00:27:48.025 ], 00:27:48.025 "product_name": "Malloc disk", 00:27:48.025 "block_size": 512, 00:27:48.025 "num_blocks": 65536, 00:27:48.025 "uuid": "be694102-b1a4-4c83-88bc-355ebf780ce0", 00:27:48.025 "assigned_rate_limits": { 00:27:48.025 "rw_ios_per_sec": 0, 00:27:48.025 "rw_mbytes_per_sec": 0, 00:27:48.025 "r_mbytes_per_sec": 0, 00:27:48.025 "w_mbytes_per_sec": 0 00:27:48.025 }, 00:27:48.025 "claimed": true, 00:27:48.025 "claim_type": "exclusive_write", 00:27:48.025 "zoned": false, 00:27:48.025 "supported_io_types": { 00:27:48.025 "read": true, 00:27:48.025 "write": true, 00:27:48.025 "unmap": true, 00:27:48.025 "write_zeroes": true, 00:27:48.025 "flush": true, 00:27:48.025 "reset": true, 00:27:48.025 "compare": false, 00:27:48.025 "compare_and_write": false, 00:27:48.025 "abort": true, 00:27:48.025 "nvme_admin": false, 00:27:48.025 "nvme_io": false 00:27:48.025 }, 00:27:48.025 "memory_domains": [ 00:27:48.025 { 00:27:48.025 "dma_device_id": "system", 00:27:48.025 "dma_device_type": 1 00:27:48.025 }, 
00:27:48.025 { 00:27:48.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:48.025 "dma_device_type": 2 00:27:48.025 } 00:27:48.025 ], 00:27:48.025 "driver_specific": {} 00:27:48.025 }' 00:27:48.025 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:48.283 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:48.540 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:48.540 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:48.540 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:48.540 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:48.541 12:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:48.836 "name": "BaseBdev2", 00:27:48.836 "aliases": [ 00:27:48.836 "15521901-5092-4eb0-ab4e-173350223b1e" 00:27:48.836 ], 00:27:48.836 "product_name": "Malloc disk", 00:27:48.836 "block_size": 512, 00:27:48.836 "num_blocks": 65536, 00:27:48.836 "uuid": "15521901-5092-4eb0-ab4e-173350223b1e", 00:27:48.836 "assigned_rate_limits": { 00:27:48.836 "rw_ios_per_sec": 0, 00:27:48.836 "rw_mbytes_per_sec": 0, 00:27:48.836 "r_mbytes_per_sec": 0, 00:27:48.836 "w_mbytes_per_sec": 0 00:27:48.836 }, 00:27:48.836 "claimed": true, 00:27:48.836 "claim_type": "exclusive_write", 00:27:48.836 "zoned": false, 00:27:48.836 "supported_io_types": { 00:27:48.836 "read": true, 00:27:48.836 "write": true, 00:27:48.836 "unmap": true, 00:27:48.836 "write_zeroes": true, 00:27:48.836 "flush": true, 00:27:48.836 "reset": true, 00:27:48.836 "compare": false, 00:27:48.836 "compare_and_write": false, 00:27:48.836 "abort": true, 00:27:48.836 "nvme_admin": false, 00:27:48.836 "nvme_io": false 00:27:48.836 }, 00:27:48.836 "memory_domains": [ 00:27:48.836 { 00:27:48.836 "dma_device_id": "system", 00:27:48.836 "dma_device_type": 1 00:27:48.836 }, 00:27:48.836 { 00:27:48.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:48.836 "dma_device_type": 2 00:27:48.836 } 00:27:48.836 ], 00:27:48.836 "driver_specific": {} 00:27:48.836 }' 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:48.836 12:30:12 
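The block of paired jq probes above (continuing below for the remaining members) is verify_raid_bdev_properties: it checks that the assembled volume presents the same layout as its members, i.e. matching block_size, md_size, md_interleave and dif_type (512/null/null/null throughout this run). Note also, in the Raid Volume dump further up, that the raid1 volume advertises narrower supported_io_types than its Malloc members: unmap, flush and abort are false on the volume but true on the disks. A compact stand-in for the per-field comparisons, under the reading that each jq pair fetches the volume's value and a member's value:

    rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Compare layout fields between the assembled volume and one member.
    for field in block_size md_size md_interleave dif_type; do
        raid_val=$($rpc bdev_get_bdevs -b Existed_Raid | jq ".[].$field")
        base_val=$($rpc bdev_get_bdevs -b BaseBdev1    | jq ".[].$field")
        [ "$raid_val" = "$base_val" ] || echo "mismatch on $field"
    done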
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:48.836 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:49.096 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:49.096 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:49.096 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:49.096 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:49.096 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:49.096 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:27:49.096 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:49.355 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:49.355 "name": "BaseBdev3", 00:27:49.355 "aliases": [ 00:27:49.355 "f9cca75e-c2be-448e-9f2c-e8fb8ac1523b" 00:27:49.355 ], 00:27:49.355 "product_name": "Malloc disk", 00:27:49.355 "block_size": 512, 00:27:49.355 "num_blocks": 65536, 00:27:49.355 "uuid": "f9cca75e-c2be-448e-9f2c-e8fb8ac1523b", 00:27:49.355 "assigned_rate_limits": { 00:27:49.355 "rw_ios_per_sec": 0, 00:27:49.355 "rw_mbytes_per_sec": 0, 00:27:49.355 "r_mbytes_per_sec": 0, 00:27:49.355 "w_mbytes_per_sec": 0 00:27:49.355 }, 00:27:49.355 "claimed": true, 00:27:49.355 "claim_type": "exclusive_write", 00:27:49.355 "zoned": false, 00:27:49.355 "supported_io_types": { 00:27:49.355 "read": true, 00:27:49.355 "write": true, 00:27:49.355 "unmap": true, 00:27:49.355 "write_zeroes": true, 00:27:49.355 "flush": true, 00:27:49.355 "reset": true, 00:27:49.355 "compare": false, 00:27:49.355 "compare_and_write": false, 00:27:49.355 "abort": true, 00:27:49.355 "nvme_admin": false, 00:27:49.355 "nvme_io": false 00:27:49.355 }, 00:27:49.355 "memory_domains": [ 00:27:49.355 { 00:27:49.355 "dma_device_id": "system", 00:27:49.355 "dma_device_type": 1 00:27:49.355 }, 00:27:49.355 { 00:27:49.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:49.355 "dma_device_type": 2 00:27:49.355 } 00:27:49.355 ], 00:27:49.355 "driver_specific": {} 00:27:49.355 }' 00:27:49.355 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:49.355 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:49.355 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:49.355 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:49.355 12:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:49.631 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:49.631 12:30:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:49.631 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:49.631 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:49.631 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:49.631 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:49.631 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:49.631 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:49.890 [2024-06-07 12:30:13.491071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.149 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:50.406 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.406 "name": "Existed_Raid", 00:27:50.406 "uuid": "24282c4f-a965-44c6-978f-8339de308cf6", 00:27:50.406 "strip_size_kb": 0, 00:27:50.406 "state": "online", 00:27:50.406 "raid_level": "raid1", 00:27:50.406 "superblock": true, 00:27:50.406 "num_base_bdevs": 3, 00:27:50.406 "num_base_bdevs_discovered": 2, 00:27:50.406 "num_base_bdevs_operational": 2, 00:27:50.406 "base_bdevs_list": [ 00:27:50.406 { 00:27:50.406 "name": null, 00:27:50.406 "uuid": "00000000-0000-0000-0000-000000000000", 
00:27:50.406 "is_configured": false, 00:27:50.406 "data_offset": 2048, 00:27:50.406 "data_size": 63488 00:27:50.406 }, 00:27:50.406 { 00:27:50.406 "name": "BaseBdev2", 00:27:50.407 "uuid": "15521901-5092-4eb0-ab4e-173350223b1e", 00:27:50.407 "is_configured": true, 00:27:50.407 "data_offset": 2048, 00:27:50.407 "data_size": 63488 00:27:50.407 }, 00:27:50.407 { 00:27:50.407 "name": "BaseBdev3", 00:27:50.407 "uuid": "f9cca75e-c2be-448e-9f2c-e8fb8ac1523b", 00:27:50.407 "is_configured": true, 00:27:50.407 "data_offset": 2048, 00:27:50.407 "data_size": 63488 00:27:50.407 } 00:27:50.407 ] 00:27:50.407 }' 00:27:50.407 12:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.407 12:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:50.973 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:50.973 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:50.973 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.973 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:51.232 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:51.232 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:51.232 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:51.490 [2024-06-07 12:30:14.922613] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:51.490 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:51.490 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:51.490 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:51.490 12:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.749 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:51.749 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:51.749 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:27:52.007 [2024-06-07 12:30:15.441336] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:27:52.007 [2024-06-07 12:30:15.441464] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:52.007 [2024-06-07 12:30:15.464725] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:52.007 [2024-06-07 12:30:15.464788] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:52.007 [2024-06-07 12:30:15.464800] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:27:52.007 12:30:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:52.007 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:52.007 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.008 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:52.271 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:52.271 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:52.271 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:27:52.271 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:27:52.271 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:27:52.271 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:27:52.563 BaseBdev2 00:27:52.563 12:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:27:52.563 12:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:27:52.563 12:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:52.563 12:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:52.563 12:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:52.563 12:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:52.563 12:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:52.822 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:52.822 [ 00:27:52.822 { 00:27:52.822 "name": "BaseBdev2", 00:27:52.822 "aliases": [ 00:27:52.822 "bba2b23b-2d5d-403e-88c9-ca8762792f0d" 00:27:52.822 ], 00:27:52.822 "product_name": "Malloc disk", 00:27:52.822 "block_size": 512, 00:27:52.822 "num_blocks": 65536, 00:27:52.822 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:27:52.822 "assigned_rate_limits": { 00:27:52.822 "rw_ios_per_sec": 0, 00:27:52.822 "rw_mbytes_per_sec": 0, 00:27:52.822 "r_mbytes_per_sec": 0, 00:27:52.822 "w_mbytes_per_sec": 0 00:27:52.822 }, 00:27:52.822 "claimed": false, 00:27:52.822 "zoned": false, 00:27:52.822 "supported_io_types": { 00:27:52.822 "read": true, 00:27:52.822 "write": true, 00:27:52.822 "unmap": true, 00:27:52.822 "write_zeroes": true, 00:27:52.822 "flush": true, 00:27:52.822 "reset": true, 00:27:52.822 "compare": false, 00:27:52.822 "compare_and_write": false, 00:27:52.822 "abort": true, 00:27:52.822 "nvme_admin": false, 00:27:52.822 "nvme_io": false 00:27:52.822 }, 00:27:52.822 "memory_domains": [ 00:27:52.822 { 00:27:52.822 "dma_device_id": "system", 00:27:52.822 "dma_device_type": 1 00:27:52.822 }, 00:27:52.822 { 00:27:52.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:52.822 "dma_device_type": 2 00:27:52.822 } 00:27:52.822 ], 
00:27:52.822 "driver_specific": {} 00:27:52.822 } 00:27:52.822 ] 00:27:52.822 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:52.822 12:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:27:52.822 12:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:27:52.822 12:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:27:53.081 BaseBdev3 00:27:53.081 12:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:27:53.081 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:27:53.081 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:53.081 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:53.081 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:53.081 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:53.081 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:53.340 12:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:27:53.600 [ 00:27:53.600 { 00:27:53.600 "name": "BaseBdev3", 00:27:53.600 "aliases": [ 00:27:53.600 "fad9c534-2cdd-407d-b46e-8a11509a7cf0" 00:27:53.600 ], 00:27:53.600 "product_name": "Malloc disk", 00:27:53.600 "block_size": 512, 00:27:53.600 "num_blocks": 65536, 00:27:53.600 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:27:53.600 "assigned_rate_limits": { 00:27:53.600 "rw_ios_per_sec": 0, 00:27:53.600 "rw_mbytes_per_sec": 0, 00:27:53.600 "r_mbytes_per_sec": 0, 00:27:53.600 "w_mbytes_per_sec": 0 00:27:53.600 }, 00:27:53.600 "claimed": false, 00:27:53.600 "zoned": false, 00:27:53.600 "supported_io_types": { 00:27:53.600 "read": true, 00:27:53.600 "write": true, 00:27:53.600 "unmap": true, 00:27:53.600 "write_zeroes": true, 00:27:53.600 "flush": true, 00:27:53.600 "reset": true, 00:27:53.600 "compare": false, 00:27:53.600 "compare_and_write": false, 00:27:53.600 "abort": true, 00:27:53.600 "nvme_admin": false, 00:27:53.600 "nvme_io": false 00:27:53.600 }, 00:27:53.600 "memory_domains": [ 00:27:53.600 { 00:27:53.600 "dma_device_id": "system", 00:27:53.600 "dma_device_type": 1 00:27:53.600 }, 00:27:53.600 { 00:27:53.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:53.600 "dma_device_type": 2 00:27:53.600 } 00:27:53.600 ], 00:27:53.600 "driver_specific": {} 00:27:53.600 } 00:27:53.600 ] 00:27:53.600 12:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:53.600 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:27:53.600 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:27:53.600 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' 
-n Existed_Raid 00:27:53.858 [2024-06-07 12:30:17.325658] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:53.858 [2024-06-07 12:30:17.325802] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:53.858 [2024-06-07 12:30:17.325837] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:53.858 [2024-06-07 12:30:17.327834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:53.858 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:53.858 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:53.858 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:53.858 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:53.858 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:53.859 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:53.859 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:53.859 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:53.859 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:53.859 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:53.859 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.859 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:54.118 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.118 "name": "Existed_Raid", 00:27:54.118 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:27:54.118 "strip_size_kb": 0, 00:27:54.118 "state": "configuring", 00:27:54.118 "raid_level": "raid1", 00:27:54.118 "superblock": true, 00:27:54.118 "num_base_bdevs": 3, 00:27:54.118 "num_base_bdevs_discovered": 2, 00:27:54.118 "num_base_bdevs_operational": 3, 00:27:54.118 "base_bdevs_list": [ 00:27:54.118 { 00:27:54.118 "name": "BaseBdev1", 00:27:54.118 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.118 "is_configured": false, 00:27:54.118 "data_offset": 0, 00:27:54.118 "data_size": 0 00:27:54.118 }, 00:27:54.118 { 00:27:54.118 "name": "BaseBdev2", 00:27:54.118 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:27:54.118 "is_configured": true, 00:27:54.118 "data_offset": 2048, 00:27:54.118 "data_size": 63488 00:27:54.118 }, 00:27:54.118 { 00:27:54.118 "name": "BaseBdev3", 00:27:54.118 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:27:54.118 "is_configured": true, 00:27:54.118 "data_offset": 2048, 00:27:54.118 "data_size": 63488 00:27:54.118 } 00:27:54.118 ] 00:27:54.118 }' 00:27:54.118 12:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.118 12:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:54.685 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
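At this point the array has been re-created while BaseBdev2 and BaseBdev3 already existed, so it comes up configuring with 2 of 3 members discovered and only BaseBdev1 outstanding. The step whose trace continues below detaches a member explicitly with bdev_raid_remove_base_bdev, rather than deleting the underlying malloc bdev, and then checks that the member's slot in base_bdevs_list reports is_configured: false. Reduced to RPCs:

    rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Detach a member from the still-configuring array, then inspect its
    # slot: is_configured drops back to false and discovery falls to 1.
    $rpc bdev_raid_remove_base_bdev BaseBdev2
    $rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'
    # -> false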
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:54.943 [2024-06-07 12:30:18.333796] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.943 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:55.202 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:55.202 "name": "Existed_Raid", 00:27:55.202 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:27:55.202 "strip_size_kb": 0, 00:27:55.202 "state": "configuring", 00:27:55.202 "raid_level": "raid1", 00:27:55.202 "superblock": true, 00:27:55.202 "num_base_bdevs": 3, 00:27:55.202 "num_base_bdevs_discovered": 1, 00:27:55.202 "num_base_bdevs_operational": 3, 00:27:55.202 "base_bdevs_list": [ 00:27:55.202 { 00:27:55.202 "name": "BaseBdev1", 00:27:55.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:55.202 "is_configured": false, 00:27:55.202 "data_offset": 0, 00:27:55.202 "data_size": 0 00:27:55.202 }, 00:27:55.202 { 00:27:55.202 "name": null, 00:27:55.202 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:27:55.202 "is_configured": false, 00:27:55.202 "data_offset": 2048, 00:27:55.202 "data_size": 63488 00:27:55.202 }, 00:27:55.202 { 00:27:55.202 "name": "BaseBdev3", 00:27:55.202 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:27:55.202 "is_configured": true, 00:27:55.202 "data_offset": 2048, 00:27:55.202 "data_size": 63488 00:27:55.202 } 00:27:55.202 ] 00:27:55.202 }' 00:27:55.202 12:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:55.202 12:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:55.769 12:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.769 12:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:27:55.769 12:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false 
== \f\a\l\s\e ]] 00:27:55.769 12:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:27:56.028 [2024-06-07 12:30:19.643817] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:56.028 BaseBdev1 00:27:56.028 12:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:27:56.028 12:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:27:56.028 12:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:56.028 12:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:27:56.028 12:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:56.028 12:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:56.028 12:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:56.286 12:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:56.545 [ 00:27:56.545 { 00:27:56.545 "name": "BaseBdev1", 00:27:56.545 "aliases": [ 00:27:56.545 "cc58423f-0df6-4daa-bfc0-bd1d7564e468" 00:27:56.545 ], 00:27:56.545 "product_name": "Malloc disk", 00:27:56.545 "block_size": 512, 00:27:56.545 "num_blocks": 65536, 00:27:56.545 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:27:56.545 "assigned_rate_limits": { 00:27:56.545 "rw_ios_per_sec": 0, 00:27:56.545 "rw_mbytes_per_sec": 0, 00:27:56.545 "r_mbytes_per_sec": 0, 00:27:56.545 "w_mbytes_per_sec": 0 00:27:56.545 }, 00:27:56.545 "claimed": true, 00:27:56.545 "claim_type": "exclusive_write", 00:27:56.545 "zoned": false, 00:27:56.545 "supported_io_types": { 00:27:56.545 "read": true, 00:27:56.545 "write": true, 00:27:56.545 "unmap": true, 00:27:56.545 "write_zeroes": true, 00:27:56.545 "flush": true, 00:27:56.545 "reset": true, 00:27:56.545 "compare": false, 00:27:56.545 "compare_and_write": false, 00:27:56.545 "abort": true, 00:27:56.545 "nvme_admin": false, 00:27:56.545 "nvme_io": false 00:27:56.545 }, 00:27:56.545 "memory_domains": [ 00:27:56.545 { 00:27:56.545 "dma_device_id": "system", 00:27:56.545 "dma_device_type": 1 00:27:56.545 }, 00:27:56.545 { 00:27:56.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:56.545 "dma_device_type": 2 00:27:56.545 } 00:27:56.545 ], 00:27:56.545 "driver_specific": {} 00:27:56.545 } 00:27:56.545 ] 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:56.545 12:30:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.545 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:56.804 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:56.804 "name": "Existed_Raid", 00:27:56.804 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:27:56.804 "strip_size_kb": 0, 00:27:56.804 "state": "configuring", 00:27:56.804 "raid_level": "raid1", 00:27:56.804 "superblock": true, 00:27:56.804 "num_base_bdevs": 3, 00:27:56.804 "num_base_bdevs_discovered": 2, 00:27:56.804 "num_base_bdevs_operational": 3, 00:27:56.804 "base_bdevs_list": [ 00:27:56.804 { 00:27:56.804 "name": "BaseBdev1", 00:27:56.804 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:27:56.804 "is_configured": true, 00:27:56.804 "data_offset": 2048, 00:27:56.804 "data_size": 63488 00:27:56.804 }, 00:27:56.804 { 00:27:56.804 "name": null, 00:27:56.804 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:27:56.804 "is_configured": false, 00:27:56.804 "data_offset": 2048, 00:27:56.804 "data_size": 63488 00:27:56.804 }, 00:27:56.804 { 00:27:56.804 "name": "BaseBdev3", 00:27:56.804 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:27:56.804 "is_configured": true, 00:27:56.804 "data_offset": 2048, 00:27:56.804 "data_size": 63488 00:27:56.804 } 00:27:56.804 ] 00:27:56.804 }' 00:27:56.804 12:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:56.804 12:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:57.437 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.437 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:27:58.004 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:27:58.004 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:27:58.263 [2024-06-07 12:30:21.732163] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.263 12:30:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:58.263 12:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.522 12:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:58.522 "name": "Existed_Raid", 00:27:58.522 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:27:58.522 "strip_size_kb": 0, 00:27:58.522 "state": "configuring", 00:27:58.522 "raid_level": "raid1", 00:27:58.522 "superblock": true, 00:27:58.522 "num_base_bdevs": 3, 00:27:58.522 "num_base_bdevs_discovered": 1, 00:27:58.522 "num_base_bdevs_operational": 3, 00:27:58.522 "base_bdevs_list": [ 00:27:58.522 { 00:27:58.522 "name": "BaseBdev1", 00:27:58.522 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:27:58.522 "is_configured": true, 00:27:58.522 "data_offset": 2048, 00:27:58.522 "data_size": 63488 00:27:58.522 }, 00:27:58.522 { 00:27:58.522 "name": null, 00:27:58.522 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:27:58.522 "is_configured": false, 00:27:58.522 "data_offset": 2048, 00:27:58.522 "data_size": 63488 00:27:58.522 }, 00:27:58.522 { 00:27:58.522 "name": null, 00:27:58.522 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:27:58.522 "is_configured": false, 00:27:58.522 "data_offset": 2048, 00:27:58.522 "data_size": 63488 00:27:58.522 } 00:27:58.522 ] 00:27:58.522 }' 00:27:58.522 12:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:58.522 12:30:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:59.460 12:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.460 12:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:27:59.460 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:27:59.461 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:27:59.719 [2024-06-07 12:30:23.348521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.023 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:00.282 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:00.282 "name": "Existed_Raid", 00:28:00.282 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:28:00.282 "strip_size_kb": 0, 00:28:00.282 "state": "configuring", 00:28:00.282 "raid_level": "raid1", 00:28:00.282 "superblock": true, 00:28:00.282 "num_base_bdevs": 3, 00:28:00.282 "num_base_bdevs_discovered": 2, 00:28:00.282 "num_base_bdevs_operational": 3, 00:28:00.282 "base_bdevs_list": [ 00:28:00.282 { 00:28:00.282 "name": "BaseBdev1", 00:28:00.282 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:28:00.282 "is_configured": true, 00:28:00.282 "data_offset": 2048, 00:28:00.282 "data_size": 63488 00:28:00.282 }, 00:28:00.282 { 00:28:00.282 "name": null, 00:28:00.282 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:28:00.282 "is_configured": false, 00:28:00.282 "data_offset": 2048, 00:28:00.282 "data_size": 63488 00:28:00.282 }, 00:28:00.282 { 00:28:00.282 "name": "BaseBdev3", 00:28:00.282 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:28:00.282 "is_configured": true, 00:28:00.282 "data_offset": 2048, 00:28:00.282 "data_size": 63488 00:28:00.282 } 00:28:00.282 ] 00:28:00.282 }' 00:28:00.282 12:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:00.282 12:30:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:00.854 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.854 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:28:01.113 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:28:01.113 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:01.430 [2024-06-07 12:30:24.904729] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:01.430 12:30:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.430 12:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:01.747 12:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:01.747 "name": "Existed_Raid", 00:28:01.747 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:28:01.747 "strip_size_kb": 0, 00:28:01.747 "state": "configuring", 00:28:01.747 "raid_level": "raid1", 00:28:01.747 "superblock": true, 00:28:01.747 "num_base_bdevs": 3, 00:28:01.747 "num_base_bdevs_discovered": 1, 00:28:01.747 "num_base_bdevs_operational": 3, 00:28:01.747 "base_bdevs_list": [ 00:28:01.747 { 00:28:01.747 "name": null, 00:28:01.747 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:28:01.747 "is_configured": false, 00:28:01.747 "data_offset": 2048, 00:28:01.747 "data_size": 63488 00:28:01.747 }, 00:28:01.747 { 00:28:01.747 "name": null, 00:28:01.747 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:28:01.747 "is_configured": false, 00:28:01.747 "data_offset": 2048, 00:28:01.747 "data_size": 63488 00:28:01.747 }, 00:28:01.747 { 00:28:01.747 "name": "BaseBdev3", 00:28:01.747 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:28:01.747 "is_configured": true, 00:28:01.747 "data_offset": 2048, 00:28:01.747 "data_size": 63488 00:28:01.747 } 00:28:01.747 ] 00:28:01.747 }' 00:28:01.747 12:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:01.747 12:30:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:02.392 12:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.392 12:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:28:02.650 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:28:02.650 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:28:02.936 [2024-06-07 12:30:26.506025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 
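(annotation, not part of the captured log) The passage above repeats the test's verify_raid_bdev_state pattern: after each base-bdev removal or re-add, the raid bdev must still report the expected state ("configuring" here) and the expected discovered/operational counters. A minimal standalone sketch of that check, using the rpc.py path and socket exactly as logged; the helper name and the exit handling are ours:

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    verify_state() {
        # expected state as the first argument, e.g. "configuring" or "online"
        local expected=$1
        local info
        # same query the test issues: dump all raid bdevs, pick Existed_Raid
        info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "Existed_Raid")')
        [[ $(jq -r '.state' <<<"$info") == "$expected" ]] || return 1
    }

    verify_state configuring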
00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.936 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.937 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:03.197 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.197 "name": "Existed_Raid", 00:28:03.197 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:28:03.197 "strip_size_kb": 0, 00:28:03.197 "state": "configuring", 00:28:03.197 "raid_level": "raid1", 00:28:03.197 "superblock": true, 00:28:03.197 "num_base_bdevs": 3, 00:28:03.197 "num_base_bdevs_discovered": 2, 00:28:03.197 "num_base_bdevs_operational": 3, 00:28:03.197 "base_bdevs_list": [ 00:28:03.197 { 00:28:03.197 "name": null, 00:28:03.197 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:28:03.197 "is_configured": false, 00:28:03.197 "data_offset": 2048, 00:28:03.197 "data_size": 63488 00:28:03.197 }, 00:28:03.197 { 00:28:03.197 "name": "BaseBdev2", 00:28:03.197 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:28:03.197 "is_configured": true, 00:28:03.197 "data_offset": 2048, 00:28:03.197 "data_size": 63488 00:28:03.197 }, 00:28:03.197 { 00:28:03.197 "name": "BaseBdev3", 00:28:03.197 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:28:03.197 "is_configured": true, 00:28:03.197 "data_offset": 2048, 00:28:03.197 "data_size": 63488 00:28:03.197 } 00:28:03.197 ] 00:28:03.197 }' 00:28:03.197 12:30:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.197 12:30:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:04.146 12:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.146 12:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:28:04.146 12:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:28:04.146 12:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.146 12:30:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:28:04.712 12:30:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cc58423f-0df6-4daa-bfc0-bd1d7564e468 00:28:04.971 [2024-06-07 12:30:28.448456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:28:04.971 [2024-06-07 12:30:28.448635] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:28:04.971 [2024-06-07 12:30:28.448648] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:04.971 [2024-06-07 12:30:28.448717] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:28:04.971 [2024-06-07 12:30:28.448973] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:28:04.971 [2024-06-07 12:30:28.448987] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000007880 00:28:04.971 [2024-06-07 12:30:28.449059] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:04.971 NewBaseBdev 00:28:04.971 12:30:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:28:04.971 12:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:28:04.971 12:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:04.971 12:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:28:04.971 12:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:04.971 12:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:04.971 12:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:05.229 12:30:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:28:05.487 [ 00:28:05.487 { 00:28:05.487 "name": "NewBaseBdev", 00:28:05.487 "aliases": [ 00:28:05.487 "cc58423f-0df6-4daa-bfc0-bd1d7564e468" 00:28:05.487 ], 00:28:05.487 "product_name": "Malloc disk", 00:28:05.487 "block_size": 512, 00:28:05.487 "num_blocks": 65536, 00:28:05.487 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:28:05.487 "assigned_rate_limits": { 00:28:05.487 "rw_ios_per_sec": 0, 00:28:05.487 "rw_mbytes_per_sec": 0, 00:28:05.487 "r_mbytes_per_sec": 0, 00:28:05.487 "w_mbytes_per_sec": 0 00:28:05.487 }, 00:28:05.487 "claimed": true, 00:28:05.487 "claim_type": "exclusive_write", 00:28:05.487 "zoned": false, 00:28:05.487 "supported_io_types": { 00:28:05.487 "read": true, 00:28:05.487 "write": true, 00:28:05.487 "unmap": true, 00:28:05.487 "write_zeroes": true, 00:28:05.487 "flush": true, 00:28:05.487 "reset": true, 00:28:05.487 "compare": false, 00:28:05.487 "compare_and_write": false, 00:28:05.487 "abort": true, 00:28:05.487 "nvme_admin": false, 00:28:05.487 "nvme_io": false 00:28:05.487 }, 00:28:05.487 "memory_domains": [ 00:28:05.487 { 00:28:05.487 "dma_device_id": "system", 00:28:05.487 "dma_device_type": 1 00:28:05.487 }, 00:28:05.487 { 00:28:05.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:05.487 "dma_device_type": 2 00:28:05.487 } 00:28:05.487 ], 00:28:05.487 
"driver_specific": {} 00:28:05.487 } 00:28:05.487 ] 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.487 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:05.762 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:05.762 "name": "Existed_Raid", 00:28:05.762 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:28:05.762 "strip_size_kb": 0, 00:28:05.762 "state": "online", 00:28:05.762 "raid_level": "raid1", 00:28:05.762 "superblock": true, 00:28:05.762 "num_base_bdevs": 3, 00:28:05.762 "num_base_bdevs_discovered": 3, 00:28:05.762 "num_base_bdevs_operational": 3, 00:28:05.762 "base_bdevs_list": [ 00:28:05.762 { 00:28:05.762 "name": "NewBaseBdev", 00:28:05.762 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:28:05.762 "is_configured": true, 00:28:05.762 "data_offset": 2048, 00:28:05.762 "data_size": 63488 00:28:05.762 }, 00:28:05.762 { 00:28:05.762 "name": "BaseBdev2", 00:28:05.762 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:28:05.762 "is_configured": true, 00:28:05.762 "data_offset": 2048, 00:28:05.762 "data_size": 63488 00:28:05.762 }, 00:28:05.762 { 00:28:05.762 "name": "BaseBdev3", 00:28:05.762 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:28:05.762 "is_configured": true, 00:28:05.762 "data_offset": 2048, 00:28:05.762 "data_size": 63488 00:28:05.762 } 00:28:05.762 ] 00:28:05.762 }' 00:28:05.762 12:30:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:05.762 12:30:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:06.695 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:28:06.695 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:06.695 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:06.695 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local 
base_bdev_info 00:28:06.695 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:06.695 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:28:06.696 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:06.696 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:06.696 [2024-06-07 12:30:30.293079] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:06.696 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:06.696 "name": "Existed_Raid", 00:28:06.696 "aliases": [ 00:28:06.696 "9553314d-c607-4fc3-a107-d83cbc019b7f" 00:28:06.696 ], 00:28:06.696 "product_name": "Raid Volume", 00:28:06.696 "block_size": 512, 00:28:06.696 "num_blocks": 63488, 00:28:06.696 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:28:06.696 "assigned_rate_limits": { 00:28:06.696 "rw_ios_per_sec": 0, 00:28:06.696 "rw_mbytes_per_sec": 0, 00:28:06.696 "r_mbytes_per_sec": 0, 00:28:06.696 "w_mbytes_per_sec": 0 00:28:06.696 }, 00:28:06.696 "claimed": false, 00:28:06.696 "zoned": false, 00:28:06.696 "supported_io_types": { 00:28:06.696 "read": true, 00:28:06.696 "write": true, 00:28:06.696 "unmap": false, 00:28:06.696 "write_zeroes": true, 00:28:06.696 "flush": false, 00:28:06.696 "reset": true, 00:28:06.696 "compare": false, 00:28:06.696 "compare_and_write": false, 00:28:06.696 "abort": false, 00:28:06.696 "nvme_admin": false, 00:28:06.696 "nvme_io": false 00:28:06.696 }, 00:28:06.696 "memory_domains": [ 00:28:06.696 { 00:28:06.696 "dma_device_id": "system", 00:28:06.696 "dma_device_type": 1 00:28:06.696 }, 00:28:06.696 { 00:28:06.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:06.696 "dma_device_type": 2 00:28:06.696 }, 00:28:06.696 { 00:28:06.696 "dma_device_id": "system", 00:28:06.696 "dma_device_type": 1 00:28:06.696 }, 00:28:06.696 { 00:28:06.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:06.696 "dma_device_type": 2 00:28:06.696 }, 00:28:06.696 { 00:28:06.696 "dma_device_id": "system", 00:28:06.696 "dma_device_type": 1 00:28:06.696 }, 00:28:06.696 { 00:28:06.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:06.696 "dma_device_type": 2 00:28:06.696 } 00:28:06.696 ], 00:28:06.696 "driver_specific": { 00:28:06.696 "raid": { 00:28:06.696 "uuid": "9553314d-c607-4fc3-a107-d83cbc019b7f", 00:28:06.696 "strip_size_kb": 0, 00:28:06.696 "state": "online", 00:28:06.696 "raid_level": "raid1", 00:28:06.696 "superblock": true, 00:28:06.696 "num_base_bdevs": 3, 00:28:06.696 "num_base_bdevs_discovered": 3, 00:28:06.696 "num_base_bdevs_operational": 3, 00:28:06.696 "base_bdevs_list": [ 00:28:06.696 { 00:28:06.696 "name": "NewBaseBdev", 00:28:06.696 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:28:06.696 "is_configured": true, 00:28:06.696 "data_offset": 2048, 00:28:06.696 "data_size": 63488 00:28:06.696 }, 00:28:06.696 { 00:28:06.696 "name": "BaseBdev2", 00:28:06.696 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:28:06.696 "is_configured": true, 00:28:06.696 "data_offset": 2048, 00:28:06.696 "data_size": 63488 00:28:06.696 }, 00:28:06.696 { 00:28:06.696 "name": "BaseBdev3", 00:28:06.696 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:28:06.696 "is_configured": true, 00:28:06.696 "data_offset": 2048, 00:28:06.696 "data_size": 63488 00:28:06.696 } 00:28:06.696 ] 00:28:06.696 } 
00:28:06.696 } 00:28:06.696 }' 00:28:06.696 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:06.954 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:28:06.954 BaseBdev2 00:28:06.954 BaseBdev3' 00:28:06.954 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:06.954 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:28:06.954 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:07.212 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:07.212 "name": "NewBaseBdev", 00:28:07.212 "aliases": [ 00:28:07.212 "cc58423f-0df6-4daa-bfc0-bd1d7564e468" 00:28:07.212 ], 00:28:07.212 "product_name": "Malloc disk", 00:28:07.212 "block_size": 512, 00:28:07.212 "num_blocks": 65536, 00:28:07.212 "uuid": "cc58423f-0df6-4daa-bfc0-bd1d7564e468", 00:28:07.212 "assigned_rate_limits": { 00:28:07.212 "rw_ios_per_sec": 0, 00:28:07.212 "rw_mbytes_per_sec": 0, 00:28:07.212 "r_mbytes_per_sec": 0, 00:28:07.212 "w_mbytes_per_sec": 0 00:28:07.212 }, 00:28:07.212 "claimed": true, 00:28:07.212 "claim_type": "exclusive_write", 00:28:07.212 "zoned": false, 00:28:07.212 "supported_io_types": { 00:28:07.212 "read": true, 00:28:07.212 "write": true, 00:28:07.212 "unmap": true, 00:28:07.212 "write_zeroes": true, 00:28:07.212 "flush": true, 00:28:07.212 "reset": true, 00:28:07.212 "compare": false, 00:28:07.212 "compare_and_write": false, 00:28:07.212 "abort": true, 00:28:07.212 "nvme_admin": false, 00:28:07.212 "nvme_io": false 00:28:07.212 }, 00:28:07.212 "memory_domains": [ 00:28:07.212 { 00:28:07.212 "dma_device_id": "system", 00:28:07.212 "dma_device_type": 1 00:28:07.212 }, 00:28:07.212 { 00:28:07.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:07.212 "dma_device_type": 2 00:28:07.212 } 00:28:07.212 ], 00:28:07.212 "driver_specific": {} 00:28:07.212 }' 00:28:07.212 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:07.212 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:07.470 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:07.470 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:07.470 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:07.470 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:07.470 12:30:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:07.470 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:07.470 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:07.470 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:07.470 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:07.729 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:07.729 12:30:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:07.729 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:07.729 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:07.987 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:07.987 "name": "BaseBdev2", 00:28:07.987 "aliases": [ 00:28:07.987 "bba2b23b-2d5d-403e-88c9-ca8762792f0d" 00:28:07.987 ], 00:28:07.987 "product_name": "Malloc disk", 00:28:07.987 "block_size": 512, 00:28:07.987 "num_blocks": 65536, 00:28:07.987 "uuid": "bba2b23b-2d5d-403e-88c9-ca8762792f0d", 00:28:07.987 "assigned_rate_limits": { 00:28:07.987 "rw_ios_per_sec": 0, 00:28:07.987 "rw_mbytes_per_sec": 0, 00:28:07.987 "r_mbytes_per_sec": 0, 00:28:07.987 "w_mbytes_per_sec": 0 00:28:07.987 }, 00:28:07.987 "claimed": true, 00:28:07.987 "claim_type": "exclusive_write", 00:28:07.987 "zoned": false, 00:28:07.987 "supported_io_types": { 00:28:07.987 "read": true, 00:28:07.987 "write": true, 00:28:07.987 "unmap": true, 00:28:07.987 "write_zeroes": true, 00:28:07.987 "flush": true, 00:28:07.987 "reset": true, 00:28:07.987 "compare": false, 00:28:07.987 "compare_and_write": false, 00:28:07.987 "abort": true, 00:28:07.987 "nvme_admin": false, 00:28:07.987 "nvme_io": false 00:28:07.987 }, 00:28:07.987 "memory_domains": [ 00:28:07.987 { 00:28:07.987 "dma_device_id": "system", 00:28:07.987 "dma_device_type": 1 00:28:07.987 }, 00:28:07.987 { 00:28:07.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:07.987 "dma_device_type": 2 00:28:07.987 } 00:28:07.987 ], 00:28:07.987 "driver_specific": {} 00:28:07.987 }' 00:28:07.987 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:07.987 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:07.987 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:07.987 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:07.987 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:08.245 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:28:08.246 12:30:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:28:08.812 "name": "BaseBdev3", 00:28:08.812 "aliases": [ 00:28:08.812 "fad9c534-2cdd-407d-b46e-8a11509a7cf0" 00:28:08.812 ], 00:28:08.812 "product_name": "Malloc disk", 00:28:08.812 "block_size": 512, 00:28:08.812 "num_blocks": 65536, 00:28:08.812 "uuid": "fad9c534-2cdd-407d-b46e-8a11509a7cf0", 00:28:08.812 "assigned_rate_limits": { 00:28:08.812 "rw_ios_per_sec": 0, 00:28:08.812 "rw_mbytes_per_sec": 0, 00:28:08.812 "r_mbytes_per_sec": 0, 00:28:08.812 "w_mbytes_per_sec": 0 00:28:08.812 }, 00:28:08.812 "claimed": true, 00:28:08.812 "claim_type": "exclusive_write", 00:28:08.812 "zoned": false, 00:28:08.812 "supported_io_types": { 00:28:08.812 "read": true, 00:28:08.812 "write": true, 00:28:08.812 "unmap": true, 00:28:08.812 "write_zeroes": true, 00:28:08.812 "flush": true, 00:28:08.812 "reset": true, 00:28:08.812 "compare": false, 00:28:08.812 "compare_and_write": false, 00:28:08.812 "abort": true, 00:28:08.812 "nvme_admin": false, 00:28:08.812 "nvme_io": false 00:28:08.812 }, 00:28:08.812 "memory_domains": [ 00:28:08.812 { 00:28:08.812 "dma_device_id": "system", 00:28:08.812 "dma_device_type": 1 00:28:08.812 }, 00:28:08.812 { 00:28:08.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:08.812 "dma_device_type": 2 00:28:08.812 } 00:28:08.812 ], 00:28:08.812 "driver_specific": {} 00:28:08.812 }' 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:08.812 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.070 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:09.070 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.070 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.070 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:09.070 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:09.328 [2024-06-07 12:30:32.739137] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:09.328 [2024-06-07 12:30:32.739431] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:09.328 [2024-06-07 12:30:32.739610] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:09.328 [2024-06-07 12:30:32.739904] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:09.328 [2024-06-07 12:30:32.739993] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name Existed_Raid, state offline 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # 
killprocess 209374 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 209374 ']' 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 209374 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 209374 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 209374' 00:28:09.328 killing process with pid 209374 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 209374 00:28:09.328 12:30:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 209374 00:28:09.328 [2024-06-07 12:30:32.793317] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:09.328 [2024-06-07 12:30:32.854766] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:09.589 12:30:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:28:09.589 00:28:09.589 real 0m31.674s 00:28:09.589 user 0m58.101s 00:28:09.589 sys 0m5.113s 00:28:09.589 12:30:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:09.589 12:30:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:09.589 ************************************ 00:28:09.589 END TEST raid_state_function_test_sb 00:28:09.589 ************************************ 00:28:09.849 12:30:33 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:28:09.849 12:30:33 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:28:09.849 12:30:33 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:09.849 12:30:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:09.849 ************************************ 00:28:09.849 START TEST raid_superblock_test 00:28:09.849 ************************************ 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 3 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=210371 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 210371 /var/tmp/spdk-raid.sock 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 210371 ']' 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:09.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:09.849 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:09.849 [2024-06-07 12:30:33.324920] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
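(annotation, not part of the captured log) The raid_superblock_test run starting here builds each base bdev as a malloc disk wrapped in a passthru bdev with a fixed UUID, then assembles the three into a raid1 volume carrying an on-disk superblock. A sketch of that sequence, assembled from the RPC calls visible below; the loop is ours, while the RPC names, sizes, and UUIDs are as logged:

    rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for i in 1 2 3; do
        # 32 MiB malloc disk with 512-byte blocks (65536 blocks total)
        $rpc bdev_malloc_create 32 512 -b malloc$i
        $rpc bdev_passthru_create -b malloc$i -p pt$i \
            -u 00000000-0000-0000-0000-00000000000$i
    done
    # -s writes a superblock to each base bdev; the 2048-block data_offset seen in
    # the dumps is what it reserves out of 65536 blocks, leaving data_size 63488
    $rpc bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s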
00:28:09.849 [2024-06-07 12:30:33.325595] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid210371 ] 00:28:09.849 [2024-06-07 12:30:33.475851] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:10.106 [2024-06-07 12:30:33.598095] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:10.106 [2024-06-07 12:30:33.693555] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:10.364 12:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:28:10.621 malloc1 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:10.621 [2024-06-07 12:30:34.236877] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:10.621 [2024-06-07 12:30:34.237262] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:10.621 [2024-06-07 12:30:34.237453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:28:10.621 [2024-06-07 12:30:34.237623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:10.621 [2024-06-07 12:30:34.240406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:10.621 [2024-06-07 12:30:34.240619] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:10.621 pt1 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:10.621 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:28:10.880 malloc2 00:28:10.880 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:11.196 [2024-06-07 12:30:34.764653] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:11.196 [2024-06-07 12:30:34.765015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:11.196 [2024-06-07 12:30:34.765100] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:28:11.196 [2024-06-07 12:30:34.765249] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:11.196 [2024-06-07 12:30:34.767486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:11.196 [2024-06-07 12:30:34.767680] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:11.196 pt2 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:11.196 12:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:28:11.453 malloc3 00:28:11.453 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:28:11.711 [2024-06-07 12:30:35.230030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:28:11.711 [2024-06-07 12:30:35.230436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:11.711 [2024-06-07 12:30:35.230639] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:28:11.711 [2024-06-07 12:30:35.230837] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:11.711 [2024-06-07 12:30:35.233527] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:11.711 [2024-06-07 12:30:35.233730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:28:11.711 pt3 00:28:11.711 12:30:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:11.711 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:11.711 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:28:11.969 [2024-06-07 12:30:35.550159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:11.969 [2024-06-07 12:30:35.552486] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:11.969 [2024-06-07 12:30:35.552662] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:28:11.969 [2024-06-07 12:30:35.552916] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007880 00:28:11.969 [2024-06-07 12:30:35.553028] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:11.969 [2024-06-07 12:30:35.553248] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:28:11.969 [2024-06-07 12:30:35.553715] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007880 00:28:11.969 [2024-06-07 12:30:35.553839] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007880 00:28:11.969 [2024-06-07 12:30:35.554136] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.969 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.227 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:12.227 "name": "raid_bdev1", 00:28:12.227 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:12.227 "strip_size_kb": 0, 00:28:12.227 "state": "online", 00:28:12.227 "raid_level": "raid1", 00:28:12.227 "superblock": true, 00:28:12.227 "num_base_bdevs": 3, 00:28:12.227 "num_base_bdevs_discovered": 3, 00:28:12.227 "num_base_bdevs_operational": 3, 00:28:12.227 "base_bdevs_list": [ 00:28:12.227 { 00:28:12.227 "name": "pt1", 00:28:12.228 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:12.228 
"is_configured": true, 00:28:12.228 "data_offset": 2048, 00:28:12.228 "data_size": 63488 00:28:12.228 }, 00:28:12.228 { 00:28:12.228 "name": "pt2", 00:28:12.228 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:12.228 "is_configured": true, 00:28:12.228 "data_offset": 2048, 00:28:12.228 "data_size": 63488 00:28:12.228 }, 00:28:12.228 { 00:28:12.228 "name": "pt3", 00:28:12.228 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:12.228 "is_configured": true, 00:28:12.228 "data_offset": 2048, 00:28:12.228 "data_size": 63488 00:28:12.228 } 00:28:12.228 ] 00:28:12.228 }' 00:28:12.228 12:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:12.228 12:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:12.793 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:13.051 [2024-06-07 12:30:36.590413] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:13.051 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:13.051 "name": "raid_bdev1", 00:28:13.051 "aliases": [ 00:28:13.051 "d3406d4f-b96b-44fa-8d5a-533bd1347eb0" 00:28:13.051 ], 00:28:13.051 "product_name": "Raid Volume", 00:28:13.051 "block_size": 512, 00:28:13.051 "num_blocks": 63488, 00:28:13.051 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:13.051 "assigned_rate_limits": { 00:28:13.051 "rw_ios_per_sec": 0, 00:28:13.051 "rw_mbytes_per_sec": 0, 00:28:13.051 "r_mbytes_per_sec": 0, 00:28:13.051 "w_mbytes_per_sec": 0 00:28:13.051 }, 00:28:13.051 "claimed": false, 00:28:13.051 "zoned": false, 00:28:13.051 "supported_io_types": { 00:28:13.051 "read": true, 00:28:13.051 "write": true, 00:28:13.051 "unmap": false, 00:28:13.051 "write_zeroes": true, 00:28:13.051 "flush": false, 00:28:13.051 "reset": true, 00:28:13.051 "compare": false, 00:28:13.051 "compare_and_write": false, 00:28:13.051 "abort": false, 00:28:13.051 "nvme_admin": false, 00:28:13.051 "nvme_io": false 00:28:13.051 }, 00:28:13.051 "memory_domains": [ 00:28:13.051 { 00:28:13.051 "dma_device_id": "system", 00:28:13.051 "dma_device_type": 1 00:28:13.051 }, 00:28:13.051 { 00:28:13.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.051 "dma_device_type": 2 00:28:13.051 }, 00:28:13.051 { 00:28:13.051 "dma_device_id": "system", 00:28:13.051 "dma_device_type": 1 00:28:13.051 }, 00:28:13.051 { 00:28:13.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.051 "dma_device_type": 2 00:28:13.051 }, 00:28:13.051 { 00:28:13.051 "dma_device_id": "system", 00:28:13.051 "dma_device_type": 1 00:28:13.051 }, 00:28:13.051 { 00:28:13.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.051 
"dma_device_type": 2 00:28:13.051 } 00:28:13.051 ], 00:28:13.051 "driver_specific": { 00:28:13.051 "raid": { 00:28:13.051 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:13.051 "strip_size_kb": 0, 00:28:13.051 "state": "online", 00:28:13.051 "raid_level": "raid1", 00:28:13.051 "superblock": true, 00:28:13.051 "num_base_bdevs": 3, 00:28:13.051 "num_base_bdevs_discovered": 3, 00:28:13.051 "num_base_bdevs_operational": 3, 00:28:13.051 "base_bdevs_list": [ 00:28:13.051 { 00:28:13.051 "name": "pt1", 00:28:13.051 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:13.051 "is_configured": true, 00:28:13.051 "data_offset": 2048, 00:28:13.051 "data_size": 63488 00:28:13.051 }, 00:28:13.051 { 00:28:13.051 "name": "pt2", 00:28:13.051 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:13.051 "is_configured": true, 00:28:13.051 "data_offset": 2048, 00:28:13.051 "data_size": 63488 00:28:13.051 }, 00:28:13.051 { 00:28:13.051 "name": "pt3", 00:28:13.051 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:13.051 "is_configured": true, 00:28:13.051 "data_offset": 2048, 00:28:13.051 "data_size": 63488 00:28:13.051 } 00:28:13.051 ] 00:28:13.051 } 00:28:13.051 } 00:28:13.051 }' 00:28:13.051 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:13.051 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:13.051 pt2 00:28:13.051 pt3' 00:28:13.051 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:13.051 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:13.051 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:13.309 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:13.309 "name": "pt1", 00:28:13.309 "aliases": [ 00:28:13.309 "00000000-0000-0000-0000-000000000001" 00:28:13.309 ], 00:28:13.309 "product_name": "passthru", 00:28:13.309 "block_size": 512, 00:28:13.309 "num_blocks": 65536, 00:28:13.309 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:13.309 "assigned_rate_limits": { 00:28:13.309 "rw_ios_per_sec": 0, 00:28:13.309 "rw_mbytes_per_sec": 0, 00:28:13.309 "r_mbytes_per_sec": 0, 00:28:13.310 "w_mbytes_per_sec": 0 00:28:13.310 }, 00:28:13.310 "claimed": true, 00:28:13.310 "claim_type": "exclusive_write", 00:28:13.310 "zoned": false, 00:28:13.310 "supported_io_types": { 00:28:13.310 "read": true, 00:28:13.310 "write": true, 00:28:13.310 "unmap": true, 00:28:13.310 "write_zeroes": true, 00:28:13.310 "flush": true, 00:28:13.310 "reset": true, 00:28:13.310 "compare": false, 00:28:13.310 "compare_and_write": false, 00:28:13.310 "abort": true, 00:28:13.310 "nvme_admin": false, 00:28:13.310 "nvme_io": false 00:28:13.310 }, 00:28:13.310 "memory_domains": [ 00:28:13.310 { 00:28:13.310 "dma_device_id": "system", 00:28:13.310 "dma_device_type": 1 00:28:13.310 }, 00:28:13.310 { 00:28:13.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.310 "dma_device_type": 2 00:28:13.310 } 00:28:13.310 ], 00:28:13.310 "driver_specific": { 00:28:13.310 "passthru": { 00:28:13.310 "name": "pt1", 00:28:13.310 "base_bdev_name": "malloc1" 00:28:13.310 } 00:28:13.310 } 00:28:13.310 }' 00:28:13.310 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:13.310 12:30:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:13.569 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:13.569 12:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:13.569 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:13.569 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:13.569 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:13.569 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:13.569 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:13.569 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:13.569 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:13.827 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:13.827 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:13.827 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:13.827 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:14.085 "name": "pt2", 00:28:14.085 "aliases": [ 00:28:14.085 "00000000-0000-0000-0000-000000000002" 00:28:14.085 ], 00:28:14.085 "product_name": "passthru", 00:28:14.085 "block_size": 512, 00:28:14.085 "num_blocks": 65536, 00:28:14.085 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:14.085 "assigned_rate_limits": { 00:28:14.085 "rw_ios_per_sec": 0, 00:28:14.085 "rw_mbytes_per_sec": 0, 00:28:14.085 "r_mbytes_per_sec": 0, 00:28:14.085 "w_mbytes_per_sec": 0 00:28:14.085 }, 00:28:14.085 "claimed": true, 00:28:14.085 "claim_type": "exclusive_write", 00:28:14.085 "zoned": false, 00:28:14.085 "supported_io_types": { 00:28:14.085 "read": true, 00:28:14.085 "write": true, 00:28:14.085 "unmap": true, 00:28:14.085 "write_zeroes": true, 00:28:14.085 "flush": true, 00:28:14.085 "reset": true, 00:28:14.085 "compare": false, 00:28:14.085 "compare_and_write": false, 00:28:14.085 "abort": true, 00:28:14.085 "nvme_admin": false, 00:28:14.085 "nvme_io": false 00:28:14.085 }, 00:28:14.085 "memory_domains": [ 00:28:14.085 { 00:28:14.085 "dma_device_id": "system", 00:28:14.085 "dma_device_type": 1 00:28:14.085 }, 00:28:14.085 { 00:28:14.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:14.085 "dma_device_type": 2 00:28:14.085 } 00:28:14.085 ], 00:28:14.085 "driver_specific": { 00:28:14.085 "passthru": { 00:28:14.085 "name": "pt2", 00:28:14.085 "base_bdev_name": "malloc2" 00:28:14.085 } 00:28:14.085 } 00:28:14.085 }' 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 
null == null ]] 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:14.085 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:14.342 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:14.342 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:14.342 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:14.342 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:14.342 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:14.342 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:28:14.342 12:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:14.600 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:14.600 "name": "pt3", 00:28:14.600 "aliases": [ 00:28:14.600 "00000000-0000-0000-0000-000000000003" 00:28:14.600 ], 00:28:14.600 "product_name": "passthru", 00:28:14.600 "block_size": 512, 00:28:14.600 "num_blocks": 65536, 00:28:14.600 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:14.600 "assigned_rate_limits": { 00:28:14.600 "rw_ios_per_sec": 0, 00:28:14.600 "rw_mbytes_per_sec": 0, 00:28:14.600 "r_mbytes_per_sec": 0, 00:28:14.600 "w_mbytes_per_sec": 0 00:28:14.600 }, 00:28:14.600 "claimed": true, 00:28:14.600 "claim_type": "exclusive_write", 00:28:14.600 "zoned": false, 00:28:14.600 "supported_io_types": { 00:28:14.600 "read": true, 00:28:14.600 "write": true, 00:28:14.600 "unmap": true, 00:28:14.600 "write_zeroes": true, 00:28:14.600 "flush": true, 00:28:14.600 "reset": true, 00:28:14.600 "compare": false, 00:28:14.600 "compare_and_write": false, 00:28:14.600 "abort": true, 00:28:14.600 "nvme_admin": false, 00:28:14.600 "nvme_io": false 00:28:14.600 }, 00:28:14.600 "memory_domains": [ 00:28:14.600 { 00:28:14.600 "dma_device_id": "system", 00:28:14.600 "dma_device_type": 1 00:28:14.600 }, 00:28:14.600 { 00:28:14.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:14.600 "dma_device_type": 2 00:28:14.600 } 00:28:14.600 ], 00:28:14.600 "driver_specific": { 00:28:14.600 "passthru": { 00:28:14.600 "name": "pt3", 00:28:14.600 "base_bdev_name": "malloc3" 00:28:14.600 } 00:28:14.600 } 00:28:14.600 }' 00:28:14.600 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:14.601 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:14.601 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:14.601 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:14.859 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:14.859 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:14.859 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:14.859 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:14.859 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:14.859 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:14.859 12:30:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:15.117 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:15.117 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:15.117 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:15.375 [2024-06-07 12:30:38.798751] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:15.375 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d3406d4f-b96b-44fa-8d5a-533bd1347eb0 00:28:15.375 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d3406d4f-b96b-44fa-8d5a-533bd1347eb0 ']' 00:28:15.375 12:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:15.634 [2024-06-07 12:30:39.070639] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:15.634 [2024-06-07 12:30:39.070909] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:15.634 [2024-06-07 12:30:39.071084] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:15.634 [2024-06-07 12:30:39.071174] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:15.634 [2024-06-07 12:30:39.071403] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007880 name raid_bdev1, state offline 00:28:15.634 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:15.634 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.903 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:15.903 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:15.903 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:15.903 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:16.194 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:16.194 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:16.194 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:16.194 12:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:28:16.762 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:28:16.763 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:28:17.020 [2024-06-07 12:30:40.630819] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:17.020 [2024-06-07 12:30:40.633096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:17.020 [2024-06-07 12:30:40.633298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:28:17.020 [2024-06-07 12:30:40.633371] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:17.020 [2024-06-07 12:30:40.633660] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:17.021 [2024-06-07 12:30:40.633725] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:28:17.021 [2024-06-07 12:30:40.634015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:17.021 [2024-06-07 12:30:40.634058] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state configuring 00:28:17.021 request: 00:28:17.021 { 00:28:17.021 "name": "raid_bdev1", 00:28:17.021 "raid_level": "raid1", 00:28:17.021 "base_bdevs": [ 00:28:17.021 "malloc1", 00:28:17.021 "malloc2", 00:28:17.021 "malloc3" 00:28:17.021 ], 00:28:17.021 "superblock": false, 00:28:17.021 "method": "bdev_raid_create", 00:28:17.021 "req_id": 1 00:28:17.021 } 00:28:17.021 Got JSON-RPC error response 00:28:17.021 response: 00:28:17.021 { 00:28:17.021 "code": -17, 00:28:17.021 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:17.021 } 00:28:17.021 12:30:40 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@652 -- # es=1 00:28:17.021 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:28:17.021 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:28:17.021 12:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:28:17.278 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.278 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:17.536 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:17.536 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:17.536 12:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:17.793 [2024-06-07 12:30:41.230823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:17.793 [2024-06-07 12:30:41.231150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:17.793 [2024-06-07 12:30:41.231238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:28:17.793 [2024-06-07 12:30:41.231339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:17.793 [2024-06-07 12:30:41.233685] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:17.793 [2024-06-07 12:30:41.233854] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:17.793 [2024-06-07 12:30:41.234065] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:17.793 [2024-06-07 12:30:41.234208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:17.793 pt1 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.793 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:18.051 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:28:18.051 "name": "raid_bdev1", 00:28:18.051 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:18.051 "strip_size_kb": 0, 00:28:18.051 "state": "configuring", 00:28:18.051 "raid_level": "raid1", 00:28:18.051 "superblock": true, 00:28:18.051 "num_base_bdevs": 3, 00:28:18.051 "num_base_bdevs_discovered": 1, 00:28:18.051 "num_base_bdevs_operational": 3, 00:28:18.051 "base_bdevs_list": [ 00:28:18.051 { 00:28:18.051 "name": "pt1", 00:28:18.051 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:18.051 "is_configured": true, 00:28:18.051 "data_offset": 2048, 00:28:18.051 "data_size": 63488 00:28:18.051 }, 00:28:18.051 { 00:28:18.051 "name": null, 00:28:18.051 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:18.051 "is_configured": false, 00:28:18.051 "data_offset": 2048, 00:28:18.051 "data_size": 63488 00:28:18.051 }, 00:28:18.051 { 00:28:18.051 "name": null, 00:28:18.051 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:18.051 "is_configured": false, 00:28:18.051 "data_offset": 2048, 00:28:18.051 "data_size": 63488 00:28:18.051 } 00:28:18.051 ] 00:28:18.051 }' 00:28:18.051 12:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:18.051 12:30:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:18.694 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:28:18.694 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:18.694 [2024-06-07 12:30:42.330979] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:18.694 [2024-06-07 12:30:42.331301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.694 [2024-06-07 12:30:42.331386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:28:18.694 [2024-06-07 12:30:42.331488] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.694 [2024-06-07 12:30:42.331936] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.694 [2024-06-07 12:30:42.332080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:18.694 [2024-06-07 12:30:42.332285] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:18.694 [2024-06-07 12:30:42.332401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:18.694 pt2 00:28:18.958 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:19.216 [2024-06-07 12:30:42.615037] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.216 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.475 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.475 "name": "raid_bdev1", 00:28:19.475 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:19.475 "strip_size_kb": 0, 00:28:19.475 "state": "configuring", 00:28:19.475 "raid_level": "raid1", 00:28:19.475 "superblock": true, 00:28:19.475 "num_base_bdevs": 3, 00:28:19.475 "num_base_bdevs_discovered": 1, 00:28:19.475 "num_base_bdevs_operational": 3, 00:28:19.475 "base_bdevs_list": [ 00:28:19.475 { 00:28:19.475 "name": "pt1", 00:28:19.475 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:19.475 "is_configured": true, 00:28:19.475 "data_offset": 2048, 00:28:19.475 "data_size": 63488 00:28:19.475 }, 00:28:19.475 { 00:28:19.475 "name": null, 00:28:19.475 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:19.475 "is_configured": false, 00:28:19.475 "data_offset": 2048, 00:28:19.475 "data_size": 63488 00:28:19.475 }, 00:28:19.475 { 00:28:19.475 "name": null, 00:28:19.475 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:19.475 "is_configured": false, 00:28:19.475 "data_offset": 2048, 00:28:19.475 "data_size": 63488 00:28:19.475 } 00:28:19.475 ] 00:28:19.475 }' 00:28:19.475 12:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.475 12:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:20.040 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:20.040 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:20.040 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:20.040 [2024-06-07 12:30:43.679129] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:20.040 [2024-06-07 12:30:43.679265] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:20.040 [2024-06-07 12:30:43.679298] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:28:20.041 [2024-06-07 12:30:43.679330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:20.041 [2024-06-07 12:30:43.679724] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:20.041 [2024-06-07 12:30:43.679760] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:20.041 [2024-06-07 12:30:43.679843] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:20.041 [2024-06-07 12:30:43.679880] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:28:20.041 pt2 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:28:20.299 [2024-06-07 12:30:43.919175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:28:20.299 [2024-06-07 12:30:43.919296] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:20.299 [2024-06-07 12:30:43.919335] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009380 00:28:20.299 [2024-06-07 12:30:43.919369] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:20.299 [2024-06-07 12:30:43.919731] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:20.299 [2024-06-07 12:30:43.919783] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:28:20.299 [2024-06-07 12:30:43.919871] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:28:20.299 [2024-06-07 12:30:43.919892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:28:20.299 [2024-06-07 12:30:43.919992] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008a80 00:28:20.299 [2024-06-07 12:30:43.920003] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:20.299 [2024-06-07 12:30:43.920057] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002870 00:28:20.299 [2024-06-07 12:30:43.920298] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008a80 00:28:20.299 [2024-06-07 12:30:43.920317] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008a80 00:28:20.299 [2024-06-07 12:30:43.920393] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:20.299 pt3 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.299 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.557 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.557 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.557 
12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.557 12:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.815 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.815 "name": "raid_bdev1", 00:28:20.815 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:20.815 "strip_size_kb": 0, 00:28:20.815 "state": "online", 00:28:20.815 "raid_level": "raid1", 00:28:20.815 "superblock": true, 00:28:20.815 "num_base_bdevs": 3, 00:28:20.815 "num_base_bdevs_discovered": 3, 00:28:20.815 "num_base_bdevs_operational": 3, 00:28:20.815 "base_bdevs_list": [ 00:28:20.815 { 00:28:20.815 "name": "pt1", 00:28:20.815 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:20.815 "is_configured": true, 00:28:20.815 "data_offset": 2048, 00:28:20.815 "data_size": 63488 00:28:20.815 }, 00:28:20.815 { 00:28:20.815 "name": "pt2", 00:28:20.815 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:20.815 "is_configured": true, 00:28:20.815 "data_offset": 2048, 00:28:20.815 "data_size": 63488 00:28:20.815 }, 00:28:20.815 { 00:28:20.815 "name": "pt3", 00:28:20.815 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:20.815 "is_configured": true, 00:28:20.815 "data_offset": 2048, 00:28:20.815 "data_size": 63488 00:28:20.815 } 00:28:20.815 ] 00:28:20.815 }' 00:28:20.815 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.815 12:30:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:21.383 12:30:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:21.383 [2024-06-07 12:30:44.987410] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:21.383 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:21.383 "name": "raid_bdev1", 00:28:21.383 "aliases": [ 00:28:21.383 "d3406d4f-b96b-44fa-8d5a-533bd1347eb0" 00:28:21.383 ], 00:28:21.383 "product_name": "Raid Volume", 00:28:21.383 "block_size": 512, 00:28:21.383 "num_blocks": 63488, 00:28:21.383 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:21.383 "assigned_rate_limits": { 00:28:21.383 "rw_ios_per_sec": 0, 00:28:21.383 "rw_mbytes_per_sec": 0, 00:28:21.383 "r_mbytes_per_sec": 0, 00:28:21.383 "w_mbytes_per_sec": 0 00:28:21.383 }, 00:28:21.383 "claimed": false, 00:28:21.383 "zoned": false, 00:28:21.383 "supported_io_types": { 00:28:21.383 "read": true, 00:28:21.383 "write": true, 00:28:21.383 "unmap": false, 00:28:21.383 "write_zeroes": true, 00:28:21.383 "flush": 
false, 00:28:21.383 "reset": true, 00:28:21.383 "compare": false, 00:28:21.383 "compare_and_write": false, 00:28:21.383 "abort": false, 00:28:21.383 "nvme_admin": false, 00:28:21.383 "nvme_io": false 00:28:21.383 }, 00:28:21.383 "memory_domains": [ 00:28:21.383 { 00:28:21.383 "dma_device_id": "system", 00:28:21.384 "dma_device_type": 1 00:28:21.384 }, 00:28:21.384 { 00:28:21.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:21.384 "dma_device_type": 2 00:28:21.384 }, 00:28:21.384 { 00:28:21.384 "dma_device_id": "system", 00:28:21.384 "dma_device_type": 1 00:28:21.384 }, 00:28:21.384 { 00:28:21.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:21.384 "dma_device_type": 2 00:28:21.384 }, 00:28:21.384 { 00:28:21.384 "dma_device_id": "system", 00:28:21.384 "dma_device_type": 1 00:28:21.384 }, 00:28:21.384 { 00:28:21.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:21.384 "dma_device_type": 2 00:28:21.384 } 00:28:21.384 ], 00:28:21.384 "driver_specific": { 00:28:21.384 "raid": { 00:28:21.384 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:21.384 "strip_size_kb": 0, 00:28:21.384 "state": "online", 00:28:21.384 "raid_level": "raid1", 00:28:21.384 "superblock": true, 00:28:21.384 "num_base_bdevs": 3, 00:28:21.384 "num_base_bdevs_discovered": 3, 00:28:21.384 "num_base_bdevs_operational": 3, 00:28:21.384 "base_bdevs_list": [ 00:28:21.384 { 00:28:21.384 "name": "pt1", 00:28:21.384 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:21.384 "is_configured": true, 00:28:21.384 "data_offset": 2048, 00:28:21.384 "data_size": 63488 00:28:21.384 }, 00:28:21.384 { 00:28:21.384 "name": "pt2", 00:28:21.384 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:21.384 "is_configured": true, 00:28:21.384 "data_offset": 2048, 00:28:21.384 "data_size": 63488 00:28:21.384 }, 00:28:21.384 { 00:28:21.384 "name": "pt3", 00:28:21.384 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:21.384 "is_configured": true, 00:28:21.384 "data_offset": 2048, 00:28:21.384 "data_size": 63488 00:28:21.384 } 00:28:21.384 ] 00:28:21.384 } 00:28:21.384 } 00:28:21.384 }' 00:28:21.384 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:21.643 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:21.643 pt2 00:28:21.643 pt3' 00:28:21.643 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:21.643 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:21.643 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:21.643 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:21.643 "name": "pt1", 00:28:21.643 "aliases": [ 00:28:21.643 "00000000-0000-0000-0000-000000000001" 00:28:21.643 ], 00:28:21.643 "product_name": "passthru", 00:28:21.643 "block_size": 512, 00:28:21.643 "num_blocks": 65536, 00:28:21.643 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:21.643 "assigned_rate_limits": { 00:28:21.643 "rw_ios_per_sec": 0, 00:28:21.643 "rw_mbytes_per_sec": 0, 00:28:21.643 "r_mbytes_per_sec": 0, 00:28:21.643 "w_mbytes_per_sec": 0 00:28:21.643 }, 00:28:21.643 "claimed": true, 00:28:21.643 "claim_type": "exclusive_write", 00:28:21.643 "zoned": false, 00:28:21.643 "supported_io_types": { 00:28:21.643 "read": true, 00:28:21.643 "write": true, 
00:28:21.643 "unmap": true, 00:28:21.643 "write_zeroes": true, 00:28:21.643 "flush": true, 00:28:21.643 "reset": true, 00:28:21.643 "compare": false, 00:28:21.643 "compare_and_write": false, 00:28:21.643 "abort": true, 00:28:21.643 "nvme_admin": false, 00:28:21.643 "nvme_io": false 00:28:21.643 }, 00:28:21.643 "memory_domains": [ 00:28:21.643 { 00:28:21.643 "dma_device_id": "system", 00:28:21.643 "dma_device_type": 1 00:28:21.643 }, 00:28:21.643 { 00:28:21.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:21.643 "dma_device_type": 2 00:28:21.643 } 00:28:21.643 ], 00:28:21.643 "driver_specific": { 00:28:21.643 "passthru": { 00:28:21.643 "name": "pt1", 00:28:21.643 "base_bdev_name": "malloc1" 00:28:21.643 } 00:28:21.643 } 00:28:21.643 }' 00:28:21.643 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:21.902 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.160 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:22.160 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:22.160 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:22.160 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:22.160 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:22.160 "name": "pt2", 00:28:22.160 "aliases": [ 00:28:22.160 "00000000-0000-0000-0000-000000000002" 00:28:22.160 ], 00:28:22.160 "product_name": "passthru", 00:28:22.160 "block_size": 512, 00:28:22.160 "num_blocks": 65536, 00:28:22.160 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:22.160 "assigned_rate_limits": { 00:28:22.160 "rw_ios_per_sec": 0, 00:28:22.160 "rw_mbytes_per_sec": 0, 00:28:22.160 "r_mbytes_per_sec": 0, 00:28:22.160 "w_mbytes_per_sec": 0 00:28:22.160 }, 00:28:22.160 "claimed": true, 00:28:22.160 "claim_type": "exclusive_write", 00:28:22.160 "zoned": false, 00:28:22.160 "supported_io_types": { 00:28:22.160 "read": true, 00:28:22.160 "write": true, 00:28:22.160 "unmap": true, 00:28:22.160 "write_zeroes": true, 00:28:22.160 "flush": true, 00:28:22.160 "reset": true, 00:28:22.160 "compare": false, 00:28:22.160 "compare_and_write": false, 00:28:22.160 "abort": true, 00:28:22.160 "nvme_admin": false, 00:28:22.160 "nvme_io": false 00:28:22.160 }, 00:28:22.160 "memory_domains": [ 00:28:22.160 { 00:28:22.160 "dma_device_id": "system", 00:28:22.160 "dma_device_type": 1 00:28:22.160 }, 00:28:22.160 { 
00:28:22.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.160 "dma_device_type": 2 00:28:22.160 } 00:28:22.160 ], 00:28:22.160 "driver_specific": { 00:28:22.160 "passthru": { 00:28:22.160 "name": "pt2", 00:28:22.160 "base_bdev_name": "malloc2" 00:28:22.160 } 00:28:22.160 } 00:28:22.160 }' 00:28:22.160 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.418 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.418 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:22.418 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.418 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.418 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:22.418 12:30:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.418 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.418 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:22.418 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.675 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.675 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:22.675 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:22.675 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:28:22.675 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:22.933 "name": "pt3", 00:28:22.933 "aliases": [ 00:28:22.933 "00000000-0000-0000-0000-000000000003" 00:28:22.933 ], 00:28:22.933 "product_name": "passthru", 00:28:22.933 "block_size": 512, 00:28:22.933 "num_blocks": 65536, 00:28:22.933 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:22.933 "assigned_rate_limits": { 00:28:22.933 "rw_ios_per_sec": 0, 00:28:22.933 "rw_mbytes_per_sec": 0, 00:28:22.933 "r_mbytes_per_sec": 0, 00:28:22.933 "w_mbytes_per_sec": 0 00:28:22.933 }, 00:28:22.933 "claimed": true, 00:28:22.933 "claim_type": "exclusive_write", 00:28:22.933 "zoned": false, 00:28:22.933 "supported_io_types": { 00:28:22.933 "read": true, 00:28:22.933 "write": true, 00:28:22.933 "unmap": true, 00:28:22.933 "write_zeroes": true, 00:28:22.933 "flush": true, 00:28:22.933 "reset": true, 00:28:22.933 "compare": false, 00:28:22.933 "compare_and_write": false, 00:28:22.933 "abort": true, 00:28:22.933 "nvme_admin": false, 00:28:22.933 "nvme_io": false 00:28:22.933 }, 00:28:22.933 "memory_domains": [ 00:28:22.933 { 00:28:22.933 "dma_device_id": "system", 00:28:22.933 "dma_device_type": 1 00:28:22.933 }, 00:28:22.933 { 00:28:22.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.933 "dma_device_type": 2 00:28:22.933 } 00:28:22.933 ], 00:28:22.933 "driver_specific": { 00:28:22.933 "passthru": { 00:28:22.933 "name": "pt3", 00:28:22.933 "base_bdev_name": "malloc3" 00:28:22.933 } 00:28:22.933 } 00:28:22.933 }' 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:22.933 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:23.190 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:23.190 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:23.190 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:23.190 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:23.190 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:23.190 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:23.190 12:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:23.450 [2024-06-07 12:30:47.007772] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:23.450 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d3406d4f-b96b-44fa-8d5a-533bd1347eb0 '!=' d3406d4f-b96b-44fa-8d5a-533bd1347eb0 ']' 00:28:23.450 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:23.450 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:23.450 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:28:23.450 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:23.712 [2024-06-07 12:30:47.215658] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.712 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:28:23.978 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.978 "name": "raid_bdev1", 00:28:23.978 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:23.978 "strip_size_kb": 0, 00:28:23.978 "state": "online", 00:28:23.978 "raid_level": "raid1", 00:28:23.978 "superblock": true, 00:28:23.978 "num_base_bdevs": 3, 00:28:23.978 "num_base_bdevs_discovered": 2, 00:28:23.978 "num_base_bdevs_operational": 2, 00:28:23.978 "base_bdevs_list": [ 00:28:23.978 { 00:28:23.978 "name": null, 00:28:23.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.978 "is_configured": false, 00:28:23.978 "data_offset": 2048, 00:28:23.978 "data_size": 63488 00:28:23.978 }, 00:28:23.978 { 00:28:23.978 "name": "pt2", 00:28:23.978 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:23.978 "is_configured": true, 00:28:23.978 "data_offset": 2048, 00:28:23.978 "data_size": 63488 00:28:23.978 }, 00:28:23.978 { 00:28:23.978 "name": "pt3", 00:28:23.978 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:23.978 "is_configured": true, 00:28:23.978 "data_offset": 2048, 00:28:23.978 "data_size": 63488 00:28:23.978 } 00:28:23.978 ] 00:28:23.978 }' 00:28:23.978 12:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.978 12:30:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:24.570 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:24.843 [2024-06-07 12:30:48.343742] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:24.843 [2024-06-07 12:30:48.343794] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:24.843 [2024-06-07 12:30:48.343877] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:24.843 [2024-06-07 12:30:48.343932] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:24.843 [2024-06-07 12:30:48.343943] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state offline 00:28:24.843 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.843 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:25.119 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:25.119 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:25.119 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:25.119 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:25.119 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:25.398 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:25.398 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:25.398 12:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:28:25.662 12:30:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:25.662 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:25.662 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:25.662 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:25.662 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:25.919 [2024-06-07 12:30:49.439933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:25.919 [2024-06-07 12:30:49.440059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:25.919 [2024-06-07 12:30:49.440123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009680 00:28:25.919 [2024-06-07 12:30:49.440151] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:25.919 [2024-06-07 12:30:49.442598] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:25.919 [2024-06-07 12:30:49.442666] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:25.919 [2024-06-07 12:30:49.442769] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:25.919 [2024-06-07 12:30:49.442811] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:25.919 pt2 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.919 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.177 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.177 "name": "raid_bdev1", 00:28:26.177 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:26.177 "strip_size_kb": 0, 00:28:26.177 "state": "configuring", 00:28:26.177 "raid_level": "raid1", 00:28:26.177 "superblock": true, 00:28:26.177 "num_base_bdevs": 3, 00:28:26.177 "num_base_bdevs_discovered": 1, 00:28:26.177 "num_base_bdevs_operational": 2, 00:28:26.177 "base_bdevs_list": [ 00:28:26.177 { 00:28:26.177 "name": null, 00:28:26.177 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:28:26.177 "is_configured": false, 00:28:26.177 "data_offset": 2048, 00:28:26.178 "data_size": 63488 00:28:26.178 }, 00:28:26.178 { 00:28:26.178 "name": "pt2", 00:28:26.178 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:26.178 "is_configured": true, 00:28:26.178 "data_offset": 2048, 00:28:26.178 "data_size": 63488 00:28:26.178 }, 00:28:26.178 { 00:28:26.178 "name": null, 00:28:26.178 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:26.178 "is_configured": false, 00:28:26.178 "data_offset": 2048, 00:28:26.178 "data_size": 63488 00:28:26.178 } 00:28:26.178 ] 00:28:26.178 }' 00:28:26.178 12:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.178 12:30:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:28:27.112 [2024-06-07 12:30:50.652069] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:28:27.112 [2024-06-07 12:30:50.652207] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:27.112 [2024-06-07 12:30:50.652265] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009f80 00:28:27.112 [2024-06-07 12:30:50.652295] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:27.112 [2024-06-07 12:30:50.652702] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:27.112 [2024-06-07 12:30:50.652732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:28:27.112 [2024-06-07 12:30:50.652828] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:28:27.112 [2024-06-07 12:30:50.652849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:28:27.112 [2024-06-07 12:30:50.652929] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009c80 00:28:27.112 [2024-06-07 12:30:50.652939] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:27.112 [2024-06-07 12:30:50.652995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002ae0 00:28:27.112 [2024-06-07 12:30:50.653220] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009c80 00:28:27.112 [2024-06-07 12:30:50.653260] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009c80 00:28:27.112 [2024-06-07 12:30:50.653333] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:27.112 pt3 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 
-- # local raid_level=raid1 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.112 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.370 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:27.370 "name": "raid_bdev1", 00:28:27.370 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:27.370 "strip_size_kb": 0, 00:28:27.370 "state": "online", 00:28:27.370 "raid_level": "raid1", 00:28:27.370 "superblock": true, 00:28:27.370 "num_base_bdevs": 3, 00:28:27.370 "num_base_bdevs_discovered": 2, 00:28:27.370 "num_base_bdevs_operational": 2, 00:28:27.370 "base_bdevs_list": [ 00:28:27.370 { 00:28:27.370 "name": null, 00:28:27.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:27.370 "is_configured": false, 00:28:27.370 "data_offset": 2048, 00:28:27.370 "data_size": 63488 00:28:27.370 }, 00:28:27.370 { 00:28:27.370 "name": "pt2", 00:28:27.370 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:27.370 "is_configured": true, 00:28:27.370 "data_offset": 2048, 00:28:27.370 "data_size": 63488 00:28:27.370 }, 00:28:27.370 { 00:28:27.370 "name": "pt3", 00:28:27.370 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:27.370 "is_configured": true, 00:28:27.370 "data_offset": 2048, 00:28:27.370 "data_size": 63488 00:28:27.370 } 00:28:27.370 ] 00:28:27.370 }' 00:28:27.370 12:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:27.370 12:30:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:28.303 12:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:28.303 [2024-06-07 12:30:51.812176] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:28.303 [2024-06-07 12:30:51.812238] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:28.303 [2024-06-07 12:30:51.812308] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:28.303 [2024-06-07 12:30:51.812359] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:28.303 [2024-06-07 12:30:51.812369] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009c80 name raid_bdev1, state offline 00:28:28.303 12:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:28.303 12:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.562 12:30:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:28.562 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:28.562 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:28:28.562 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:28:28.562 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:28:28.819 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:29.078 [2024-06-07 12:30:52.596251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:29.078 [2024-06-07 12:30:52.596373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:29.078 [2024-06-07 12:30:52.596421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a280 00:28:29.078 [2024-06-07 12:30:52.596448] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:29.078 [2024-06-07 12:30:52.598753] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:29.078 [2024-06-07 12:30:52.598827] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:29.078 [2024-06-07 12:30:52.598921] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:29.078 [2024-06-07 12:30:52.598952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:29.078 [2024-06-07 12:30:52.599097] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:29.078 [2024-06-07 12:30:52.599110] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:29.078 [2024-06-07 12:30:52.599136] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600000a880 name raid_bdev1, state configuring 00:28:29.078 [2024-06-07 12:30:52.599204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:29.078 pt1 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.078 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.336 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:29.336 "name": "raid_bdev1", 00:28:29.336 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:29.336 "strip_size_kb": 0, 00:28:29.336 "state": "configuring", 00:28:29.336 "raid_level": "raid1", 00:28:29.336 "superblock": true, 00:28:29.336 "num_base_bdevs": 3, 00:28:29.336 "num_base_bdevs_discovered": 1, 00:28:29.336 "num_base_bdevs_operational": 2, 00:28:29.336 "base_bdevs_list": [ 00:28:29.336 { 00:28:29.336 "name": null, 00:28:29.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:29.336 "is_configured": false, 00:28:29.336 "data_offset": 2048, 00:28:29.336 "data_size": 63488 00:28:29.336 }, 00:28:29.336 { 00:28:29.336 "name": "pt2", 00:28:29.336 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:29.336 "is_configured": true, 00:28:29.336 "data_offset": 2048, 00:28:29.336 "data_size": 63488 00:28:29.336 }, 00:28:29.336 { 00:28:29.336 "name": null, 00:28:29.336 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:29.336 "is_configured": false, 00:28:29.336 "data_offset": 2048, 00:28:29.336 "data_size": 63488 00:28:29.336 } 00:28:29.336 ] 00:28:29.336 }' 00:28:29.336 12:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:29.336 12:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:29.902 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:29.902 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:28:30.160 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:28:30.160 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:28:30.417 [2024-06-07 12:30:53.960555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:28:30.417 [2024-06-07 12:30:53.960753] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:30.417 [2024-06-07 12:30:53.960818] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000ae80 00:28:30.417 [2024-06-07 12:30:53.960877] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:30.417 [2024-06-07 12:30:53.961460] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:30.417 [2024-06-07 12:30:53.961550] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:28:30.417 [2024-06-07 12:30:53.961693] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:28:30.417 [2024-06-07 12:30:53.961735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:28:30.417 [2024-06-07 12:30:53.961843] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600000ab80 00:28:30.418 [2024-06-07 12:30:53.961860] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:30.418 [2024-06-07 
12:30:53.961919] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002d50 00:28:30.418 [2024-06-07 12:30:53.962141] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600000ab80 00:28:30.418 [2024-06-07 12:30:53.962160] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600000ab80 00:28:30.418 [2024-06-07 12:30:53.962241] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:30.418 pt3 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.418 12:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:30.676 12:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:30.676 "name": "raid_bdev1", 00:28:30.676 "uuid": "d3406d4f-b96b-44fa-8d5a-533bd1347eb0", 00:28:30.676 "strip_size_kb": 0, 00:28:30.676 "state": "online", 00:28:30.676 "raid_level": "raid1", 00:28:30.676 "superblock": true, 00:28:30.676 "num_base_bdevs": 3, 00:28:30.676 "num_base_bdevs_discovered": 2, 00:28:30.676 "num_base_bdevs_operational": 2, 00:28:30.676 "base_bdevs_list": [ 00:28:30.676 { 00:28:30.676 "name": null, 00:28:30.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:30.676 "is_configured": false, 00:28:30.676 "data_offset": 2048, 00:28:30.676 "data_size": 63488 00:28:30.676 }, 00:28:30.676 { 00:28:30.676 "name": "pt2", 00:28:30.676 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:30.676 "is_configured": true, 00:28:30.676 "data_offset": 2048, 00:28:30.676 "data_size": 63488 00:28:30.676 }, 00:28:30.676 { 00:28:30.676 "name": "pt3", 00:28:30.676 "uuid": "00000000-0000-0000-0000-000000000003", 00:28:30.676 "is_configured": true, 00:28:30.676 "data_offset": 2048, 00:28:30.676 "data_size": 63488 00:28:30.676 } 00:28:30.676 ] 00:28:30.676 }' 00:28:30.676 12:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:30.676 12:30:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:31.608 12:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:31.608 12:30:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:31.608 12:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:31.608 12:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:31.608 12:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:31.866 [2024-06-07 12:30:55.500784] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' d3406d4f-b96b-44fa-8d5a-533bd1347eb0 '!=' d3406d4f-b96b-44fa-8d5a-533bd1347eb0 ']' 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 210371 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 210371 ']' 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 210371 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 210371 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 210371' 00:28:32.124 killing process with pid 210371 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 210371 00:28:32.124 [2024-06-07 12:30:55.551203] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:32.124 [2024-06-07 12:30:55.551273] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:32.124 [2024-06-07 12:30:55.551323] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:32.124 [2024-06-07 12:30:55.551333] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600000ab80 name raid_bdev1, state offline 00:28:32.124 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 210371 00:28:32.124 [2024-06-07 12:30:55.614460] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:32.381 12:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:28:32.381 00:28:32.381 real 0m22.684s 00:28:32.381 user 0m41.907s 00:28:32.381 sys 0m3.811s 00:28:32.381 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:32.381 12:30:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:28:32.381 ************************************ 00:28:32.381 END TEST raid_superblock_test 00:28:32.381 ************************************ 00:28:32.381 12:30:56 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:28:32.381 12:30:56 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:32.381 12:30:56 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:32.381 12:30:56 bdev_raid -- common/autotest_common.sh@10 -- # set 
+x 00:28:32.640 ************************************ 00:28:32.640 START TEST raid_read_error_test 00:28:32.640 ************************************ 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 read 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3x7B4wlYO7 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=211107 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 211107 /var/tmp/spdk-raid.sock 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 211107 ']' 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:32.640 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:32.640 12:30:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:32.640 [2024-06-07 12:30:56.095708] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:28:32.640 [2024-06-07 12:30:56.096093] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid211107 ] 00:28:32.640 [2024-06-07 12:30:56.251336] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:32.897 [2024-06-07 12:30:56.378753] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:32.897 [2024-06-07 12:30:56.461680] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:33.463 12:30:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:33.463 12:30:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:28:33.463 12:30:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:33.463 12:30:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:33.720 BaseBdev1_malloc 00:28:33.720 12:30:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:28:33.978 true 00:28:33.978 12:30:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:28:34.236 [2024-06-07 12:30:57.723069] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:28:34.236 [2024-06-07 12:30:57.723205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:34.236 [2024-06-07 12:30:57.723271] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:28:34.236 [2024-06-07 12:30:57.723354] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:34.236 [2024-06-07 12:30:57.725816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:34.236 [2024-06-07 12:30:57.725874] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:34.236 BaseBdev1 00:28:34.236 12:30:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:34.236 12:30:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:34.493 BaseBdev2_malloc 00:28:34.493 12:30:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:28:34.796 true 00:28:34.796 12:30:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:28:34.796 [2024-06-07 12:30:58.394589] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:28:34.796 [2024-06-07 12:30:58.394694] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:34.796 [2024-06-07 12:30:58.394747] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:28:34.796 [2024-06-07 12:30:58.394805] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:34.796 [2024-06-07 12:30:58.397119] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:34.796 [2024-06-07 12:30:58.397174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:34.796 BaseBdev2 00:28:34.796 12:30:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:34.796 12:30:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:28:35.054 BaseBdev3_malloc 00:28:35.054 12:30:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:28:35.312 true 00:28:35.312 12:30:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:28:35.570 [2024-06-07 12:30:59.073187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:28:35.570 [2024-06-07 12:30:59.073362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:35.570 [2024-06-07 12:30:59.073441] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:28:35.570 [2024-06-07 12:30:59.073528] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:35.570 [2024-06-07 12:30:59.076426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:35.570 [2024-06-07 12:30:59.076520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:28:35.570 BaseBdev3 00:28:35.570 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:28:35.828 [2024-06-07 12:30:59.297379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:35.828 [2024-06-07 12:30:59.299440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:35.828 [2024-06-07 12:30:59.299511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:28:35.828 [2024-06-07 12:30:59.299725] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:28:35.828 [2024-06-07 12:30:59.299763] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:35.828 [2024-06-07 12:30:59.299930] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:28:35.828 [2024-06-07 12:30:59.300329] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:28:35.828 [2024-06-07 12:30:59.300350] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008180 00:28:35.828 [2024-06-07 12:30:59.300496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.828 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.085 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:36.085 "name": "raid_bdev1", 00:28:36.085 "uuid": "ae111d48-e081-4269-9806-b13230194967", 00:28:36.085 "strip_size_kb": 0, 00:28:36.085 "state": "online", 00:28:36.085 "raid_level": "raid1", 00:28:36.085 "superblock": true, 00:28:36.085 "num_base_bdevs": 3, 00:28:36.085 "num_base_bdevs_discovered": 3, 00:28:36.085 "num_base_bdevs_operational": 3, 00:28:36.085 "base_bdevs_list": [ 00:28:36.085 { 00:28:36.085 "name": "BaseBdev1", 00:28:36.085 "uuid": "182f5990-4633-564d-a03d-7cdde8c2f44e", 00:28:36.085 "is_configured": true, 00:28:36.085 "data_offset": 2048, 00:28:36.085 "data_size": 63488 00:28:36.085 }, 00:28:36.085 { 00:28:36.085 "name": "BaseBdev2", 00:28:36.085 "uuid": "af04f1ca-839c-5fc4-bdef-d2523ee9c190", 00:28:36.085 "is_configured": true, 00:28:36.085 "data_offset": 2048, 00:28:36.085 "data_size": 63488 00:28:36.085 }, 00:28:36.085 { 00:28:36.085 "name": "BaseBdev3", 00:28:36.085 "uuid": "99164d03-859e-5a08-af48-60abbfca78e4", 00:28:36.085 "is_configured": true, 00:28:36.085 "data_offset": 2048, 00:28:36.085 "data_size": 63488 00:28:36.085 } 00:28:36.085 ] 00:28:36.085 }' 00:28:36.085 12:30:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:36.085 12:30:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:37.018 12:31:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:37.018 12:31:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 
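(The setup trace above repeats one pattern per base bdev before assembling the array; condensed into a standalone sketch it reads as below. This is an illustrative reconstruction, not part of the test script: the loop and the RPC variable are ours, but every RPC invocation, size, and name is taken verbatim from this run, and the EE_ prefix on the error-wrapper bdevs matches what vbdev_passthru reports above.)
RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# For each leg: malloc backing bdev -> error-injection wrapper (named EE_<base>) -> passthru alias
for i in 1 2 3; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $RPC bdev_error_create BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done
# raid1 across the three passthru bdevs, with an on-disk superblock (-s)
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
# The perform_tests step below then arms read failures on the first leg:
$RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
(Because this is a read test against raid1, the injected failures are served from the redundant legs and no base bdev is removed, which is why the state checks that follow still expect 3 discovered/operational base bdevs; the write-error test later in this log expects 2.)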
00:28:37.018 [2024-06-07 12:31:00.422619] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:28:37.952 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.209 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.210 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.210 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.210 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.210 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.467 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.467 "name": "raid_bdev1", 00:28:38.467 "uuid": "ae111d48-e081-4269-9806-b13230194967", 00:28:38.467 "strip_size_kb": 0, 00:28:38.467 "state": "online", 00:28:38.467 "raid_level": "raid1", 00:28:38.467 "superblock": true, 00:28:38.467 "num_base_bdevs": 3, 00:28:38.467 "num_base_bdevs_discovered": 3, 00:28:38.467 "num_base_bdevs_operational": 3, 00:28:38.467 "base_bdevs_list": [ 00:28:38.467 { 00:28:38.467 "name": "BaseBdev1", 00:28:38.467 "uuid": "182f5990-4633-564d-a03d-7cdde8c2f44e", 00:28:38.467 "is_configured": true, 00:28:38.467 "data_offset": 2048, 00:28:38.467 "data_size": 63488 00:28:38.467 }, 00:28:38.467 { 00:28:38.467 "name": "BaseBdev2", 00:28:38.467 "uuid": "af04f1ca-839c-5fc4-bdef-d2523ee9c190", 00:28:38.467 "is_configured": true, 00:28:38.467 "data_offset": 2048, 00:28:38.467 "data_size": 63488 00:28:38.467 }, 00:28:38.467 { 00:28:38.467 "name": "BaseBdev3", 00:28:38.467 "uuid": "99164d03-859e-5a08-af48-60abbfca78e4", 00:28:38.467 "is_configured": true, 00:28:38.467 "data_offset": 2048, 00:28:38.467 "data_size": 63488 00:28:38.467 } 00:28:38.467 ] 00:28:38.467 }' 00:28:38.467 12:31:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.467 12:31:01 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:28:39.400 12:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:39.400 [2024-06-07 12:31:03.003409] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:39.400 [2024-06-07 12:31:03.003461] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:39.400 [2024-06-07 12:31:03.004802] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:39.400 [2024-06-07 12:31:03.004858] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:39.400 [2024-06-07 12:31:03.004921] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:39.400 [2024-06-07 12:31:03.004932] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name raid_bdev1, state offline 00:28:39.400 0 00:28:39.400 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 211107 00:28:39.400 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 211107 ']' 00:28:39.400 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 211107 00:28:39.400 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:28:39.400 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:39.400 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 211107 00:28:39.657 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:39.657 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:39.657 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 211107' 00:28:39.657 killing process with pid 211107 00:28:39.657 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 211107 00:28:39.657 [2024-06-07 12:31:03.056906] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:39.657 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 211107 00:28:39.657 [2024-06-07 12:31:03.107196] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3x7B4wlYO7 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:28:39.914 00:28:39.914 real 0m7.455s 00:28:39.914 user 0m11.794s 00:28:39.914 sys 0m1.133s 00:28:39.914 12:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:39.914 12:31:03 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:39.914 ************************************ 00:28:39.914 END TEST raid_read_error_test 00:28:39.914 ************************************ 00:28:39.914 12:31:03 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:28:39.914 12:31:03 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:39.914 12:31:03 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:39.914 12:31:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:39.914 ************************************ 00:28:39.914 START TEST raid_write_error_test 00:28:40.171 ************************************ 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 write 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.A7fNc5Ex7h 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@808 -- # raid_pid=211306 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 211306 /var/tmp/spdk-raid.sock 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 211306 ']' 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:40.171 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:40.171 12:31:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:40.171 [2024-06-07 12:31:03.615306] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:28:40.171 [2024-06-07 12:31:03.615616] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid211306 ] 00:28:40.171 [2024-06-07 12:31:03.763012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:40.428 [2024-06-07 12:31:03.857641] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:40.428 [2024-06-07 12:31:03.952797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:40.992 12:31:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:40.992 12:31:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:28:40.992 12:31:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:40.992 12:31:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:41.250 BaseBdev1_malloc 00:28:41.250 12:31:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:28:41.507 true 00:28:41.507 12:31:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:28:41.764 [2024-06-07 12:31:05.323184] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:28:41.764 [2024-06-07 12:31:05.323334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:41.764 [2024-06-07 12:31:05.323392] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:28:41.764 [2024-06-07 12:31:05.323451] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:41.765 [2024-06-07 12:31:05.325971] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:28:41.765 [2024-06-07 12:31:05.326045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:41.765 BaseBdev1 00:28:41.765 12:31:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:41.765 12:31:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:42.022 BaseBdev2_malloc 00:28:42.022 12:31:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:28:42.279 true 00:28:42.280 12:31:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:28:42.564 [2024-06-07 12:31:06.034493] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:28:42.564 [2024-06-07 12:31:06.034602] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.564 [2024-06-07 12:31:06.034653] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:28:42.564 [2024-06-07 12:31:06.034707] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.564 [2024-06-07 12:31:06.036976] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.564 [2024-06-07 12:31:06.037033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:42.565 BaseBdev2 00:28:42.565 12:31:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:28:42.565 12:31:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:28:42.845 BaseBdev3_malloc 00:28:42.845 12:31:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:28:43.102 true 00:28:43.102 12:31:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:28:43.359 [2024-06-07 12:31:06.773225] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:28:43.359 [2024-06-07 12:31:06.773396] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:43.359 [2024-06-07 12:31:06.773445] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:28:43.359 [2024-06-07 12:31:06.773517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:43.359 [2024-06-07 12:31:06.775804] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:43.359 [2024-06-07 12:31:06.775873] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:28:43.359 BaseBdev3 00:28:43.359 12:31:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:28:43.359 [2024-06-07 12:31:06.989385] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:43.359 [2024-06-07 12:31:06.991531] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:43.359 [2024-06-07 12:31:06.991599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:28:43.359 [2024-06-07 12:31:06.991763] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:28:43.359 [2024-06-07 12:31:06.991776] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:43.359 [2024-06-07 12:31:06.991925] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:28:43.359 [2024-06-07 12:31:06.992317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:28:43.359 [2024-06-07 12:31:06.992334] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008180 00:28:43.359 [2024-06-07 12:31:06.992474] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.617 "name": "raid_bdev1", 00:28:43.617 "uuid": "9dd20cff-6dfb-4484-91d8-ad91125751e2", 00:28:43.617 "strip_size_kb": 0, 00:28:43.617 "state": "online", 00:28:43.617 "raid_level": "raid1", 00:28:43.617 "superblock": true, 00:28:43.617 "num_base_bdevs": 3, 00:28:43.617 "num_base_bdevs_discovered": 3, 00:28:43.617 "num_base_bdevs_operational": 3, 00:28:43.617 "base_bdevs_list": [ 00:28:43.617 { 00:28:43.617 "name": "BaseBdev1", 00:28:43.617 "uuid": "97363fc8-e48c-5360-808d-d1df0ca8770f", 00:28:43.617 "is_configured": true, 00:28:43.617 "data_offset": 2048, 00:28:43.617 "data_size": 63488 00:28:43.617 }, 00:28:43.617 { 00:28:43.617 "name": "BaseBdev2", 00:28:43.617 "uuid": "dc261567-7941-50c3-9737-4953478491c0", 00:28:43.617 "is_configured": true, 00:28:43.617 "data_offset": 2048, 00:28:43.617 "data_size": 63488 00:28:43.617 }, 00:28:43.617 { 00:28:43.617 "name": "BaseBdev3", 00:28:43.617 "uuid": 
"461f9c00-0374-5ae8-8991-65aa42a26d68", 00:28:43.617 "is_configured": true, 00:28:43.617 "data_offset": 2048, 00:28:43.617 "data_size": 63488 00:28:43.617 } 00:28:43.617 ] 00:28:43.617 }' 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.617 12:31:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:44.181 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:28:44.181 12:31:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:44.438 [2024-06-07 12:31:07.881723] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:28:45.372 12:31:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:28:45.630 [2024-06-07 12:31:09.020458] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:28:45.630 [2024-06-07 12:31:09.020576] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:45.630 [2024-06-07 12:31:09.020798] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d000002600 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.630 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.888 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:45.888 "name": "raid_bdev1", 00:28:45.888 "uuid": "9dd20cff-6dfb-4484-91d8-ad91125751e2", 00:28:45.888 "strip_size_kb": 0, 00:28:45.888 "state": "online", 00:28:45.888 
"raid_level": "raid1", 00:28:45.888 "superblock": true, 00:28:45.888 "num_base_bdevs": 3, 00:28:45.888 "num_base_bdevs_discovered": 2, 00:28:45.888 "num_base_bdevs_operational": 2, 00:28:45.888 "base_bdevs_list": [ 00:28:45.888 { 00:28:45.888 "name": null, 00:28:45.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:45.888 "is_configured": false, 00:28:45.888 "data_offset": 2048, 00:28:45.888 "data_size": 63488 00:28:45.888 }, 00:28:45.888 { 00:28:45.888 "name": "BaseBdev2", 00:28:45.888 "uuid": "dc261567-7941-50c3-9737-4953478491c0", 00:28:45.888 "is_configured": true, 00:28:45.888 "data_offset": 2048, 00:28:45.888 "data_size": 63488 00:28:45.888 }, 00:28:45.888 { 00:28:45.888 "name": "BaseBdev3", 00:28:45.888 "uuid": "461f9c00-0374-5ae8-8991-65aa42a26d68", 00:28:45.888 "is_configured": true, 00:28:45.888 "data_offset": 2048, 00:28:45.888 "data_size": 63488 00:28:45.888 } 00:28:45.888 ] 00:28:45.888 }' 00:28:45.888 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:45.888 12:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:46.453 12:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:46.711 [2024-06-07 12:31:10.167491] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:46.711 [2024-06-07 12:31:10.167575] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:46.711 [2024-06-07 12:31:10.168895] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:46.711 [2024-06-07 12:31:10.168963] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:46.711 [2024-06-07 12:31:10.169018] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:46.711 [2024-06-07 12:31:10.169029] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name raid_bdev1, state offline 00:28:46.711 0 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 211306 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 211306 ']' 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 211306 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 211306 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:46.711 killing process with pid 211306 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 211306' 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 211306 00:28:46.711 [2024-06-07 12:31:10.213915] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:46.711 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 211306 00:28:46.711 [2024-06-07 12:31:10.263405] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.A7fNc5Ex7h 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:28:47.277 00:28:47.277 real 0m7.086s 00:28:47.277 user 0m11.010s 00:28:47.277 sys 0m1.194s 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:47.277 12:31:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:28:47.277 ************************************ 00:28:47.277 END TEST raid_write_error_test 00:28:47.277 ************************************ 00:28:47.277 12:31:10 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:28:47.277 12:31:10 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:28:47.277 12:31:10 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:28:47.277 12:31:10 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:47.277 12:31:10 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:47.277 12:31:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:47.277 ************************************ 00:28:47.277 START TEST raid_state_function_test 00:28:47.277 ************************************ 00:28:47.277 12:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 false 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:28:47.278 12:31:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev4 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=211497 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 211497' 00:28:47.278 Process raid pid: 211497 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 211497 /var/tmp/spdk-raid.sock 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 211497 ']' 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:47.278 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:47.278 12:31:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:47.278 [2024-06-07 12:31:10.764566] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
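For reference, the raid_write_error_test that just completed reduces to a short rpc.py sequence against the bdev_svc target. A minimal sketch, assuming the same /var/tmp/spdk-raid.sock socket and the 32 MiB, 512-byte-block malloc geometry shown in the trace (the RPC shorthand is introduced here for brevity, not taken from the test script):

    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # malloc -> error -> passthru yields an error-injectable base bdev
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $RPC bdev_error_create BaseBdev1_malloc               # exposes EE_BaseBdev1_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    # (repeat for BaseBdev2 and BaseBdev3)
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    $RPC bdev_raid_delete raid_bdev1

With raid1 the failed leg is simply removed from the array, so the volume stays online while num_base_bdevs_discovered drops from 3 to 2, exactly as the JSON dumps above show.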
00:28:47.278 [2024-06-07 12:31:10.764919] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:47.278 [2024-06-07 12:31:10.913470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:47.536 [2024-06-07 12:31:11.012022] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:47.536 [2024-06-07 12:31:11.098678] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:47.536 12:31:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:47.536 12:31:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:28:47.536 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:28:48.103 [2024-06-07 12:31:11.472294] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:48.103 [2024-06-07 12:31:11.472394] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:48.103 [2024-06-07 12:31:11.472407] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:48.103 [2024-06-07 12:31:11.472432] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:48.103 [2024-06-07 12:31:11.472440] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:28:48.103 [2024-06-07 12:31:11.472491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:28:48.103 [2024-06-07 12:31:11.472501] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:28:48.103 [2024-06-07 12:31:11.472528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.103 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:28:48.360 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:48.360 "name": "Existed_Raid", 00:28:48.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.361 "strip_size_kb": 64, 00:28:48.361 "state": "configuring", 00:28:48.361 "raid_level": "raid0", 00:28:48.361 "superblock": false, 00:28:48.361 "num_base_bdevs": 4, 00:28:48.361 "num_base_bdevs_discovered": 0, 00:28:48.361 "num_base_bdevs_operational": 4, 00:28:48.361 "base_bdevs_list": [ 00:28:48.361 { 00:28:48.361 "name": "BaseBdev1", 00:28:48.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.361 "is_configured": false, 00:28:48.361 "data_offset": 0, 00:28:48.361 "data_size": 0 00:28:48.361 }, 00:28:48.361 { 00:28:48.361 "name": "BaseBdev2", 00:28:48.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.361 "is_configured": false, 00:28:48.361 "data_offset": 0, 00:28:48.361 "data_size": 0 00:28:48.361 }, 00:28:48.361 { 00:28:48.361 "name": "BaseBdev3", 00:28:48.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.361 "is_configured": false, 00:28:48.361 "data_offset": 0, 00:28:48.361 "data_size": 0 00:28:48.361 }, 00:28:48.361 { 00:28:48.361 "name": "BaseBdev4", 00:28:48.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.361 "is_configured": false, 00:28:48.361 "data_offset": 0, 00:28:48.361 "data_size": 0 00:28:48.361 } 00:28:48.361 ] 00:28:48.361 }' 00:28:48.361 12:31:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:48.361 12:31:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:49.294 12:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:49.294 [2024-06-07 12:31:12.896378] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:49.294 [2024-06-07 12:31:12.896447] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:28:49.294 12:31:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:28:49.551 [2024-06-07 12:31:13.196485] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:49.551 [2024-06-07 12:31:13.196586] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:49.551 [2024-06-07 12:31:13.196600] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:49.552 [2024-06-07 12:31:13.196651] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:49.552 [2024-06-07 12:31:13.196660] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:28:49.552 [2024-06-07 12:31:13.196684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:28:49.552 [2024-06-07 12:31:13.196693] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:28:49.552 [2024-06-07 12:31:13.196730] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:28:49.815 12:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:28:50.092 [2024-06-07 12:31:13.508678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:50.092 BaseBdev1 00:28:50.092 12:31:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:50.092 12:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:28:50.092 12:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:50.092 12:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:50.092 12:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:50.092 12:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:50.092 12:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:50.350 12:31:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:50.608 [ 00:28:50.608 { 00:28:50.608 "name": "BaseBdev1", 00:28:50.608 "aliases": [ 00:28:50.608 "8c1fed55-fd15-4f59-ada8-3c96d8c2f943" 00:28:50.608 ], 00:28:50.608 "product_name": "Malloc disk", 00:28:50.608 "block_size": 512, 00:28:50.608 "num_blocks": 65536, 00:28:50.608 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:50.608 "assigned_rate_limits": { 00:28:50.608 "rw_ios_per_sec": 0, 00:28:50.608 "rw_mbytes_per_sec": 0, 00:28:50.608 "r_mbytes_per_sec": 0, 00:28:50.608 "w_mbytes_per_sec": 0 00:28:50.608 }, 00:28:50.608 "claimed": true, 00:28:50.608 "claim_type": "exclusive_write", 00:28:50.608 "zoned": false, 00:28:50.608 "supported_io_types": { 00:28:50.608 "read": true, 00:28:50.608 "write": true, 00:28:50.608 "unmap": true, 00:28:50.608 "write_zeroes": true, 00:28:50.608 "flush": true, 00:28:50.608 "reset": true, 00:28:50.608 "compare": false, 00:28:50.608 "compare_and_write": false, 00:28:50.608 "abort": true, 00:28:50.608 "nvme_admin": false, 00:28:50.608 "nvme_io": false 00:28:50.608 }, 00:28:50.608 "memory_domains": [ 00:28:50.608 { 00:28:50.608 "dma_device_id": "system", 00:28:50.608 "dma_device_type": 1 00:28:50.608 }, 00:28:50.608 { 00:28:50.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:50.608 "dma_device_type": 2 00:28:50.608 } 00:28:50.608 ], 00:28:50.608 "driver_specific": {} 00:28:50.608 } 00:28:50.608 ] 00:28:50.608 12:31:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:28:50.608 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:28:50.608 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:50.608 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:50.609 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:50.867 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:50.867 "name": "Existed_Raid", 00:28:50.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.867 "strip_size_kb": 64, 00:28:50.867 "state": "configuring", 00:28:50.867 "raid_level": "raid0", 00:28:50.867 "superblock": false, 00:28:50.867 "num_base_bdevs": 4, 00:28:50.867 "num_base_bdevs_discovered": 1, 00:28:50.867 "num_base_bdevs_operational": 4, 00:28:50.867 "base_bdevs_list": [ 00:28:50.867 { 00:28:50.867 "name": "BaseBdev1", 00:28:50.867 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:50.867 "is_configured": true, 00:28:50.867 "data_offset": 0, 00:28:50.867 "data_size": 65536 00:28:50.867 }, 00:28:50.867 { 00:28:50.867 "name": "BaseBdev2", 00:28:50.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.867 "is_configured": false, 00:28:50.867 "data_offset": 0, 00:28:50.867 "data_size": 0 00:28:50.867 }, 00:28:50.867 { 00:28:50.867 "name": "BaseBdev3", 00:28:50.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.867 "is_configured": false, 00:28:50.867 "data_offset": 0, 00:28:50.867 "data_size": 0 00:28:50.867 }, 00:28:50.867 { 00:28:50.867 "name": "BaseBdev4", 00:28:50.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.867 "is_configured": false, 00:28:50.867 "data_offset": 0, 00:28:50.867 "data_size": 0 00:28:50.867 } 00:28:50.867 ] 00:28:50.867 }' 00:28:50.867 12:31:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:50.867 12:31:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:51.433 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:51.691 [2024-06-07 12:31:15.225009] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:51.691 [2024-06-07 12:31:15.225103] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:28:51.691 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:28:52.258 [2024-06-07 12:31:15.621145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:52.258 [2024-06-07 12:31:15.623158] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:52.258 [2024-06-07 12:31:15.623719] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:52.258 [2024-06-07 12:31:15.623748] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:28:52.258 [2024-06-07 12:31:15.623845] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:28:52.258 [2024-06-07 12:31:15.623861] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:28:52.258 [2024-06-07 12:31:15.623943] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.258 12:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:52.516 12:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:52.516 "name": "Existed_Raid", 00:28:52.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.516 "strip_size_kb": 64, 00:28:52.516 "state": "configuring", 00:28:52.516 "raid_level": "raid0", 00:28:52.516 "superblock": false, 00:28:52.516 "num_base_bdevs": 4, 00:28:52.516 "num_base_bdevs_discovered": 1, 00:28:52.516 "num_base_bdevs_operational": 4, 00:28:52.516 "base_bdevs_list": [ 00:28:52.516 { 00:28:52.516 "name": "BaseBdev1", 00:28:52.516 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:52.516 "is_configured": true, 00:28:52.516 "data_offset": 0, 00:28:52.516 "data_size": 65536 00:28:52.516 }, 00:28:52.516 { 00:28:52.516 "name": "BaseBdev2", 00:28:52.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.516 "is_configured": false, 00:28:52.516 "data_offset": 0, 00:28:52.516 "data_size": 0 00:28:52.516 }, 00:28:52.516 { 00:28:52.516 "name": "BaseBdev3", 00:28:52.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.516 "is_configured": false, 00:28:52.516 "data_offset": 0, 00:28:52.516 "data_size": 0 00:28:52.516 }, 00:28:52.516 { 00:28:52.516 "name": "BaseBdev4", 00:28:52.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.516 "is_configured": false, 00:28:52.516 "data_offset": 0, 00:28:52.516 "data_size": 0 00:28:52.516 } 00:28:52.516 ] 00:28:52.516 }' 00:28:52.516 12:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:28:52.516 12:31:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:53.094 12:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:28:53.660 [2024-06-07 12:31:17.003039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:53.660 BaseBdev2 00:28:53.660 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:53.660 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:28:53.660 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:53.660 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:53.660 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:53.660 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:53.660 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:53.918 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:54.175 [ 00:28:54.175 { 00:28:54.175 "name": "BaseBdev2", 00:28:54.175 "aliases": [ 00:28:54.175 "f05fa604-33b9-475a-be6e-c4ec25b67c00" 00:28:54.175 ], 00:28:54.175 "product_name": "Malloc disk", 00:28:54.175 "block_size": 512, 00:28:54.175 "num_blocks": 65536, 00:28:54.175 "uuid": "f05fa604-33b9-475a-be6e-c4ec25b67c00", 00:28:54.175 "assigned_rate_limits": { 00:28:54.175 "rw_ios_per_sec": 0, 00:28:54.175 "rw_mbytes_per_sec": 0, 00:28:54.175 "r_mbytes_per_sec": 0, 00:28:54.175 "w_mbytes_per_sec": 0 00:28:54.175 }, 00:28:54.175 "claimed": true, 00:28:54.175 "claim_type": "exclusive_write", 00:28:54.175 "zoned": false, 00:28:54.175 "supported_io_types": { 00:28:54.175 "read": true, 00:28:54.175 "write": true, 00:28:54.175 "unmap": true, 00:28:54.175 "write_zeroes": true, 00:28:54.175 "flush": true, 00:28:54.175 "reset": true, 00:28:54.175 "compare": false, 00:28:54.175 "compare_and_write": false, 00:28:54.175 "abort": true, 00:28:54.175 "nvme_admin": false, 00:28:54.175 "nvme_io": false 00:28:54.175 }, 00:28:54.175 "memory_domains": [ 00:28:54.175 { 00:28:54.175 "dma_device_id": "system", 00:28:54.175 "dma_device_type": 1 00:28:54.175 }, 00:28:54.175 { 00:28:54.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:54.175 "dma_device_type": 2 00:28:54.175 } 00:28:54.175 ], 00:28:54.175 "driver_specific": {} 00:28:54.175 } 00:28:54.175 ] 00:28:54.175 12:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:28:54.175 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:54.175 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.176 12:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:54.741 12:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:54.741 "name": "Existed_Raid", 00:28:54.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:54.741 "strip_size_kb": 64, 00:28:54.741 "state": "configuring", 00:28:54.741 "raid_level": "raid0", 00:28:54.741 "superblock": false, 00:28:54.741 "num_base_bdevs": 4, 00:28:54.741 "num_base_bdevs_discovered": 2, 00:28:54.741 "num_base_bdevs_operational": 4, 00:28:54.741 "base_bdevs_list": [ 00:28:54.741 { 00:28:54.741 "name": "BaseBdev1", 00:28:54.741 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:54.741 "is_configured": true, 00:28:54.741 "data_offset": 0, 00:28:54.741 "data_size": 65536 00:28:54.741 }, 00:28:54.741 { 00:28:54.741 "name": "BaseBdev2", 00:28:54.741 "uuid": "f05fa604-33b9-475a-be6e-c4ec25b67c00", 00:28:54.741 "is_configured": true, 00:28:54.741 "data_offset": 0, 00:28:54.741 "data_size": 65536 00:28:54.741 }, 00:28:54.742 { 00:28:54.742 "name": "BaseBdev3", 00:28:54.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:54.742 "is_configured": false, 00:28:54.742 "data_offset": 0, 00:28:54.742 "data_size": 0 00:28:54.742 }, 00:28:54.742 { 00:28:54.742 "name": "BaseBdev4", 00:28:54.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:54.742 "is_configured": false, 00:28:54.742 "data_offset": 0, 00:28:54.742 "data_size": 0 00:28:54.742 } 00:28:54.742 ] 00:28:54.742 }' 00:28:54.742 12:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:54.742 12:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:55.308 12:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:28:55.308 [2024-06-07 12:31:18.916881] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:28:55.308 BaseBdev3 00:28:55.308 12:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:28:55.308 12:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:28:55.308 12:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:55.308 12:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:55.308 
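The verify_raid_bdev_state helper traced above filters the bdev_raid_get_bdevs JSON with jq and compares individual fields against its arguments. A condensed sketch of those checks, reusing the RPC shorthand from the sketch earlier (field names are taken from the JSON dumps in this log; the exact comparison layout inside the helper is an assumption):

    tmp=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(jq -r .state <<<"$tmp") == "$expected_state" ]]
    [[ $(jq -r .raid_level <<<"$tmp") == "$raid_level" ]]
    [[ $(jq -r .strip_size_kb <<<"$tmp") -eq $strip_size ]]
    [[ $(jq -r .num_base_bdevs_operational <<<"$tmp") -eq $num_base_bdevs_operational ]]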
12:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:55.308 12:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:55.308 12:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:55.566 12:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:28:55.825 [ 00:28:55.825 { 00:28:55.825 "name": "BaseBdev3", 00:28:55.825 "aliases": [ 00:28:55.825 "0ea3034a-45af-434e-bc9c-04f2b8387d8a" 00:28:55.825 ], 00:28:55.825 "product_name": "Malloc disk", 00:28:55.825 "block_size": 512, 00:28:55.825 "num_blocks": 65536, 00:28:55.825 "uuid": "0ea3034a-45af-434e-bc9c-04f2b8387d8a", 00:28:55.825 "assigned_rate_limits": { 00:28:55.825 "rw_ios_per_sec": 0, 00:28:55.825 "rw_mbytes_per_sec": 0, 00:28:55.825 "r_mbytes_per_sec": 0, 00:28:55.825 "w_mbytes_per_sec": 0 00:28:55.825 }, 00:28:55.825 "claimed": true, 00:28:55.825 "claim_type": "exclusive_write", 00:28:55.825 "zoned": false, 00:28:55.825 "supported_io_types": { 00:28:55.825 "read": true, 00:28:55.825 "write": true, 00:28:55.825 "unmap": true, 00:28:55.825 "write_zeroes": true, 00:28:55.825 "flush": true, 00:28:55.825 "reset": true, 00:28:55.825 "compare": false, 00:28:55.825 "compare_and_write": false, 00:28:55.825 "abort": true, 00:28:55.825 "nvme_admin": false, 00:28:55.825 "nvme_io": false 00:28:55.825 }, 00:28:55.825 "memory_domains": [ 00:28:55.825 { 00:28:55.825 "dma_device_id": "system", 00:28:55.825 "dma_device_type": 1 00:28:55.825 }, 00:28:55.825 { 00:28:55.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:55.825 "dma_device_type": 2 00:28:55.825 } 00:28:55.825 ], 00:28:55.825 "driver_specific": {} 00:28:55.825 } 00:28:55.825 ] 00:28:55.825 12:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.826 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:56.085 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:56.085 "name": "Existed_Raid", 00:28:56.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:56.085 "strip_size_kb": 64, 00:28:56.085 "state": "configuring", 00:28:56.085 "raid_level": "raid0", 00:28:56.085 "superblock": false, 00:28:56.085 "num_base_bdevs": 4, 00:28:56.085 "num_base_bdevs_discovered": 3, 00:28:56.085 "num_base_bdevs_operational": 4, 00:28:56.085 "base_bdevs_list": [ 00:28:56.085 { 00:28:56.085 "name": "BaseBdev1", 00:28:56.085 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:56.085 "is_configured": true, 00:28:56.085 "data_offset": 0, 00:28:56.085 "data_size": 65536 00:28:56.085 }, 00:28:56.085 { 00:28:56.085 "name": "BaseBdev2", 00:28:56.085 "uuid": "f05fa604-33b9-475a-be6e-c4ec25b67c00", 00:28:56.085 "is_configured": true, 00:28:56.085 "data_offset": 0, 00:28:56.085 "data_size": 65536 00:28:56.085 }, 00:28:56.085 { 00:28:56.085 "name": "BaseBdev3", 00:28:56.085 "uuid": "0ea3034a-45af-434e-bc9c-04f2b8387d8a", 00:28:56.085 "is_configured": true, 00:28:56.085 "data_offset": 0, 00:28:56.085 "data_size": 65536 00:28:56.085 }, 00:28:56.085 { 00:28:56.085 "name": "BaseBdev4", 00:28:56.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:56.085 "is_configured": false, 00:28:56.085 "data_offset": 0, 00:28:56.085 "data_size": 0 00:28:56.085 } 00:28:56.085 ] 00:28:56.085 }' 00:28:56.085 12:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:56.085 12:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:56.653 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:28:56.653 [2024-06-07 12:31:20.270459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:28:56.653 [2024-06-07 12:31:20.270515] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:28:56.653 [2024-06-07 12:31:20.270524] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:28:56.653 [2024-06-07 12:31:20.270634] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:28:56.653 [2024-06-07 12:31:20.270911] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:28:56.653 [2024-06-07 12:31:20.270921] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:28:56.653 [2024-06-07 12:31:20.271124] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:56.653 BaseBdev4 00:28:56.653 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:28:56.653 12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:28:56.653 12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:56.653 12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:28:56.653 12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:56.653 
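waitforbdev, invoked here for BaseBdev4 as after every bdev_malloc_create, is a thin wait: first let bdev examination finish, then query the new bdev with a timeout. A sketch of the two RPCs it issues, matching the 2000 ms bdev_timeout in the trace (RPC shorthand as above):

    $RPC bdev_wait_for_examine
    $RPC bdev_get_bdevs -b BaseBdev4 -t 2000   # returns the descriptor once BaseBdev4 exists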
12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:56.653 12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:28:57.220 [ 00:28:57.220 { 00:28:57.220 "name": "BaseBdev4", 00:28:57.220 "aliases": [ 00:28:57.220 "e4bf9b86-3e7e-468a-8bb6-678c9c488d03" 00:28:57.220 ], 00:28:57.220 "product_name": "Malloc disk", 00:28:57.220 "block_size": 512, 00:28:57.220 "num_blocks": 65536, 00:28:57.220 "uuid": "e4bf9b86-3e7e-468a-8bb6-678c9c488d03", 00:28:57.220 "assigned_rate_limits": { 00:28:57.220 "rw_ios_per_sec": 0, 00:28:57.220 "rw_mbytes_per_sec": 0, 00:28:57.220 "r_mbytes_per_sec": 0, 00:28:57.220 "w_mbytes_per_sec": 0 00:28:57.220 }, 00:28:57.220 "claimed": true, 00:28:57.220 "claim_type": "exclusive_write", 00:28:57.220 "zoned": false, 00:28:57.220 "supported_io_types": { 00:28:57.220 "read": true, 00:28:57.220 "write": true, 00:28:57.220 "unmap": true, 00:28:57.220 "write_zeroes": true, 00:28:57.220 "flush": true, 00:28:57.220 "reset": true, 00:28:57.220 "compare": false, 00:28:57.220 "compare_and_write": false, 00:28:57.220 "abort": true, 00:28:57.220 "nvme_admin": false, 00:28:57.220 "nvme_io": false 00:28:57.220 }, 00:28:57.220 "memory_domains": [ 00:28:57.220 { 00:28:57.220 "dma_device_id": "system", 00:28:57.220 "dma_device_type": 1 00:28:57.220 }, 00:28:57.220 { 00:28:57.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:57.220 "dma_device_type": 2 00:28:57.220 } 00:28:57.220 ], 00:28:57.220 "driver_specific": {} 00:28:57.220 } 00:28:57.220 ] 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:57.220 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:57.221 12:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.221 12:31:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:57.479 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:57.479 "name": "Existed_Raid", 00:28:57.479 "uuid": "31aa147e-6636-49e3-9b3a-45b287867999", 00:28:57.479 "strip_size_kb": 64, 00:28:57.479 "state": "online", 00:28:57.479 "raid_level": "raid0", 00:28:57.479 "superblock": false, 00:28:57.479 "num_base_bdevs": 4, 00:28:57.479 "num_base_bdevs_discovered": 4, 00:28:57.479 "num_base_bdevs_operational": 4, 00:28:57.479 "base_bdevs_list": [ 00:28:57.479 { 00:28:57.479 "name": "BaseBdev1", 00:28:57.479 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:57.479 "is_configured": true, 00:28:57.479 "data_offset": 0, 00:28:57.479 "data_size": 65536 00:28:57.479 }, 00:28:57.479 { 00:28:57.479 "name": "BaseBdev2", 00:28:57.479 "uuid": "f05fa604-33b9-475a-be6e-c4ec25b67c00", 00:28:57.479 "is_configured": true, 00:28:57.479 "data_offset": 0, 00:28:57.479 "data_size": 65536 00:28:57.479 }, 00:28:57.479 { 00:28:57.479 "name": "BaseBdev3", 00:28:57.479 "uuid": "0ea3034a-45af-434e-bc9c-04f2b8387d8a", 00:28:57.479 "is_configured": true, 00:28:57.479 "data_offset": 0, 00:28:57.479 "data_size": 65536 00:28:57.479 }, 00:28:57.479 { 00:28:57.479 "name": "BaseBdev4", 00:28:57.479 "uuid": "e4bf9b86-3e7e-468a-8bb6-678c9c488d03", 00:28:57.479 "is_configured": true, 00:28:57.479 "data_offset": 0, 00:28:57.479 "data_size": 65536 00:28:57.479 } 00:28:57.479 ] 00:28:57.479 }' 00:28:57.479 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:57.479 12:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:58.049 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:58.307 [2024-06-07 12:31:21.890947] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:58.307 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:58.307 "name": "Existed_Raid", 00:28:58.307 "aliases": [ 00:28:58.307 "31aa147e-6636-49e3-9b3a-45b287867999" 00:28:58.307 ], 00:28:58.307 "product_name": "Raid Volume", 00:28:58.307 "block_size": 512, 00:28:58.307 "num_blocks": 262144, 00:28:58.307 "uuid": "31aa147e-6636-49e3-9b3a-45b287867999", 00:28:58.307 "assigned_rate_limits": { 00:28:58.307 "rw_ios_per_sec": 0, 00:28:58.307 "rw_mbytes_per_sec": 0, 00:28:58.307 "r_mbytes_per_sec": 0, 00:28:58.307 "w_mbytes_per_sec": 0 00:28:58.307 }, 00:28:58.307 "claimed": false, 00:28:58.307 "zoned": false, 00:28:58.307 "supported_io_types": { 00:28:58.307 "read": true, 00:28:58.307 "write": true, 00:28:58.307 
"unmap": true, 00:28:58.307 "write_zeroes": true, 00:28:58.307 "flush": true, 00:28:58.307 "reset": true, 00:28:58.308 "compare": false, 00:28:58.308 "compare_and_write": false, 00:28:58.308 "abort": false, 00:28:58.308 "nvme_admin": false, 00:28:58.308 "nvme_io": false 00:28:58.308 }, 00:28:58.308 "memory_domains": [ 00:28:58.308 { 00:28:58.308 "dma_device_id": "system", 00:28:58.308 "dma_device_type": 1 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:58.308 "dma_device_type": 2 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "dma_device_id": "system", 00:28:58.308 "dma_device_type": 1 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:58.308 "dma_device_type": 2 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "dma_device_id": "system", 00:28:58.308 "dma_device_type": 1 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:58.308 "dma_device_type": 2 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "dma_device_id": "system", 00:28:58.308 "dma_device_type": 1 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:58.308 "dma_device_type": 2 00:28:58.308 } 00:28:58.308 ], 00:28:58.308 "driver_specific": { 00:28:58.308 "raid": { 00:28:58.308 "uuid": "31aa147e-6636-49e3-9b3a-45b287867999", 00:28:58.308 "strip_size_kb": 64, 00:28:58.308 "state": "online", 00:28:58.308 "raid_level": "raid0", 00:28:58.308 "superblock": false, 00:28:58.308 "num_base_bdevs": 4, 00:28:58.308 "num_base_bdevs_discovered": 4, 00:28:58.308 "num_base_bdevs_operational": 4, 00:28:58.308 "base_bdevs_list": [ 00:28:58.308 { 00:28:58.308 "name": "BaseBdev1", 00:28:58.308 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:58.308 "is_configured": true, 00:28:58.308 "data_offset": 0, 00:28:58.308 "data_size": 65536 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "name": "BaseBdev2", 00:28:58.308 "uuid": "f05fa604-33b9-475a-be6e-c4ec25b67c00", 00:28:58.308 "is_configured": true, 00:28:58.308 "data_offset": 0, 00:28:58.308 "data_size": 65536 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "name": "BaseBdev3", 00:28:58.308 "uuid": "0ea3034a-45af-434e-bc9c-04f2b8387d8a", 00:28:58.308 "is_configured": true, 00:28:58.308 "data_offset": 0, 00:28:58.308 "data_size": 65536 00:28:58.308 }, 00:28:58.308 { 00:28:58.308 "name": "BaseBdev4", 00:28:58.308 "uuid": "e4bf9b86-3e7e-468a-8bb6-678c9c488d03", 00:28:58.308 "is_configured": true, 00:28:58.308 "data_offset": 0, 00:28:58.308 "data_size": 65536 00:28:58.308 } 00:28:58.308 ] 00:28:58.308 } 00:28:58.308 } 00:28:58.308 }' 00:28:58.308 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:58.565 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:58.565 BaseBdev2 00:28:58.565 BaseBdev3 00:28:58.565 BaseBdev4' 00:28:58.565 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:58.565 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:58.565 12:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:58.823 "name": "BaseBdev1", 00:28:58.823 "aliases": [ 00:28:58.823 
"8c1fed55-fd15-4f59-ada8-3c96d8c2f943" 00:28:58.823 ], 00:28:58.823 "product_name": "Malloc disk", 00:28:58.823 "block_size": 512, 00:28:58.823 "num_blocks": 65536, 00:28:58.823 "uuid": "8c1fed55-fd15-4f59-ada8-3c96d8c2f943", 00:28:58.823 "assigned_rate_limits": { 00:28:58.823 "rw_ios_per_sec": 0, 00:28:58.823 "rw_mbytes_per_sec": 0, 00:28:58.823 "r_mbytes_per_sec": 0, 00:28:58.823 "w_mbytes_per_sec": 0 00:28:58.823 }, 00:28:58.823 "claimed": true, 00:28:58.823 "claim_type": "exclusive_write", 00:28:58.823 "zoned": false, 00:28:58.823 "supported_io_types": { 00:28:58.823 "read": true, 00:28:58.823 "write": true, 00:28:58.823 "unmap": true, 00:28:58.823 "write_zeroes": true, 00:28:58.823 "flush": true, 00:28:58.823 "reset": true, 00:28:58.823 "compare": false, 00:28:58.823 "compare_and_write": false, 00:28:58.823 "abort": true, 00:28:58.823 "nvme_admin": false, 00:28:58.823 "nvme_io": false 00:28:58.823 }, 00:28:58.823 "memory_domains": [ 00:28:58.823 { 00:28:58.823 "dma_device_id": "system", 00:28:58.823 "dma_device_type": 1 00:28:58.823 }, 00:28:58.823 { 00:28:58.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:58.823 "dma_device_type": 2 00:28:58.823 } 00:28:58.823 ], 00:28:58.823 "driver_specific": {} 00:28:58.823 }' 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:58.823 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:59.082 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:59.082 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:59.082 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:59.082 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:59.340 "name": "BaseBdev2", 00:28:59.340 "aliases": [ 00:28:59.340 "f05fa604-33b9-475a-be6e-c4ec25b67c00" 00:28:59.340 ], 00:28:59.340 "product_name": "Malloc disk", 00:28:59.340 "block_size": 512, 00:28:59.340 "num_blocks": 65536, 00:28:59.340 "uuid": "f05fa604-33b9-475a-be6e-c4ec25b67c00", 00:28:59.340 "assigned_rate_limits": { 00:28:59.340 "rw_ios_per_sec": 0, 00:28:59.340 "rw_mbytes_per_sec": 0, 00:28:59.340 "r_mbytes_per_sec": 0, 00:28:59.340 "w_mbytes_per_sec": 0 00:28:59.340 }, 00:28:59.340 "claimed": true, 00:28:59.340 "claim_type": "exclusive_write", 
00:28:59.340 "zoned": false, 00:28:59.340 "supported_io_types": { 00:28:59.340 "read": true, 00:28:59.340 "write": true, 00:28:59.340 "unmap": true, 00:28:59.340 "write_zeroes": true, 00:28:59.340 "flush": true, 00:28:59.340 "reset": true, 00:28:59.340 "compare": false, 00:28:59.340 "compare_and_write": false, 00:28:59.340 "abort": true, 00:28:59.340 "nvme_admin": false, 00:28:59.340 "nvme_io": false 00:28:59.340 }, 00:28:59.340 "memory_domains": [ 00:28:59.340 { 00:28:59.340 "dma_device_id": "system", 00:28:59.340 "dma_device_type": 1 00:28:59.340 }, 00:28:59.340 { 00:28:59.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:59.340 "dma_device_type": 2 00:28:59.340 } 00:28:59.340 ], 00:28:59.340 "driver_specific": {} 00:28:59.340 }' 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:59.340 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:59.603 12:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:59.603 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:59.603 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:59.603 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:28:59.603 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:59.603 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:59.603 "name": "BaseBdev3", 00:28:59.603 "aliases": [ 00:28:59.603 "0ea3034a-45af-434e-bc9c-04f2b8387d8a" 00:28:59.603 ], 00:28:59.603 "product_name": "Malloc disk", 00:28:59.603 "block_size": 512, 00:28:59.603 "num_blocks": 65536, 00:28:59.603 "uuid": "0ea3034a-45af-434e-bc9c-04f2b8387d8a", 00:28:59.603 "assigned_rate_limits": { 00:28:59.603 "rw_ios_per_sec": 0, 00:28:59.603 "rw_mbytes_per_sec": 0, 00:28:59.603 "r_mbytes_per_sec": 0, 00:28:59.603 "w_mbytes_per_sec": 0 00:28:59.603 }, 00:28:59.603 "claimed": true, 00:28:59.603 "claim_type": "exclusive_write", 00:28:59.603 "zoned": false, 00:28:59.603 "supported_io_types": { 00:28:59.603 "read": true, 00:28:59.603 "write": true, 00:28:59.603 "unmap": true, 00:28:59.603 "write_zeroes": true, 00:28:59.603 "flush": true, 00:28:59.603 "reset": true, 00:28:59.603 "compare": false, 00:28:59.603 "compare_and_write": false, 00:28:59.603 "abort": true, 00:28:59.603 "nvme_admin": false, 00:28:59.603 "nvme_io": false 00:28:59.603 }, 00:28:59.603 "memory_domains": [ 00:28:59.603 { 00:28:59.603 "dma_device_id": 
"system", 00:28:59.603 "dma_device_type": 1 00:28:59.603 }, 00:28:59.603 { 00:28:59.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:59.603 "dma_device_type": 2 00:28:59.603 } 00:28:59.603 ], 00:28:59.603 "driver_specific": {} 00:28:59.603 }' 00:28:59.603 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:59.864 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:00.122 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:00.122 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:00.122 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:00.122 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:00.122 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:29:00.122 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:00.381 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:00.381 "name": "BaseBdev4", 00:29:00.381 "aliases": [ 00:29:00.381 "e4bf9b86-3e7e-468a-8bb6-678c9c488d03" 00:29:00.381 ], 00:29:00.381 "product_name": "Malloc disk", 00:29:00.381 "block_size": 512, 00:29:00.381 "num_blocks": 65536, 00:29:00.381 "uuid": "e4bf9b86-3e7e-468a-8bb6-678c9c488d03", 00:29:00.381 "assigned_rate_limits": { 00:29:00.381 "rw_ios_per_sec": 0, 00:29:00.381 "rw_mbytes_per_sec": 0, 00:29:00.381 "r_mbytes_per_sec": 0, 00:29:00.381 "w_mbytes_per_sec": 0 00:29:00.381 }, 00:29:00.381 "claimed": true, 00:29:00.381 "claim_type": "exclusive_write", 00:29:00.381 "zoned": false, 00:29:00.381 "supported_io_types": { 00:29:00.381 "read": true, 00:29:00.381 "write": true, 00:29:00.381 "unmap": true, 00:29:00.381 "write_zeroes": true, 00:29:00.381 "flush": true, 00:29:00.381 "reset": true, 00:29:00.381 "compare": false, 00:29:00.381 "compare_and_write": false, 00:29:00.381 "abort": true, 00:29:00.381 "nvme_admin": false, 00:29:00.381 "nvme_io": false 00:29:00.381 }, 00:29:00.381 "memory_domains": [ 00:29:00.381 { 00:29:00.381 "dma_device_id": "system", 00:29:00.381 "dma_device_type": 1 00:29:00.381 }, 00:29:00.381 { 00:29:00.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:00.381 "dma_device_type": 2 00:29:00.381 } 00:29:00.381 ], 00:29:00.382 "driver_specific": {} 00:29:00.382 }' 00:29:00.382 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:00.382 12:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:00.382 12:31:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:00.382 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:00.641 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:01.209 [2024-06-07 12:31:24.563257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:01.209 [2024-06-07 12:31:24.563318] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:01.209 [2024-06-07 12:31:24.563398] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:01.209 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.468 
12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:01.468 "name": "Existed_Raid", 00:29:01.468 "uuid": "31aa147e-6636-49e3-9b3a-45b287867999", 00:29:01.468 "strip_size_kb": 64, 00:29:01.468 "state": "offline", 00:29:01.468 "raid_level": "raid0", 00:29:01.468 "superblock": false, 00:29:01.468 "num_base_bdevs": 4, 00:29:01.468 "num_base_bdevs_discovered": 3, 00:29:01.468 "num_base_bdevs_operational": 3, 00:29:01.468 "base_bdevs_list": [ 00:29:01.468 { 00:29:01.468 "name": null, 00:29:01.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:01.468 "is_configured": false, 00:29:01.468 "data_offset": 0, 00:29:01.468 "data_size": 65536 00:29:01.468 }, 00:29:01.468 { 00:29:01.468 "name": "BaseBdev2", 00:29:01.468 "uuid": "f05fa604-33b9-475a-be6e-c4ec25b67c00", 00:29:01.468 "is_configured": true, 00:29:01.469 "data_offset": 0, 00:29:01.469 "data_size": 65536 00:29:01.469 }, 00:29:01.469 { 00:29:01.469 "name": "BaseBdev3", 00:29:01.469 "uuid": "0ea3034a-45af-434e-bc9c-04f2b8387d8a", 00:29:01.469 "is_configured": true, 00:29:01.469 "data_offset": 0, 00:29:01.469 "data_size": 65536 00:29:01.469 }, 00:29:01.469 { 00:29:01.469 "name": "BaseBdev4", 00:29:01.469 "uuid": "e4bf9b86-3e7e-468a-8bb6-678c9c488d03", 00:29:01.469 "is_configured": true, 00:29:01.469 "data_offset": 0, 00:29:01.469 "data_size": 65536 00:29:01.469 } 00:29:01.469 ] 00:29:01.469 }' 00:29:01.469 12:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:01.469 12:31:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:02.035 12:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:02.035 12:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:02.035 12:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.035 12:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:02.293 12:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:02.293 12:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:02.293 12:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:02.551 [2024-06-07 12:31:25.970366] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:02.551 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:02.551 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:02.551 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.551 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:02.809 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:02.809 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:02.809 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev3 00:29:03.068 [2024-06-07 12:31:26.487678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:29:03.068 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:03.068 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:03.068 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.068 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:03.326 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:03.326 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:03.326 12:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:29:03.585 [2024-06-07 12:31:27.069364] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:29:03.585 [2024-06-07 12:31:27.069437] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:29:03.585 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:03.585 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:03.585 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:03.585 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.842 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:03.842 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:03.842 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:29:03.842 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:29:03.842 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:03.842 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:29:04.100 BaseBdev2 00:29:04.101 12:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:29:04.101 12:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:29:04.101 12:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:04.101 12:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:04.101 12:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:04.101 12:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:04.101 12:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:04.361 12:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:04.361 [ 00:29:04.361 { 00:29:04.361 "name": "BaseBdev2", 00:29:04.361 "aliases": [ 00:29:04.361 "bd5d6128-46cd-4871-9ead-29c5339562f4" 00:29:04.361 ], 00:29:04.361 "product_name": "Malloc disk", 00:29:04.361 "block_size": 512, 00:29:04.361 "num_blocks": 65536, 00:29:04.361 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:04.361 "assigned_rate_limits": { 00:29:04.361 "rw_ios_per_sec": 0, 00:29:04.361 "rw_mbytes_per_sec": 0, 00:29:04.361 "r_mbytes_per_sec": 0, 00:29:04.361 "w_mbytes_per_sec": 0 00:29:04.361 }, 00:29:04.361 "claimed": false, 00:29:04.361 "zoned": false, 00:29:04.361 "supported_io_types": { 00:29:04.361 "read": true, 00:29:04.361 "write": true, 00:29:04.361 "unmap": true, 00:29:04.361 "write_zeroes": true, 00:29:04.361 "flush": true, 00:29:04.361 "reset": true, 00:29:04.361 "compare": false, 00:29:04.361 "compare_and_write": false, 00:29:04.361 "abort": true, 00:29:04.361 "nvme_admin": false, 00:29:04.361 "nvme_io": false 00:29:04.361 }, 00:29:04.361 "memory_domains": [ 00:29:04.361 { 00:29:04.361 "dma_device_id": "system", 00:29:04.361 "dma_device_type": 1 00:29:04.361 }, 00:29:04.361 { 00:29:04.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:04.361 "dma_device_type": 2 00:29:04.361 } 00:29:04.361 ], 00:29:04.361 "driver_specific": {} 00:29:04.361 } 00:29:04.361 ] 00:29:04.618 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:04.618 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:04.618 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:04.618 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:29:04.618 BaseBdev3 00:29:04.618 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:29:04.619 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:29:04.619 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:04.619 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:04.619 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:04.619 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:04.619 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:04.876 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:29:05.133 [ 00:29:05.133 { 00:29:05.133 "name": "BaseBdev3", 00:29:05.133 "aliases": [ 00:29:05.133 "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d" 00:29:05.133 ], 00:29:05.133 "product_name": "Malloc disk", 00:29:05.133 "block_size": 512, 00:29:05.133 "num_blocks": 65536, 00:29:05.133 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:05.133 "assigned_rate_limits": { 00:29:05.133 "rw_ios_per_sec": 0, 00:29:05.133 "rw_mbytes_per_sec": 0, 00:29:05.133 "r_mbytes_per_sec": 0, 00:29:05.133 "w_mbytes_per_sec": 0 00:29:05.133 }, 00:29:05.133 
"claimed": false, 00:29:05.133 "zoned": false, 00:29:05.133 "supported_io_types": { 00:29:05.133 "read": true, 00:29:05.133 "write": true, 00:29:05.133 "unmap": true, 00:29:05.133 "write_zeroes": true, 00:29:05.133 "flush": true, 00:29:05.133 "reset": true, 00:29:05.133 "compare": false, 00:29:05.133 "compare_and_write": false, 00:29:05.133 "abort": true, 00:29:05.133 "nvme_admin": false, 00:29:05.133 "nvme_io": false 00:29:05.133 }, 00:29:05.133 "memory_domains": [ 00:29:05.133 { 00:29:05.133 "dma_device_id": "system", 00:29:05.133 "dma_device_type": 1 00:29:05.133 }, 00:29:05.133 { 00:29:05.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:05.133 "dma_device_type": 2 00:29:05.133 } 00:29:05.133 ], 00:29:05.133 "driver_specific": {} 00:29:05.133 } 00:29:05.133 ] 00:29:05.133 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:05.133 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:05.133 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:05.133 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:29:05.391 BaseBdev4 00:29:05.391 12:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:29:05.391 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:29:05.391 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:05.391 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:05.391 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:05.391 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:05.391 12:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:05.650 12:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:29:05.909 [ 00:29:05.909 { 00:29:05.909 "name": "BaseBdev4", 00:29:05.909 "aliases": [ 00:29:05.909 "3966e7b4-ee4e-49e6-a837-3f87524c8213" 00:29:05.909 ], 00:29:05.909 "product_name": "Malloc disk", 00:29:05.909 "block_size": 512, 00:29:05.909 "num_blocks": 65536, 00:29:05.909 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:05.909 "assigned_rate_limits": { 00:29:05.909 "rw_ios_per_sec": 0, 00:29:05.909 "rw_mbytes_per_sec": 0, 00:29:05.909 "r_mbytes_per_sec": 0, 00:29:05.909 "w_mbytes_per_sec": 0 00:29:05.909 }, 00:29:05.909 "claimed": false, 00:29:05.909 "zoned": false, 00:29:05.909 "supported_io_types": { 00:29:05.909 "read": true, 00:29:05.909 "write": true, 00:29:05.909 "unmap": true, 00:29:05.909 "write_zeroes": true, 00:29:05.909 "flush": true, 00:29:05.909 "reset": true, 00:29:05.909 "compare": false, 00:29:05.909 "compare_and_write": false, 00:29:05.909 "abort": true, 00:29:05.909 "nvme_admin": false, 00:29:05.909 "nvme_io": false 00:29:05.909 }, 00:29:05.909 "memory_domains": [ 00:29:05.909 { 00:29:05.909 "dma_device_id": "system", 00:29:05.909 "dma_device_type": 1 00:29:05.909 }, 00:29:05.909 { 00:29:05.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:29:05.909 "dma_device_type": 2 00:29:05.909 } 00:29:05.909 ], 00:29:05.909 "driver_specific": {} 00:29:05.909 } 00:29:05.909 ] 00:29:05.909 12:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:05.909 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:05.909 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:05.909 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:29:06.172 [2024-06-07 12:31:29.630062] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:06.172 [2024-06-07 12:31:29.630188] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:06.172 [2024-06-07 12:31:29.630246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:06.172 [2024-06-07 12:31:29.632928] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:06.172 [2024-06-07 12:31:29.632990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.172 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:06.432 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.432 "name": "Existed_Raid", 00:29:06.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:06.432 "strip_size_kb": 64, 00:29:06.432 "state": "configuring", 00:29:06.432 "raid_level": "raid0", 00:29:06.432 "superblock": false, 00:29:06.432 "num_base_bdevs": 4, 00:29:06.432 "num_base_bdevs_discovered": 3, 00:29:06.432 "num_base_bdevs_operational": 4, 00:29:06.432 "base_bdevs_list": [ 00:29:06.432 { 00:29:06.432 "name": "BaseBdev1", 00:29:06.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:06.432 "is_configured": false, 00:29:06.432 "data_offset": 0, 00:29:06.432 "data_size": 0 00:29:06.432 }, 00:29:06.432 { 
00:29:06.432 "name": "BaseBdev2", 00:29:06.432 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:06.432 "is_configured": true, 00:29:06.432 "data_offset": 0, 00:29:06.432 "data_size": 65536 00:29:06.432 }, 00:29:06.432 { 00:29:06.432 "name": "BaseBdev3", 00:29:06.432 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:06.432 "is_configured": true, 00:29:06.432 "data_offset": 0, 00:29:06.432 "data_size": 65536 00:29:06.432 }, 00:29:06.432 { 00:29:06.432 "name": "BaseBdev4", 00:29:06.432 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:06.432 "is_configured": true, 00:29:06.432 "data_offset": 0, 00:29:06.432 "data_size": 65536 00:29:06.432 } 00:29:06.432 ] 00:29:06.432 }' 00:29:06.432 12:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.432 12:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:07.007 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:29:07.007 [2024-06-07 12:31:30.638152] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.274 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:07.545 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.545 "name": "Existed_Raid", 00:29:07.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.545 "strip_size_kb": 64, 00:29:07.545 "state": "configuring", 00:29:07.545 "raid_level": "raid0", 00:29:07.545 "superblock": false, 00:29:07.545 "num_base_bdevs": 4, 00:29:07.545 "num_base_bdevs_discovered": 2, 00:29:07.545 "num_base_bdevs_operational": 4, 00:29:07.545 "base_bdevs_list": [ 00:29:07.545 { 00:29:07.545 "name": "BaseBdev1", 00:29:07.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.545 "is_configured": false, 00:29:07.545 "data_offset": 0, 00:29:07.545 "data_size": 0 00:29:07.545 }, 00:29:07.545 { 00:29:07.545 "name": null, 00:29:07.545 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:07.545 "is_configured": false, 00:29:07.545 
"data_offset": 0, 00:29:07.545 "data_size": 65536 00:29:07.545 }, 00:29:07.545 { 00:29:07.545 "name": "BaseBdev3", 00:29:07.545 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:07.545 "is_configured": true, 00:29:07.545 "data_offset": 0, 00:29:07.545 "data_size": 65536 00:29:07.545 }, 00:29:07.545 { 00:29:07.545 "name": "BaseBdev4", 00:29:07.545 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:07.545 "is_configured": true, 00:29:07.545 "data_offset": 0, 00:29:07.545 "data_size": 65536 00:29:07.545 } 00:29:07.545 ] 00:29:07.545 }' 00:29:07.545 12:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.545 12:31:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:08.130 12:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:29:08.130 12:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.388 12:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:29:08.388 12:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:29:08.647 [2024-06-07 12:31:32.232413] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:08.647 BaseBdev1 00:29:08.647 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:29:08.647 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:29:08.647 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:08.647 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:29:08.647 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:08.647 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:08.647 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:09.214 [ 00:29:09.214 { 00:29:09.214 "name": "BaseBdev1", 00:29:09.214 "aliases": [ 00:29:09.214 "763e97f4-f4fe-4447-a00e-3e2017d486f2" 00:29:09.214 ], 00:29:09.214 "product_name": "Malloc disk", 00:29:09.214 "block_size": 512, 00:29:09.214 "num_blocks": 65536, 00:29:09.214 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:09.214 "assigned_rate_limits": { 00:29:09.214 "rw_ios_per_sec": 0, 00:29:09.214 "rw_mbytes_per_sec": 0, 00:29:09.214 "r_mbytes_per_sec": 0, 00:29:09.214 "w_mbytes_per_sec": 0 00:29:09.214 }, 00:29:09.214 "claimed": true, 00:29:09.214 "claim_type": "exclusive_write", 00:29:09.214 "zoned": false, 00:29:09.214 "supported_io_types": { 00:29:09.214 "read": true, 00:29:09.214 "write": true, 00:29:09.214 "unmap": true, 00:29:09.214 "write_zeroes": true, 00:29:09.214 "flush": true, 00:29:09.214 "reset": true, 00:29:09.214 "compare": false, 00:29:09.214 "compare_and_write": false, 00:29:09.214 "abort": true, 00:29:09.214 "nvme_admin": false, 
00:29:09.214 "nvme_io": false 00:29:09.214 }, 00:29:09.214 "memory_domains": [ 00:29:09.214 { 00:29:09.214 "dma_device_id": "system", 00:29:09.214 "dma_device_type": 1 00:29:09.214 }, 00:29:09.214 { 00:29:09.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.214 "dma_device_type": 2 00:29:09.214 } 00:29:09.214 ], 00:29:09.214 "driver_specific": {} 00:29:09.214 } 00:29:09.214 ] 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.214 12:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:09.472 12:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:09.472 "name": "Existed_Raid", 00:29:09.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:09.472 "strip_size_kb": 64, 00:29:09.472 "state": "configuring", 00:29:09.472 "raid_level": "raid0", 00:29:09.472 "superblock": false, 00:29:09.472 "num_base_bdevs": 4, 00:29:09.472 "num_base_bdevs_discovered": 3, 00:29:09.472 "num_base_bdevs_operational": 4, 00:29:09.472 "base_bdevs_list": [ 00:29:09.472 { 00:29:09.472 "name": "BaseBdev1", 00:29:09.472 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:09.472 "is_configured": true, 00:29:09.472 "data_offset": 0, 00:29:09.472 "data_size": 65536 00:29:09.472 }, 00:29:09.472 { 00:29:09.472 "name": null, 00:29:09.472 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:09.472 "is_configured": false, 00:29:09.472 "data_offset": 0, 00:29:09.472 "data_size": 65536 00:29:09.472 }, 00:29:09.472 { 00:29:09.473 "name": "BaseBdev3", 00:29:09.473 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:09.473 "is_configured": true, 00:29:09.473 "data_offset": 0, 00:29:09.473 "data_size": 65536 00:29:09.473 }, 00:29:09.473 { 00:29:09.473 "name": "BaseBdev4", 00:29:09.473 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:09.473 "is_configured": true, 00:29:09.473 "data_offset": 0, 00:29:09.473 "data_size": 65536 00:29:09.473 } 00:29:09.473 ] 00:29:09.473 }' 00:29:09.473 12:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:09.473 12:31:33 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:29:10.118 12:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.118 12:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:29:10.684 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:29:10.684 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:29:10.684 [2024-06-07 12:31:34.313853] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.944 "name": "Existed_Raid", 00:29:10.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:10.944 "strip_size_kb": 64, 00:29:10.944 "state": "configuring", 00:29:10.944 "raid_level": "raid0", 00:29:10.944 "superblock": false, 00:29:10.944 "num_base_bdevs": 4, 00:29:10.944 "num_base_bdevs_discovered": 2, 00:29:10.944 "num_base_bdevs_operational": 4, 00:29:10.944 "base_bdevs_list": [ 00:29:10.944 { 00:29:10.944 "name": "BaseBdev1", 00:29:10.944 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:10.944 "is_configured": true, 00:29:10.944 "data_offset": 0, 00:29:10.944 "data_size": 65536 00:29:10.944 }, 00:29:10.944 { 00:29:10.944 "name": null, 00:29:10.944 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:10.944 "is_configured": false, 00:29:10.944 "data_offset": 0, 00:29:10.944 "data_size": 65536 00:29:10.944 }, 00:29:10.944 { 00:29:10.944 "name": null, 00:29:10.944 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:10.944 "is_configured": false, 00:29:10.944 "data_offset": 0, 00:29:10.944 "data_size": 65536 00:29:10.944 }, 00:29:10.944 { 00:29:10.944 "name": "BaseBdev4", 00:29:10.944 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:10.944 "is_configured": true, 
00:29:10.944 "data_offset": 0, 00:29:10.944 "data_size": 65536 00:29:10.944 } 00:29:10.944 ] 00:29:10.944 }' 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.944 12:31:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:11.880 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.880 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:29:11.880 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:29:11.880 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:29:12.139 [2024-06-07 12:31:35.676749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:12.139 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.398 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.398 "name": "Existed_Raid", 00:29:12.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:12.398 "strip_size_kb": 64, 00:29:12.398 "state": "configuring", 00:29:12.398 "raid_level": "raid0", 00:29:12.398 "superblock": false, 00:29:12.398 "num_base_bdevs": 4, 00:29:12.398 "num_base_bdevs_discovered": 3, 00:29:12.398 "num_base_bdevs_operational": 4, 00:29:12.398 "base_bdevs_list": [ 00:29:12.398 { 00:29:12.398 "name": "BaseBdev1", 00:29:12.398 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:12.398 "is_configured": true, 00:29:12.398 "data_offset": 0, 00:29:12.398 "data_size": 65536 00:29:12.398 }, 00:29:12.398 { 00:29:12.398 "name": null, 00:29:12.398 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:12.398 "is_configured": false, 00:29:12.398 "data_offset": 0, 00:29:12.398 "data_size": 65536 00:29:12.398 }, 00:29:12.398 { 00:29:12.398 "name": "BaseBdev3", 00:29:12.398 
"uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:12.398 "is_configured": true, 00:29:12.398 "data_offset": 0, 00:29:12.398 "data_size": 65536 00:29:12.398 }, 00:29:12.398 { 00:29:12.398 "name": "BaseBdev4", 00:29:12.398 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:12.398 "is_configured": true, 00:29:12.398 "data_offset": 0, 00:29:12.398 "data_size": 65536 00:29:12.398 } 00:29:12.398 ] 00:29:12.398 }' 00:29:12.398 12:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.398 12:31:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:12.964 12:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.964 12:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:29:13.224 12:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:29:13.224 12:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:13.483 [2024-06-07 12:31:37.108920] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:13.743 "name": "Existed_Raid", 00:29:13.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:13.743 "strip_size_kb": 64, 00:29:13.743 "state": "configuring", 00:29:13.743 "raid_level": "raid0", 00:29:13.743 "superblock": false, 00:29:13.743 "num_base_bdevs": 4, 00:29:13.743 "num_base_bdevs_discovered": 2, 00:29:13.743 "num_base_bdevs_operational": 4, 00:29:13.743 "base_bdevs_list": [ 00:29:13.743 { 00:29:13.743 "name": null, 00:29:13.743 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:13.743 "is_configured": false, 00:29:13.743 "data_offset": 0, 00:29:13.743 "data_size": 65536 00:29:13.743 }, 00:29:13.743 { 
00:29:13.743 "name": null, 00:29:13.743 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:13.743 "is_configured": false, 00:29:13.743 "data_offset": 0, 00:29:13.743 "data_size": 65536 00:29:13.743 }, 00:29:13.743 { 00:29:13.743 "name": "BaseBdev3", 00:29:13.743 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:13.743 "is_configured": true, 00:29:13.743 "data_offset": 0, 00:29:13.743 "data_size": 65536 00:29:13.743 }, 00:29:13.743 { 00:29:13.743 "name": "BaseBdev4", 00:29:13.743 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:13.743 "is_configured": true, 00:29:13.743 "data_offset": 0, 00:29:13.743 "data_size": 65536 00:29:13.743 } 00:29:13.743 ] 00:29:13.743 }' 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:13.743 12:31:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:14.678 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:14.678 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:29:14.936 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:29:14.936 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:29:15.194 [2024-06-07 12:31:38.645377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.194 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:15.453 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:15.453 "name": "Existed_Raid", 00:29:15.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:15.453 "strip_size_kb": 64, 00:29:15.453 "state": "configuring", 00:29:15.453 "raid_level": "raid0", 00:29:15.453 "superblock": false, 00:29:15.453 "num_base_bdevs": 4, 00:29:15.453 "num_base_bdevs_discovered": 3, 00:29:15.453 
"num_base_bdevs_operational": 4, 00:29:15.453 "base_bdevs_list": [ 00:29:15.453 { 00:29:15.453 "name": null, 00:29:15.453 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:15.453 "is_configured": false, 00:29:15.453 "data_offset": 0, 00:29:15.453 "data_size": 65536 00:29:15.453 }, 00:29:15.453 { 00:29:15.453 "name": "BaseBdev2", 00:29:15.453 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:15.453 "is_configured": true, 00:29:15.453 "data_offset": 0, 00:29:15.453 "data_size": 65536 00:29:15.453 }, 00:29:15.453 { 00:29:15.453 "name": "BaseBdev3", 00:29:15.453 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:15.453 "is_configured": true, 00:29:15.453 "data_offset": 0, 00:29:15.453 "data_size": 65536 00:29:15.453 }, 00:29:15.453 { 00:29:15.453 "name": "BaseBdev4", 00:29:15.453 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:15.453 "is_configured": true, 00:29:15.453 "data_offset": 0, 00:29:15.453 "data_size": 65536 00:29:15.453 } 00:29:15.453 ] 00:29:15.453 }' 00:29:15.453 12:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:15.453 12:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:16.020 12:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.020 12:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:29:16.278 12:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:29:16.278 12:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:29:16.278 12:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.536 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 763e97f4-f4fe-4447-a00e-3e2017d486f2 00:29:16.795 [2024-06-07 12:31:40.329780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:29:16.795 [2024-06-07 12:31:40.330108] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:29:16.795 [2024-06-07 12:31:40.330172] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:29:16.795 [2024-06-07 12:31:40.330435] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:29:16.795 [2024-06-07 12:31:40.330904] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:29:16.795 [2024-06-07 12:31:40.331052] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000008180 00:29:16.795 [2024-06-07 12:31:40.331370] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:16.795 NewBaseBdev 00:29:16.795 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:29:16.795 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:29:16.795 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:16.795 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 
00:29:16.795 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:16.795 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:16.795 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:17.054 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:29:17.312 [ 00:29:17.312 { 00:29:17.312 "name": "NewBaseBdev", 00:29:17.312 "aliases": [ 00:29:17.312 "763e97f4-f4fe-4447-a00e-3e2017d486f2" 00:29:17.312 ], 00:29:17.312 "product_name": "Malloc disk", 00:29:17.312 "block_size": 512, 00:29:17.312 "num_blocks": 65536, 00:29:17.312 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:17.312 "assigned_rate_limits": { 00:29:17.312 "rw_ios_per_sec": 0, 00:29:17.312 "rw_mbytes_per_sec": 0, 00:29:17.312 "r_mbytes_per_sec": 0, 00:29:17.312 "w_mbytes_per_sec": 0 00:29:17.312 }, 00:29:17.312 "claimed": true, 00:29:17.312 "claim_type": "exclusive_write", 00:29:17.312 "zoned": false, 00:29:17.312 "supported_io_types": { 00:29:17.312 "read": true, 00:29:17.312 "write": true, 00:29:17.312 "unmap": true, 00:29:17.312 "write_zeroes": true, 00:29:17.312 "flush": true, 00:29:17.312 "reset": true, 00:29:17.312 "compare": false, 00:29:17.312 "compare_and_write": false, 00:29:17.312 "abort": true, 00:29:17.312 "nvme_admin": false, 00:29:17.312 "nvme_io": false 00:29:17.312 }, 00:29:17.312 "memory_domains": [ 00:29:17.312 { 00:29:17.312 "dma_device_id": "system", 00:29:17.312 "dma_device_type": 1 00:29:17.312 }, 00:29:17.312 { 00:29:17.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:17.312 "dma_device_type": 2 00:29:17.312 } 00:29:17.312 ], 00:29:17.312 "driver_specific": {} 00:29:17.312 } 00:29:17.312 ] 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.312 12:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:17.570 
12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.570 "name": "Existed_Raid", 00:29:17.570 "uuid": "8f527e0f-bde0-4e73-a244-8e1e45ee2ac8", 00:29:17.570 "strip_size_kb": 64, 00:29:17.570 "state": "online", 00:29:17.570 "raid_level": "raid0", 00:29:17.570 "superblock": false, 00:29:17.570 "num_base_bdevs": 4, 00:29:17.571 "num_base_bdevs_discovered": 4, 00:29:17.571 "num_base_bdevs_operational": 4, 00:29:17.571 "base_bdevs_list": [ 00:29:17.571 { 00:29:17.571 "name": "NewBaseBdev", 00:29:17.571 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:17.571 "is_configured": true, 00:29:17.571 "data_offset": 0, 00:29:17.571 "data_size": 65536 00:29:17.571 }, 00:29:17.571 { 00:29:17.571 "name": "BaseBdev2", 00:29:17.571 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:17.571 "is_configured": true, 00:29:17.571 "data_offset": 0, 00:29:17.571 "data_size": 65536 00:29:17.571 }, 00:29:17.571 { 00:29:17.571 "name": "BaseBdev3", 00:29:17.571 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:17.571 "is_configured": true, 00:29:17.571 "data_offset": 0, 00:29:17.571 "data_size": 65536 00:29:17.571 }, 00:29:17.571 { 00:29:17.571 "name": "BaseBdev4", 00:29:17.571 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:17.571 "is_configured": true, 00:29:17.571 "data_offset": 0, 00:29:17.571 "data_size": 65536 00:29:17.571 } 00:29:17.571 ] 00:29:17.571 }' 00:29:17.571 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.571 12:31:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:18.146 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:18.404 [2024-06-07 12:31:41.914051] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:18.404 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:18.404 "name": "Existed_Raid", 00:29:18.404 "aliases": [ 00:29:18.404 "8f527e0f-bde0-4e73-a244-8e1e45ee2ac8" 00:29:18.404 ], 00:29:18.404 "product_name": "Raid Volume", 00:29:18.404 "block_size": 512, 00:29:18.404 "num_blocks": 262144, 00:29:18.404 "uuid": "8f527e0f-bde0-4e73-a244-8e1e45ee2ac8", 00:29:18.404 "assigned_rate_limits": { 00:29:18.404 "rw_ios_per_sec": 0, 00:29:18.404 "rw_mbytes_per_sec": 0, 00:29:18.404 "r_mbytes_per_sec": 0, 00:29:18.404 "w_mbytes_per_sec": 0 00:29:18.404 }, 00:29:18.404 "claimed": false, 00:29:18.404 "zoned": false, 00:29:18.404 "supported_io_types": { 00:29:18.404 "read": true, 00:29:18.404 "write": true, 00:29:18.404 "unmap": true, 00:29:18.404 "write_zeroes": true, 00:29:18.404 "flush": true, 
00:29:18.404 "reset": true, 00:29:18.404 "compare": false, 00:29:18.404 "compare_and_write": false, 00:29:18.404 "abort": false, 00:29:18.404 "nvme_admin": false, 00:29:18.404 "nvme_io": false 00:29:18.404 }, 00:29:18.404 "memory_domains": [ 00:29:18.404 { 00:29:18.404 "dma_device_id": "system", 00:29:18.404 "dma_device_type": 1 00:29:18.404 }, 00:29:18.404 { 00:29:18.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.404 "dma_device_type": 2 00:29:18.404 }, 00:29:18.404 { 00:29:18.404 "dma_device_id": "system", 00:29:18.404 "dma_device_type": 1 00:29:18.404 }, 00:29:18.404 { 00:29:18.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.404 "dma_device_type": 2 00:29:18.404 }, 00:29:18.404 { 00:29:18.404 "dma_device_id": "system", 00:29:18.404 "dma_device_type": 1 00:29:18.404 }, 00:29:18.404 { 00:29:18.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.404 "dma_device_type": 2 00:29:18.404 }, 00:29:18.404 { 00:29:18.404 "dma_device_id": "system", 00:29:18.404 "dma_device_type": 1 00:29:18.404 }, 00:29:18.404 { 00:29:18.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.404 "dma_device_type": 2 00:29:18.404 } 00:29:18.404 ], 00:29:18.404 "driver_specific": { 00:29:18.404 "raid": { 00:29:18.405 "uuid": "8f527e0f-bde0-4e73-a244-8e1e45ee2ac8", 00:29:18.405 "strip_size_kb": 64, 00:29:18.405 "state": "online", 00:29:18.405 "raid_level": "raid0", 00:29:18.405 "superblock": false, 00:29:18.405 "num_base_bdevs": 4, 00:29:18.405 "num_base_bdevs_discovered": 4, 00:29:18.405 "num_base_bdevs_operational": 4, 00:29:18.405 "base_bdevs_list": [ 00:29:18.405 { 00:29:18.405 "name": "NewBaseBdev", 00:29:18.405 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:18.405 "is_configured": true, 00:29:18.405 "data_offset": 0, 00:29:18.405 "data_size": 65536 00:29:18.405 }, 00:29:18.405 { 00:29:18.405 "name": "BaseBdev2", 00:29:18.405 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:18.405 "is_configured": true, 00:29:18.405 "data_offset": 0, 00:29:18.405 "data_size": 65536 00:29:18.405 }, 00:29:18.405 { 00:29:18.405 "name": "BaseBdev3", 00:29:18.405 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:18.405 "is_configured": true, 00:29:18.405 "data_offset": 0, 00:29:18.405 "data_size": 65536 00:29:18.405 }, 00:29:18.405 { 00:29:18.405 "name": "BaseBdev4", 00:29:18.405 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:18.405 "is_configured": true, 00:29:18.405 "data_offset": 0, 00:29:18.405 "data_size": 65536 00:29:18.405 } 00:29:18.405 ] 00:29:18.405 } 00:29:18.405 } 00:29:18.405 }' 00:29:18.405 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:18.405 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:29:18.405 BaseBdev2 00:29:18.405 BaseBdev3 00:29:18.405 BaseBdev4' 00:29:18.405 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:18.405 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:29:18.405 12:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:18.971 "name": "NewBaseBdev", 00:29:18.971 "aliases": [ 00:29:18.971 "763e97f4-f4fe-4447-a00e-3e2017d486f2" 00:29:18.971 ], 00:29:18.971 "product_name": 
"Malloc disk", 00:29:18.971 "block_size": 512, 00:29:18.971 "num_blocks": 65536, 00:29:18.971 "uuid": "763e97f4-f4fe-4447-a00e-3e2017d486f2", 00:29:18.971 "assigned_rate_limits": { 00:29:18.971 "rw_ios_per_sec": 0, 00:29:18.971 "rw_mbytes_per_sec": 0, 00:29:18.971 "r_mbytes_per_sec": 0, 00:29:18.971 "w_mbytes_per_sec": 0 00:29:18.971 }, 00:29:18.971 "claimed": true, 00:29:18.971 "claim_type": "exclusive_write", 00:29:18.971 "zoned": false, 00:29:18.971 "supported_io_types": { 00:29:18.971 "read": true, 00:29:18.971 "write": true, 00:29:18.971 "unmap": true, 00:29:18.971 "write_zeroes": true, 00:29:18.971 "flush": true, 00:29:18.971 "reset": true, 00:29:18.971 "compare": false, 00:29:18.971 "compare_and_write": false, 00:29:18.971 "abort": true, 00:29:18.971 "nvme_admin": false, 00:29:18.971 "nvme_io": false 00:29:18.971 }, 00:29:18.971 "memory_domains": [ 00:29:18.971 { 00:29:18.971 "dma_device_id": "system", 00:29:18.971 "dma_device_type": 1 00:29:18.971 }, 00:29:18.971 { 00:29:18.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.971 "dma_device_type": 2 00:29:18.971 } 00:29:18.971 ], 00:29:18.971 "driver_specific": {} 00:29:18.971 }' 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:18.971 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.229 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.229 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:19.229 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:19.229 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:19.229 12:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:19.487 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:19.487 "name": "BaseBdev2", 00:29:19.487 "aliases": [ 00:29:19.487 "bd5d6128-46cd-4871-9ead-29c5339562f4" 00:29:19.487 ], 00:29:19.487 "product_name": "Malloc disk", 00:29:19.487 "block_size": 512, 00:29:19.487 "num_blocks": 65536, 00:29:19.487 "uuid": "bd5d6128-46cd-4871-9ead-29c5339562f4", 00:29:19.487 "assigned_rate_limits": { 00:29:19.487 "rw_ios_per_sec": 0, 00:29:19.487 "rw_mbytes_per_sec": 0, 00:29:19.487 "r_mbytes_per_sec": 0, 00:29:19.487 "w_mbytes_per_sec": 0 00:29:19.487 }, 00:29:19.487 "claimed": true, 00:29:19.487 "claim_type": "exclusive_write", 00:29:19.487 "zoned": false, 00:29:19.487 "supported_io_types": { 00:29:19.487 "read": 
true, 00:29:19.487 "write": true, 00:29:19.487 "unmap": true, 00:29:19.487 "write_zeroes": true, 00:29:19.487 "flush": true, 00:29:19.487 "reset": true, 00:29:19.487 "compare": false, 00:29:19.487 "compare_and_write": false, 00:29:19.487 "abort": true, 00:29:19.487 "nvme_admin": false, 00:29:19.487 "nvme_io": false 00:29:19.487 }, 00:29:19.487 "memory_domains": [ 00:29:19.487 { 00:29:19.487 "dma_device_id": "system", 00:29:19.487 "dma_device_type": 1 00:29:19.487 }, 00:29:19.487 { 00:29:19.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:19.487 "dma_device_type": 2 00:29:19.487 } 00:29:19.487 ], 00:29:19.487 "driver_specific": {} 00:29:19.487 }' 00:29:19.487 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:19.487 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:19.487 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:19.487 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:19.745 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:19.746 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:20.312 "name": "BaseBdev3", 00:29:20.312 "aliases": [ 00:29:20.312 "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d" 00:29:20.312 ], 00:29:20.312 "product_name": "Malloc disk", 00:29:20.312 "block_size": 512, 00:29:20.312 "num_blocks": 65536, 00:29:20.312 "uuid": "d6a2c073-2c83-49e3-bdc8-4dc37b5d861d", 00:29:20.312 "assigned_rate_limits": { 00:29:20.312 "rw_ios_per_sec": 0, 00:29:20.312 "rw_mbytes_per_sec": 0, 00:29:20.312 "r_mbytes_per_sec": 0, 00:29:20.312 "w_mbytes_per_sec": 0 00:29:20.312 }, 00:29:20.312 "claimed": true, 00:29:20.312 "claim_type": "exclusive_write", 00:29:20.312 "zoned": false, 00:29:20.312 "supported_io_types": { 00:29:20.312 "read": true, 00:29:20.312 "write": true, 00:29:20.312 "unmap": true, 00:29:20.312 "write_zeroes": true, 00:29:20.312 "flush": true, 00:29:20.312 "reset": true, 00:29:20.312 "compare": false, 00:29:20.312 "compare_and_write": false, 00:29:20.312 "abort": true, 00:29:20.312 "nvme_admin": false, 00:29:20.312 "nvme_io": false 00:29:20.312 }, 00:29:20.312 "memory_domains": [ 00:29:20.312 { 00:29:20.312 "dma_device_id": "system", 00:29:20.312 "dma_device_type": 1 00:29:20.312 }, 00:29:20.312 { 00:29:20.312 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:20.312 "dma_device_type": 2 00:29:20.312 } 00:29:20.312 ], 00:29:20.312 "driver_specific": {} 00:29:20.312 }' 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:20.312 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:20.571 12:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:20.571 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:20.571 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:20.571 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:20.571 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:29:20.830 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:20.830 "name": "BaseBdev4", 00:29:20.830 "aliases": [ 00:29:20.830 "3966e7b4-ee4e-49e6-a837-3f87524c8213" 00:29:20.830 ], 00:29:20.830 "product_name": "Malloc disk", 00:29:20.830 "block_size": 512, 00:29:20.830 "num_blocks": 65536, 00:29:20.830 "uuid": "3966e7b4-ee4e-49e6-a837-3f87524c8213", 00:29:20.830 "assigned_rate_limits": { 00:29:20.830 "rw_ios_per_sec": 0, 00:29:20.830 "rw_mbytes_per_sec": 0, 00:29:20.830 "r_mbytes_per_sec": 0, 00:29:20.830 "w_mbytes_per_sec": 0 00:29:20.830 }, 00:29:20.830 "claimed": true, 00:29:20.830 "claim_type": "exclusive_write", 00:29:20.830 "zoned": false, 00:29:20.830 "supported_io_types": { 00:29:20.830 "read": true, 00:29:20.830 "write": true, 00:29:20.830 "unmap": true, 00:29:20.830 "write_zeroes": true, 00:29:20.830 "flush": true, 00:29:20.830 "reset": true, 00:29:20.830 "compare": false, 00:29:20.830 "compare_and_write": false, 00:29:20.830 "abort": true, 00:29:20.830 "nvme_admin": false, 00:29:20.830 "nvme_io": false 00:29:20.830 }, 00:29:20.830 "memory_domains": [ 00:29:20.830 { 00:29:20.830 "dma_device_id": "system", 00:29:20.830 "dma_device_type": 1 00:29:20.830 }, 00:29:20.830 { 00:29:20.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:20.830 "dma_device_type": 2 00:29:20.830 } 00:29:20.830 ], 00:29:20.830 "driver_specific": {} 00:29:20.830 }' 00:29:20.830 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:20.830 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:20.830 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:20.830 12:31:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:20.830 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:21.089 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:21.347 [2024-06-07 12:31:44.932713] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:21.347 [2024-06-07 12:31:44.932993] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:21.347 [2024-06-07 12:31:44.933199] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:21.347 [2024-06-07 12:31:44.933390] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:21.347 [2024-06-07 12:31:44.933488] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name Existed_Raid, state offline 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 211497 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 211497 ']' 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 211497 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 211497 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 211497' 00:29:21.347 killing process with pid 211497 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 211497 00:29:21.347 12:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 211497 00:29:21.347 [2024-06-07 12:31:44.990505] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:21.606 [2024-06-07 12:31:45.074478] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:21.864 12:31:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:29:21.864 00:29:21.864 real 0m34.726s 00:29:21.864 user 1m3.937s 00:29:21.864 sys 0m5.707s 00:29:21.864 12:31:45 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:29:21.864 12:31:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:29:21.864 ************************************ 00:29:21.864 END TEST raid_state_function_test 00:29:21.864 ************************************ 00:29:21.864 12:31:45 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:29:21.864 12:31:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:29:21.864 12:31:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:21.864 12:31:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:22.122 ************************************ 00:29:22.122 START TEST raid_state_function_test_sb 00:29:22.122 ************************************ 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 true 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev4 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:22.122 12:31:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=212596 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 212596' 00:29:22.122 Process raid pid: 212596 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 212596 /var/tmp/spdk-raid.sock 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 212596 ']' 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:22.122 12:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:22.123 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:22.123 12:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:22.123 12:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:22.123 [2024-06-07 12:31:45.561782] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
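A minimal, by-hand sketch of the RPC sequence this test exercises against the bdev_svc app started above — the rpc.py path, socket, sizes, flags, and bdev names are all taken from the trace; the RPC shell variable and the loop are illustrative only:

  # Talk to the bdev_svc instance listening on the raid test socket.
  RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Create four malloc base bdevs: 32 MB of 512-byte blocks each,
  # i.e. the 65536 num_blocks reported in the bdev dumps above.
  for i in 1 2 3 4; do
    $RPC bdev_malloc_create 32 512 -b "BaseBdev$i"
  done

  # Assemble them into a raid0 array with a 64 KiB strip size; -s writes
  # an on-disk superblock, which is why data_offset becomes 2048 blocks
  # and data_size shrinks to 63488 in this superblock variant of the test
  # (versus data_offset 0 / data_size 65536 in the non-superblock run).
  $RPC bdev_raid_create -z 64 -s -r raid0 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

  # Confirm the array came online, as verify_raid_bdev_state does.
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'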
00:29:22.123 [2024-06-07 12:31:45.562563] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:22.123 [2024-06-07 12:31:45.713381] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:22.382 [2024-06-07 12:31:45.813709] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:22.382 [2024-06-07 12:31:45.900874] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:29:23.359 [2024-06-07 12:31:46.877752] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:23.359 [2024-06-07 12:31:46.878524] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:23.359 [2024-06-07 12:31:46.878674] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:23.359 [2024-06-07 12:31:46.878810] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:23.359 [2024-06-07 12:31:46.878915] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:23.359 [2024-06-07 12:31:46.879071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:23.359 [2024-06-07 12:31:46.879206] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:29:23.359 [2024-06-07 12:31:46.879423] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.359 12:31:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:23.619 12:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:23.619 "name": "Existed_Raid", 00:29:23.619 "uuid": "2a45270c-1961-4fd9-b00b-95157509886b", 00:29:23.619 "strip_size_kb": 64, 00:29:23.619 "state": "configuring", 00:29:23.619 "raid_level": "raid0", 00:29:23.619 "superblock": true, 00:29:23.619 "num_base_bdevs": 4, 00:29:23.619 "num_base_bdevs_discovered": 0, 00:29:23.619 "num_base_bdevs_operational": 4, 00:29:23.619 "base_bdevs_list": [ 00:29:23.619 { 00:29:23.619 "name": "BaseBdev1", 00:29:23.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.619 "is_configured": false, 00:29:23.619 "data_offset": 0, 00:29:23.619 "data_size": 0 00:29:23.619 }, 00:29:23.619 { 00:29:23.619 "name": "BaseBdev2", 00:29:23.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.619 "is_configured": false, 00:29:23.619 "data_offset": 0, 00:29:23.619 "data_size": 0 00:29:23.619 }, 00:29:23.619 { 00:29:23.619 "name": "BaseBdev3", 00:29:23.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.619 "is_configured": false, 00:29:23.619 "data_offset": 0, 00:29:23.619 "data_size": 0 00:29:23.619 }, 00:29:23.619 { 00:29:23.619 "name": "BaseBdev4", 00:29:23.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.619 "is_configured": false, 00:29:23.619 "data_offset": 0, 00:29:23.619 "data_size": 0 00:29:23.619 } 00:29:23.619 ] 00:29:23.619 }' 00:29:23.619 12:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.619 12:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:24.185 12:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:24.444 [2024-06-07 12:31:48.033802] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:24.444 [2024-06-07 12:31:48.034189] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:29:24.444 12:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:29:24.703 [2024-06-07 12:31:48.285896] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:24.703 [2024-06-07 12:31:48.286684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:24.703 [2024-06-07 12:31:48.286847] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:24.703 [2024-06-07 12:31:48.287083] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:24.703 [2024-06-07 12:31:48.287214] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:24.703 [2024-06-07 12:31:48.287440] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:24.703 [2024-06-07 12:31:48.287578] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:29:24.703 [2024-06-07 12:31:48.287736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:29:24.703 12:31:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:29:25.270 [2024-06-07 12:31:48.633913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:25.270 BaseBdev1 00:29:25.270 12:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:25.270 12:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:29:25.270 12:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:25.270 12:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:25.270 12:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:25.270 12:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:25.270 12:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:25.528 12:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:25.786 [ 00:29:25.786 { 00:29:25.786 "name": "BaseBdev1", 00:29:25.786 "aliases": [ 00:29:25.786 "5c222d8c-2001-4054-b3e0-3702c25ab5a2" 00:29:25.786 ], 00:29:25.786 "product_name": "Malloc disk", 00:29:25.786 "block_size": 512, 00:29:25.786 "num_blocks": 65536, 00:29:25.786 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:25.786 "assigned_rate_limits": { 00:29:25.786 "rw_ios_per_sec": 0, 00:29:25.786 "rw_mbytes_per_sec": 0, 00:29:25.786 "r_mbytes_per_sec": 0, 00:29:25.786 "w_mbytes_per_sec": 0 00:29:25.786 }, 00:29:25.786 "claimed": true, 00:29:25.786 "claim_type": "exclusive_write", 00:29:25.786 "zoned": false, 00:29:25.786 "supported_io_types": { 00:29:25.786 "read": true, 00:29:25.786 "write": true, 00:29:25.786 "unmap": true, 00:29:25.786 "write_zeroes": true, 00:29:25.786 "flush": true, 00:29:25.786 "reset": true, 00:29:25.786 "compare": false, 00:29:25.786 "compare_and_write": false, 00:29:25.786 "abort": true, 00:29:25.786 "nvme_admin": false, 00:29:25.786 "nvme_io": false 00:29:25.786 }, 00:29:25.786 "memory_domains": [ 00:29:25.786 { 00:29:25.786 "dma_device_id": "system", 00:29:25.786 "dma_device_type": 1 00:29:25.786 }, 00:29:25.786 { 00:29:25.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:25.786 "dma_device_type": 2 00:29:25.786 } 00:29:25.786 ], 00:29:25.786 "driver_specific": {} 00:29:25.786 } 00:29:25.786 ] 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.786 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:26.045 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:26.045 "name": "Existed_Raid", 00:29:26.045 "uuid": "08a2ccc7-d8ed-4f1e-af9e-f02e24cb7f9a", 00:29:26.045 "strip_size_kb": 64, 00:29:26.045 "state": "configuring", 00:29:26.045 "raid_level": "raid0", 00:29:26.045 "superblock": true, 00:29:26.045 "num_base_bdevs": 4, 00:29:26.045 "num_base_bdevs_discovered": 1, 00:29:26.045 "num_base_bdevs_operational": 4, 00:29:26.045 "base_bdevs_list": [ 00:29:26.045 { 00:29:26.045 "name": "BaseBdev1", 00:29:26.045 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:26.045 "is_configured": true, 00:29:26.045 "data_offset": 2048, 00:29:26.045 "data_size": 63488 00:29:26.045 }, 00:29:26.045 { 00:29:26.045 "name": "BaseBdev2", 00:29:26.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.045 "is_configured": false, 00:29:26.045 "data_offset": 0, 00:29:26.045 "data_size": 0 00:29:26.045 }, 00:29:26.045 { 00:29:26.045 "name": "BaseBdev3", 00:29:26.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.045 "is_configured": false, 00:29:26.045 "data_offset": 0, 00:29:26.045 "data_size": 0 00:29:26.045 }, 00:29:26.045 { 00:29:26.045 "name": "BaseBdev4", 00:29:26.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.045 "is_configured": false, 00:29:26.045 "data_offset": 0, 00:29:26.045 "data_size": 0 00:29:26.045 } 00:29:26.045 ] 00:29:26.045 }' 00:29:26.045 12:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:26.045 12:31:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:26.612 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:26.871 [2024-06-07 12:31:50.426335] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:26.871 [2024-06-07 12:31:50.426694] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:29:26.871 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:29:27.438 [2024-06-07 12:31:50.834485] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:27.438 [2024-06-07 12:31:50.836782] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:27.438 [2024-06-07 12:31:50.837488] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:27.438 
[2024-06-07 12:31:50.837629] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:29:27.438 [2024-06-07 12:31:50.837769] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:29:27.438 [2024-06-07 12:31:50.837913] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:29:27.438 [2024-06-07 12:31:50.838029] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.438 12:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:27.697 12:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:27.697 "name": "Existed_Raid", 00:29:27.697 "uuid": "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a", 00:29:27.697 "strip_size_kb": 64, 00:29:27.697 "state": "configuring", 00:29:27.697 "raid_level": "raid0", 00:29:27.697 "superblock": true, 00:29:27.697 "num_base_bdevs": 4, 00:29:27.697 "num_base_bdevs_discovered": 1, 00:29:27.697 "num_base_bdevs_operational": 4, 00:29:27.697 "base_bdevs_list": [ 00:29:27.697 { 00:29:27.697 "name": "BaseBdev1", 00:29:27.697 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:27.697 "is_configured": true, 00:29:27.697 "data_offset": 2048, 00:29:27.697 "data_size": 63488 00:29:27.697 }, 00:29:27.697 { 00:29:27.697 "name": "BaseBdev2", 00:29:27.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:27.697 "is_configured": false, 00:29:27.697 "data_offset": 0, 00:29:27.697 "data_size": 0 00:29:27.697 }, 00:29:27.697 { 00:29:27.697 "name": "BaseBdev3", 00:29:27.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:27.697 "is_configured": false, 00:29:27.697 "data_offset": 0, 00:29:27.697 "data_size": 0 00:29:27.697 }, 00:29:27.697 { 00:29:27.697 "name": "BaseBdev4", 00:29:27.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:27.697 "is_configured": 
false, 00:29:27.697 "data_offset": 0, 00:29:27.697 "data_size": 0 00:29:27.697 } 00:29:27.697 ] 00:29:27.697 }' 00:29:27.697 12:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:27.697 12:31:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:28.263 12:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:29:28.829 [2024-06-07 12:31:52.314831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:28.829 BaseBdev2 00:29:28.829 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:28.829 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:29:28.829 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:28.829 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:28.829 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:28.829 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:28.829 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:29.087 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:29.345 [ 00:29:29.345 { 00:29:29.345 "name": "BaseBdev2", 00:29:29.345 "aliases": [ 00:29:29.345 "8e4d482e-c58f-4b52-a7c1-37a143b54fe0" 00:29:29.345 ], 00:29:29.345 "product_name": "Malloc disk", 00:29:29.345 "block_size": 512, 00:29:29.345 "num_blocks": 65536, 00:29:29.345 "uuid": "8e4d482e-c58f-4b52-a7c1-37a143b54fe0", 00:29:29.345 "assigned_rate_limits": { 00:29:29.345 "rw_ios_per_sec": 0, 00:29:29.345 "rw_mbytes_per_sec": 0, 00:29:29.345 "r_mbytes_per_sec": 0, 00:29:29.345 "w_mbytes_per_sec": 0 00:29:29.345 }, 00:29:29.345 "claimed": true, 00:29:29.345 "claim_type": "exclusive_write", 00:29:29.345 "zoned": false, 00:29:29.345 "supported_io_types": { 00:29:29.345 "read": true, 00:29:29.345 "write": true, 00:29:29.345 "unmap": true, 00:29:29.345 "write_zeroes": true, 00:29:29.345 "flush": true, 00:29:29.345 "reset": true, 00:29:29.345 "compare": false, 00:29:29.345 "compare_and_write": false, 00:29:29.345 "abort": true, 00:29:29.345 "nvme_admin": false, 00:29:29.345 "nvme_io": false 00:29:29.345 }, 00:29:29.345 "memory_domains": [ 00:29:29.345 { 00:29:29.345 "dma_device_id": "system", 00:29:29.345 "dma_device_type": 1 00:29:29.345 }, 00:29:29.345 { 00:29:29.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:29.345 "dma_device_type": 2 00:29:29.345 } 00:29:29.345 ], 00:29:29.345 "driver_specific": {} 00:29:29.345 } 00:29:29.345 ] 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 4 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.604 12:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:29.863 12:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:29.863 "name": "Existed_Raid", 00:29:29.863 "uuid": "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a", 00:29:29.863 "strip_size_kb": 64, 00:29:29.864 "state": "configuring", 00:29:29.864 "raid_level": "raid0", 00:29:29.864 "superblock": true, 00:29:29.864 "num_base_bdevs": 4, 00:29:29.864 "num_base_bdevs_discovered": 2, 00:29:29.864 "num_base_bdevs_operational": 4, 00:29:29.864 "base_bdevs_list": [ 00:29:29.864 { 00:29:29.864 "name": "BaseBdev1", 00:29:29.864 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:29.864 "is_configured": true, 00:29:29.864 "data_offset": 2048, 00:29:29.864 "data_size": 63488 00:29:29.864 }, 00:29:29.864 { 00:29:29.864 "name": "BaseBdev2", 00:29:29.864 "uuid": "8e4d482e-c58f-4b52-a7c1-37a143b54fe0", 00:29:29.864 "is_configured": true, 00:29:29.864 "data_offset": 2048, 00:29:29.864 "data_size": 63488 00:29:29.864 }, 00:29:29.864 { 00:29:29.864 "name": "BaseBdev3", 00:29:29.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:29.864 "is_configured": false, 00:29:29.864 "data_offset": 0, 00:29:29.864 "data_size": 0 00:29:29.864 }, 00:29:29.864 { 00:29:29.864 "name": "BaseBdev4", 00:29:29.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:29.864 "is_configured": false, 00:29:29.864 "data_offset": 0, 00:29:29.864 "data_size": 0 00:29:29.864 } 00:29:29.864 ] 00:29:29.864 }' 00:29:29.864 12:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:29.864 12:31:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:30.431 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:29:30.689 [2024-06-07 12:31:54.306516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:30.689 BaseBdev3 00:29:30.689 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:29:30.689 12:31:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:29:30.689 12:31:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:30.689 12:31:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:30.689 12:31:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:30.689 12:31:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:30.689 12:31:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:31.258 12:31:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:29:31.516 [ 00:29:31.516 { 00:29:31.516 "name": "BaseBdev3", 00:29:31.516 "aliases": [ 00:29:31.516 "e255ba5a-7cff-4288-9af8-fccda24c7fce" 00:29:31.516 ], 00:29:31.516 "product_name": "Malloc disk", 00:29:31.516 "block_size": 512, 00:29:31.516 "num_blocks": 65536, 00:29:31.516 "uuid": "e255ba5a-7cff-4288-9af8-fccda24c7fce", 00:29:31.516 "assigned_rate_limits": { 00:29:31.516 "rw_ios_per_sec": 0, 00:29:31.516 "rw_mbytes_per_sec": 0, 00:29:31.516 "r_mbytes_per_sec": 0, 00:29:31.516 "w_mbytes_per_sec": 0 00:29:31.516 }, 00:29:31.516 "claimed": true, 00:29:31.516 "claim_type": "exclusive_write", 00:29:31.516 "zoned": false, 00:29:31.516 "supported_io_types": { 00:29:31.516 "read": true, 00:29:31.516 "write": true, 00:29:31.516 "unmap": true, 00:29:31.516 "write_zeroes": true, 00:29:31.516 "flush": true, 00:29:31.516 "reset": true, 00:29:31.516 "compare": false, 00:29:31.516 "compare_and_write": false, 00:29:31.516 "abort": true, 00:29:31.516 "nvme_admin": false, 00:29:31.516 "nvme_io": false 00:29:31.516 }, 00:29:31.516 "memory_domains": [ 00:29:31.516 { 00:29:31.516 "dma_device_id": "system", 00:29:31.516 "dma_device_type": 1 00:29:31.516 }, 00:29:31.516 { 00:29:31.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:31.516 "dma_device_type": 2 00:29:31.516 } 00:29:31.516 ], 00:29:31.516 "driver_specific": {} 00:29:31.516 } 00:29:31.516 ] 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:31.516 12:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.775 12:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:31.775 "name": "Existed_Raid", 00:29:31.775 "uuid": "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a", 00:29:31.775 "strip_size_kb": 64, 00:29:31.775 "state": "configuring", 00:29:31.775 "raid_level": "raid0", 00:29:31.775 "superblock": true, 00:29:31.775 "num_base_bdevs": 4, 00:29:31.775 "num_base_bdevs_discovered": 3, 00:29:31.775 "num_base_bdevs_operational": 4, 00:29:31.775 "base_bdevs_list": [ 00:29:31.775 { 00:29:31.775 "name": "BaseBdev1", 00:29:31.775 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:31.775 "is_configured": true, 00:29:31.775 "data_offset": 2048, 00:29:31.775 "data_size": 63488 00:29:31.775 }, 00:29:31.775 { 00:29:31.775 "name": "BaseBdev2", 00:29:31.775 "uuid": "8e4d482e-c58f-4b52-a7c1-37a143b54fe0", 00:29:31.775 "is_configured": true, 00:29:31.775 "data_offset": 2048, 00:29:31.775 "data_size": 63488 00:29:31.775 }, 00:29:31.775 { 00:29:31.775 "name": "BaseBdev3", 00:29:31.775 "uuid": "e255ba5a-7cff-4288-9af8-fccda24c7fce", 00:29:31.775 "is_configured": true, 00:29:31.775 "data_offset": 2048, 00:29:31.775 "data_size": 63488 00:29:31.775 }, 00:29:31.775 { 00:29:31.775 "name": "BaseBdev4", 00:29:31.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:31.775 "is_configured": false, 00:29:31.775 "data_offset": 0, 00:29:31.775 "data_size": 0 00:29:31.775 } 00:29:31.775 ] 00:29:31.775 }' 00:29:31.775 12:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:31.775 12:31:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:32.709 12:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:29:32.967 [2024-06-07 12:31:56.435439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:29:32.967 [2024-06-07 12:31:56.436152] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:29:32.967 [2024-06-07 12:31:56.436397] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:29:32.967 [2024-06-07 12:31:56.436781] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:29:32.967 [2024-06-07 12:31:56.437537] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:29:32.967 [2024-06-07 12:31:56.437747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:29:32.967 [2024-06-07 12:31:56.438161] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:32.967 BaseBdev4 00:29:32.967 12:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:29:32.967 12:31:56 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:29:32.967 12:31:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:32.967 12:31:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:32.967 12:31:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:32.967 12:31:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:32.967 12:31:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:33.226 12:31:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:29:33.794 [ 00:29:33.794 { 00:29:33.794 "name": "BaseBdev4", 00:29:33.794 "aliases": [ 00:29:33.794 "b4e2997c-719f-49f9-aafa-257ff36a1fb1" 00:29:33.794 ], 00:29:33.794 "product_name": "Malloc disk", 00:29:33.794 "block_size": 512, 00:29:33.794 "num_blocks": 65536, 00:29:33.794 "uuid": "b4e2997c-719f-49f9-aafa-257ff36a1fb1", 00:29:33.794 "assigned_rate_limits": { 00:29:33.794 "rw_ios_per_sec": 0, 00:29:33.794 "rw_mbytes_per_sec": 0, 00:29:33.794 "r_mbytes_per_sec": 0, 00:29:33.794 "w_mbytes_per_sec": 0 00:29:33.794 }, 00:29:33.794 "claimed": true, 00:29:33.794 "claim_type": "exclusive_write", 00:29:33.794 "zoned": false, 00:29:33.794 "supported_io_types": { 00:29:33.794 "read": true, 00:29:33.794 "write": true, 00:29:33.794 "unmap": true, 00:29:33.794 "write_zeroes": true, 00:29:33.794 "flush": true, 00:29:33.794 "reset": true, 00:29:33.794 "compare": false, 00:29:33.794 "compare_and_write": false, 00:29:33.794 "abort": true, 00:29:33.794 "nvme_admin": false, 00:29:33.794 "nvme_io": false 00:29:33.794 }, 00:29:33.794 "memory_domains": [ 00:29:33.794 { 00:29:33.794 "dma_device_id": "system", 00:29:33.794 "dma_device_type": 1 00:29:33.794 }, 00:29:33.794 { 00:29:33.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:33.794 "dma_device_type": 2 00:29:33.794 } 00:29:33.794 ], 00:29:33.794 "driver_specific": {} 00:29:33.794 } 00:29:33.794 ] 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
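The common/autotest_common.sh@898-@906 entries above are the body of the waitforbdev helper. Reconstructed from this trace alone, it is roughly the sketch below; the real helper in the repo may carry extra retry logic that only runs on the failure path and so never shows up in a passing log. $rpc_py stands in for the scripts/rpc.py -s /var/tmp/spdk-raid.sock invocation used throughout.

waitforbdev() {
    local bdev_name=$1                            # @898
    local bdev_timeout=$2                         # @899
    local i                                       # @900
    [[ -z $bdev_timeout ]] && bdev_timeout=2000   # @901: default to 2000 ms
    # @903: let the bdev layer finish examining freshly created bdevs
    $rpc_py bdev_wait_for_examine
    # @905: error out unless the named bdev appears within the timeout
    $rpc_py bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout" || return 1
    return 0                                      # @906
}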
00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.794 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:34.051 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:34.051 "name": "Existed_Raid", 00:29:34.051 "uuid": "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a", 00:29:34.051 "strip_size_kb": 64, 00:29:34.051 "state": "online", 00:29:34.051 "raid_level": "raid0", 00:29:34.051 "superblock": true, 00:29:34.051 "num_base_bdevs": 4, 00:29:34.051 "num_base_bdevs_discovered": 4, 00:29:34.051 "num_base_bdevs_operational": 4, 00:29:34.051 "base_bdevs_list": [ 00:29:34.051 { 00:29:34.051 "name": "BaseBdev1", 00:29:34.051 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:34.051 "is_configured": true, 00:29:34.051 "data_offset": 2048, 00:29:34.051 "data_size": 63488 00:29:34.051 }, 00:29:34.051 { 00:29:34.051 "name": "BaseBdev2", 00:29:34.051 "uuid": "8e4d482e-c58f-4b52-a7c1-37a143b54fe0", 00:29:34.051 "is_configured": true, 00:29:34.051 "data_offset": 2048, 00:29:34.051 "data_size": 63488 00:29:34.051 }, 00:29:34.051 { 00:29:34.051 "name": "BaseBdev3", 00:29:34.051 "uuid": "e255ba5a-7cff-4288-9af8-fccda24c7fce", 00:29:34.051 "is_configured": true, 00:29:34.051 "data_offset": 2048, 00:29:34.051 "data_size": 63488 00:29:34.051 }, 00:29:34.051 { 00:29:34.051 "name": "BaseBdev4", 00:29:34.051 "uuid": "b4e2997c-719f-49f9-aafa-257ff36a1fb1", 00:29:34.051 "is_configured": true, 00:29:34.051 "data_offset": 2048, 00:29:34.051 "data_size": 63488 00:29:34.051 } 00:29:34.051 ] 00:29:34.051 }' 00:29:34.051 12:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:34.051 12:31:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:34.980 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:34.980 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:34.980 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:34.980 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:34.980 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:34.981 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:29:34.981 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:34.981 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:35.239 [2024-06-07 12:31:58.712119] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:35.239 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:35.239 "name": "Existed_Raid", 00:29:35.239 "aliases": [ 00:29:35.239 "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a" 00:29:35.239 ], 00:29:35.239 
"product_name": "Raid Volume", 00:29:35.239 "block_size": 512, 00:29:35.239 "num_blocks": 253952, 00:29:35.239 "uuid": "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a", 00:29:35.239 "assigned_rate_limits": { 00:29:35.239 "rw_ios_per_sec": 0, 00:29:35.239 "rw_mbytes_per_sec": 0, 00:29:35.239 "r_mbytes_per_sec": 0, 00:29:35.239 "w_mbytes_per_sec": 0 00:29:35.239 }, 00:29:35.239 "claimed": false, 00:29:35.239 "zoned": false, 00:29:35.239 "supported_io_types": { 00:29:35.239 "read": true, 00:29:35.239 "write": true, 00:29:35.239 "unmap": true, 00:29:35.239 "write_zeroes": true, 00:29:35.239 "flush": true, 00:29:35.239 "reset": true, 00:29:35.239 "compare": false, 00:29:35.239 "compare_and_write": false, 00:29:35.239 "abort": false, 00:29:35.239 "nvme_admin": false, 00:29:35.239 "nvme_io": false 00:29:35.239 }, 00:29:35.239 "memory_domains": [ 00:29:35.239 { 00:29:35.239 "dma_device_id": "system", 00:29:35.239 "dma_device_type": 1 00:29:35.239 }, 00:29:35.239 { 00:29:35.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.239 "dma_device_type": 2 00:29:35.239 }, 00:29:35.239 { 00:29:35.239 "dma_device_id": "system", 00:29:35.239 "dma_device_type": 1 00:29:35.239 }, 00:29:35.239 { 00:29:35.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.239 "dma_device_type": 2 00:29:35.239 }, 00:29:35.239 { 00:29:35.239 "dma_device_id": "system", 00:29:35.239 "dma_device_type": 1 00:29:35.239 }, 00:29:35.239 { 00:29:35.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.239 "dma_device_type": 2 00:29:35.239 }, 00:29:35.239 { 00:29:35.239 "dma_device_id": "system", 00:29:35.239 "dma_device_type": 1 00:29:35.239 }, 00:29:35.239 { 00:29:35.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.239 "dma_device_type": 2 00:29:35.239 } 00:29:35.239 ], 00:29:35.239 "driver_specific": { 00:29:35.239 "raid": { 00:29:35.239 "uuid": "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a", 00:29:35.239 "strip_size_kb": 64, 00:29:35.239 "state": "online", 00:29:35.239 "raid_level": "raid0", 00:29:35.239 "superblock": true, 00:29:35.239 "num_base_bdevs": 4, 00:29:35.239 "num_base_bdevs_discovered": 4, 00:29:35.239 "num_base_bdevs_operational": 4, 00:29:35.239 "base_bdevs_list": [ 00:29:35.239 { 00:29:35.239 "name": "BaseBdev1", 00:29:35.239 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:35.239 "is_configured": true, 00:29:35.239 "data_offset": 2048, 00:29:35.239 "data_size": 63488 00:29:35.239 }, 00:29:35.239 { 00:29:35.240 "name": "BaseBdev2", 00:29:35.240 "uuid": "8e4d482e-c58f-4b52-a7c1-37a143b54fe0", 00:29:35.240 "is_configured": true, 00:29:35.240 "data_offset": 2048, 00:29:35.240 "data_size": 63488 00:29:35.240 }, 00:29:35.240 { 00:29:35.240 "name": "BaseBdev3", 00:29:35.240 "uuid": "e255ba5a-7cff-4288-9af8-fccda24c7fce", 00:29:35.240 "is_configured": true, 00:29:35.240 "data_offset": 2048, 00:29:35.240 "data_size": 63488 00:29:35.240 }, 00:29:35.240 { 00:29:35.240 "name": "BaseBdev4", 00:29:35.240 "uuid": "b4e2997c-719f-49f9-aafa-257ff36a1fb1", 00:29:35.240 "is_configured": true, 00:29:35.240 "data_offset": 2048, 00:29:35.240 "data_size": 63488 00:29:35.240 } 00:29:35.240 ] 00:29:35.240 } 00:29:35.240 } 00:29:35.240 }' 00:29:35.240 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:35.240 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:35.240 BaseBdev2 00:29:35.240 BaseBdev3 00:29:35.240 BaseBdev4' 00:29:35.240 12:31:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:35.240 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:35.240 12:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:35.818 "name": "BaseBdev1", 00:29:35.818 "aliases": [ 00:29:35.818 "5c222d8c-2001-4054-b3e0-3702c25ab5a2" 00:29:35.818 ], 00:29:35.818 "product_name": "Malloc disk", 00:29:35.818 "block_size": 512, 00:29:35.818 "num_blocks": 65536, 00:29:35.818 "uuid": "5c222d8c-2001-4054-b3e0-3702c25ab5a2", 00:29:35.818 "assigned_rate_limits": { 00:29:35.818 "rw_ios_per_sec": 0, 00:29:35.818 "rw_mbytes_per_sec": 0, 00:29:35.818 "r_mbytes_per_sec": 0, 00:29:35.818 "w_mbytes_per_sec": 0 00:29:35.818 }, 00:29:35.818 "claimed": true, 00:29:35.818 "claim_type": "exclusive_write", 00:29:35.818 "zoned": false, 00:29:35.818 "supported_io_types": { 00:29:35.818 "read": true, 00:29:35.818 "write": true, 00:29:35.818 "unmap": true, 00:29:35.818 "write_zeroes": true, 00:29:35.818 "flush": true, 00:29:35.818 "reset": true, 00:29:35.818 "compare": false, 00:29:35.818 "compare_and_write": false, 00:29:35.818 "abort": true, 00:29:35.818 "nvme_admin": false, 00:29:35.818 "nvme_io": false 00:29:35.818 }, 00:29:35.818 "memory_domains": [ 00:29:35.818 { 00:29:35.818 "dma_device_id": "system", 00:29:35.818 "dma_device_type": 1 00:29:35.818 }, 00:29:35.818 { 00:29:35.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.818 "dma_device_type": 2 00:29:35.818 } 00:29:35.818 ], 00:29:35.818 "driver_specific": {} 00:29:35.818 }' 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.818 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:36.076 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:36.076 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:36.076 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:36.076 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:36.076 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:36.076 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:36.076 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:36.333 12:31:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:36.333 "name": "BaseBdev2", 00:29:36.333 "aliases": [ 00:29:36.333 "8e4d482e-c58f-4b52-a7c1-37a143b54fe0" 00:29:36.333 ], 00:29:36.333 "product_name": "Malloc disk", 00:29:36.333 "block_size": 512, 00:29:36.333 "num_blocks": 65536, 00:29:36.333 "uuid": "8e4d482e-c58f-4b52-a7c1-37a143b54fe0", 00:29:36.334 "assigned_rate_limits": { 00:29:36.334 "rw_ios_per_sec": 0, 00:29:36.334 "rw_mbytes_per_sec": 0, 00:29:36.334 "r_mbytes_per_sec": 0, 00:29:36.334 "w_mbytes_per_sec": 0 00:29:36.334 }, 00:29:36.334 "claimed": true, 00:29:36.334 "claim_type": "exclusive_write", 00:29:36.334 "zoned": false, 00:29:36.334 "supported_io_types": { 00:29:36.334 "read": true, 00:29:36.334 "write": true, 00:29:36.334 "unmap": true, 00:29:36.334 "write_zeroes": true, 00:29:36.334 "flush": true, 00:29:36.334 "reset": true, 00:29:36.334 "compare": false, 00:29:36.334 "compare_and_write": false, 00:29:36.334 "abort": true, 00:29:36.334 "nvme_admin": false, 00:29:36.334 "nvme_io": false 00:29:36.334 }, 00:29:36.334 "memory_domains": [ 00:29:36.334 { 00:29:36.334 "dma_device_id": "system", 00:29:36.334 "dma_device_type": 1 00:29:36.334 }, 00:29:36.334 { 00:29:36.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:36.334 "dma_device_type": 2 00:29:36.334 } 00:29:36.334 ], 00:29:36.334 "driver_specific": {} 00:29:36.334 }' 00:29:36.334 12:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:36.592 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:36.592 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:36.592 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:36.592 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:36.592 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:36.592 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:36.592 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:36.850 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:36.850 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:36.850 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:36.850 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:36.850 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:36.850 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:29:36.850 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:37.108 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:37.108 "name": "BaseBdev3", 00:29:37.108 "aliases": [ 00:29:37.108 "e255ba5a-7cff-4288-9af8-fccda24c7fce" 00:29:37.108 ], 00:29:37.108 "product_name": "Malloc disk", 00:29:37.108 "block_size": 512, 00:29:37.108 "num_blocks": 65536, 00:29:37.108 "uuid": "e255ba5a-7cff-4288-9af8-fccda24c7fce", 00:29:37.108 "assigned_rate_limits": { 00:29:37.108 "rw_ios_per_sec": 0, 00:29:37.108 "rw_mbytes_per_sec": 0, 
00:29:37.108 "r_mbytes_per_sec": 0, 00:29:37.108 "w_mbytes_per_sec": 0 00:29:37.108 }, 00:29:37.108 "claimed": true, 00:29:37.108 "claim_type": "exclusive_write", 00:29:37.108 "zoned": false, 00:29:37.108 "supported_io_types": { 00:29:37.108 "read": true, 00:29:37.108 "write": true, 00:29:37.108 "unmap": true, 00:29:37.108 "write_zeroes": true, 00:29:37.108 "flush": true, 00:29:37.108 "reset": true, 00:29:37.108 "compare": false, 00:29:37.108 "compare_and_write": false, 00:29:37.108 "abort": true, 00:29:37.108 "nvme_admin": false, 00:29:37.108 "nvme_io": false 00:29:37.108 }, 00:29:37.108 "memory_domains": [ 00:29:37.108 { 00:29:37.108 "dma_device_id": "system", 00:29:37.108 "dma_device_type": 1 00:29:37.108 }, 00:29:37.108 { 00:29:37.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:37.108 "dma_device_type": 2 00:29:37.108 } 00:29:37.108 ], 00:29:37.108 "driver_specific": {} 00:29:37.108 }' 00:29:37.108 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:37.366 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:37.366 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:37.366 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:37.366 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:37.366 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:37.366 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:37.366 12:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:37.366 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:37.624 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:37.624 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:37.624 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:37.624 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:37.624 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:29:37.624 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:37.881 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:37.881 "name": "BaseBdev4", 00:29:37.881 "aliases": [ 00:29:37.881 "b4e2997c-719f-49f9-aafa-257ff36a1fb1" 00:29:37.881 ], 00:29:37.881 "product_name": "Malloc disk", 00:29:37.881 "block_size": 512, 00:29:37.881 "num_blocks": 65536, 00:29:37.881 "uuid": "b4e2997c-719f-49f9-aafa-257ff36a1fb1", 00:29:37.881 "assigned_rate_limits": { 00:29:37.881 "rw_ios_per_sec": 0, 00:29:37.881 "rw_mbytes_per_sec": 0, 00:29:37.881 "r_mbytes_per_sec": 0, 00:29:37.881 "w_mbytes_per_sec": 0 00:29:37.881 }, 00:29:37.881 "claimed": true, 00:29:37.881 "claim_type": "exclusive_write", 00:29:37.881 "zoned": false, 00:29:37.882 "supported_io_types": { 00:29:37.882 "read": true, 00:29:37.882 "write": true, 00:29:37.882 "unmap": true, 00:29:37.882 "write_zeroes": true, 00:29:37.882 "flush": true, 00:29:37.882 "reset": true, 00:29:37.882 "compare": false, 00:29:37.882 
"compare_and_write": false, 00:29:37.882 "abort": true, 00:29:37.882 "nvme_admin": false, 00:29:37.882 "nvme_io": false 00:29:37.882 }, 00:29:37.882 "memory_domains": [ 00:29:37.882 { 00:29:37.882 "dma_device_id": "system", 00:29:37.882 "dma_device_type": 1 00:29:37.882 }, 00:29:37.882 { 00:29:37.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:37.882 "dma_device_type": 2 00:29:37.882 } 00:29:37.882 ], 00:29:37.882 "driver_specific": {} 00:29:37.882 }' 00:29:37.882 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:37.882 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:37.882 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:37.882 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.140 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.140 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:38.140 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.140 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.140 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:38.140 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.140 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.398 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:38.398 12:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:38.657 [2024-06-07 12:32:02.108442] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:38.657 [2024-06-07 12:32:02.108499] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:38.657 [2024-06-07 12:32:02.108576] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.657 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:38.915 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:38.915 "name": "Existed_Raid", 00:29:38.915 "uuid": "1cdc8b0c-71bf-49e7-9eaf-b9928f66482a", 00:29:38.915 "strip_size_kb": 64, 00:29:38.915 "state": "offline", 00:29:38.915 "raid_level": "raid0", 00:29:38.915 "superblock": true, 00:29:38.915 "num_base_bdevs": 4, 00:29:38.915 "num_base_bdevs_discovered": 3, 00:29:38.915 "num_base_bdevs_operational": 3, 00:29:38.915 "base_bdevs_list": [ 00:29:38.915 { 00:29:38.915 "name": null, 00:29:38.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:38.915 "is_configured": false, 00:29:38.915 "data_offset": 2048, 00:29:38.915 "data_size": 63488 00:29:38.915 }, 00:29:38.915 { 00:29:38.915 "name": "BaseBdev2", 00:29:38.915 "uuid": "8e4d482e-c58f-4b52-a7c1-37a143b54fe0", 00:29:38.915 "is_configured": true, 00:29:38.915 "data_offset": 2048, 00:29:38.915 "data_size": 63488 00:29:38.915 }, 00:29:38.915 { 00:29:38.915 "name": "BaseBdev3", 00:29:38.915 "uuid": "e255ba5a-7cff-4288-9af8-fccda24c7fce", 00:29:38.915 "is_configured": true, 00:29:38.915 "data_offset": 2048, 00:29:38.915 "data_size": 63488 00:29:38.915 }, 00:29:38.915 { 00:29:38.915 "name": "BaseBdev4", 00:29:38.915 "uuid": "b4e2997c-719f-49f9-aafa-257ff36a1fb1", 00:29:38.915 "is_configured": true, 00:29:38.915 "data_offset": 2048, 00:29:38.915 "data_size": 63488 00:29:38.915 } 00:29:38.915 ] 00:29:38.915 }' 00:29:38.916 12:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:38.916 12:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:39.848 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:39.848 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:39.848 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:39.848 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.107 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:40.107 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:40.107 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:40.366 [2024-06-07 12:32:03.793082] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:40.366 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # 
(( i++ )) 00:29:40.366 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:40.366 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.366 12:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:40.625 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:40.625 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:40.625 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:29:40.884 [2024-06-07 12:32:04.421723] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:29:40.884 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:40.884 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:40.884 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.884 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:41.142 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:41.142 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:41.142 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:29:41.400 [2024-06-07 12:32:04.943722] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:29:41.400 [2024-06-07 12:32:04.943812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:29:41.400 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:41.400 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:41.401 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:41.401 12:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:41.658 12:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:41.658 12:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:41.658 12:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:29:41.658 12:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:29:41.659 12:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:41.659 12:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:29:41.917 BaseBdev2 00:29:42.176 12:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 
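After the teardown, bdev_raid.sh@301-@303 rebuilds every base bdev except the first; BaseBdev1 is recreated only later (@312), so the half-populated Existed_Raid can be exercised first. The trace is consistent with a loop of roughly this shape (the name arithmetic is an assumption — the log only shows i=1 mapping to BaseBdev2):

num_base_bdevs=4
for ((i = 1; i < num_base_bdevs; i++)); do        # @301
    # @302: 32 MB backing store with 512-byte blocks -> 65536 blocks,
    # matching the num_blocks reported by bdev_get_bdevs above
    $rpc_py bdev_malloc_create 32 512 -b "BaseBdev$((i + 1))"
    waitforbdev "BaseBdev$((i + 1))"              # @303
done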
00:29:42.176 12:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:29:42.176 12:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:42.176 12:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:42.176 12:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:42.176 12:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:42.176 12:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:42.434 12:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:42.693 [ 00:29:42.693 { 00:29:42.693 "name": "BaseBdev2", 00:29:42.693 "aliases": [ 00:29:42.693 "bee1a0f2-887b-434d-b593-81b7c7d8457f" 00:29:42.693 ], 00:29:42.693 "product_name": "Malloc disk", 00:29:42.693 "block_size": 512, 00:29:42.693 "num_blocks": 65536, 00:29:42.693 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:42.693 "assigned_rate_limits": { 00:29:42.693 "rw_ios_per_sec": 0, 00:29:42.693 "rw_mbytes_per_sec": 0, 00:29:42.693 "r_mbytes_per_sec": 0, 00:29:42.693 "w_mbytes_per_sec": 0 00:29:42.693 }, 00:29:42.693 "claimed": false, 00:29:42.693 "zoned": false, 00:29:42.693 "supported_io_types": { 00:29:42.693 "read": true, 00:29:42.693 "write": true, 00:29:42.693 "unmap": true, 00:29:42.693 "write_zeroes": true, 00:29:42.693 "flush": true, 00:29:42.693 "reset": true, 00:29:42.693 "compare": false, 00:29:42.693 "compare_and_write": false, 00:29:42.693 "abort": true, 00:29:42.693 "nvme_admin": false, 00:29:42.693 "nvme_io": false 00:29:42.693 }, 00:29:42.693 "memory_domains": [ 00:29:42.693 { 00:29:42.693 "dma_device_id": "system", 00:29:42.693 "dma_device_type": 1 00:29:42.693 }, 00:29:42.693 { 00:29:42.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:42.693 "dma_device_type": 2 00:29:42.693 } 00:29:42.693 ], 00:29:42.693 "driver_specific": {} 00:29:42.693 } 00:29:42.693 ] 00:29:42.693 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:42.693 12:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:42.693 12:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:42.693 12:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:29:42.973 BaseBdev3 00:29:42.973 12:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:29:42.973 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:29:42.973 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:42.973 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:42.973 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:42.973 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:42.973 12:32:06 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:43.231 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:29:43.231 [ 00:29:43.231 { 00:29:43.231 "name": "BaseBdev3", 00:29:43.231 "aliases": [ 00:29:43.231 "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00" 00:29:43.231 ], 00:29:43.231 "product_name": "Malloc disk", 00:29:43.231 "block_size": 512, 00:29:43.231 "num_blocks": 65536, 00:29:43.231 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:43.231 "assigned_rate_limits": { 00:29:43.231 "rw_ios_per_sec": 0, 00:29:43.231 "rw_mbytes_per_sec": 0, 00:29:43.231 "r_mbytes_per_sec": 0, 00:29:43.231 "w_mbytes_per_sec": 0 00:29:43.231 }, 00:29:43.231 "claimed": false, 00:29:43.231 "zoned": false, 00:29:43.231 "supported_io_types": { 00:29:43.231 "read": true, 00:29:43.231 "write": true, 00:29:43.231 "unmap": true, 00:29:43.231 "write_zeroes": true, 00:29:43.231 "flush": true, 00:29:43.231 "reset": true, 00:29:43.231 "compare": false, 00:29:43.231 "compare_and_write": false, 00:29:43.231 "abort": true, 00:29:43.231 "nvme_admin": false, 00:29:43.231 "nvme_io": false 00:29:43.231 }, 00:29:43.231 "memory_domains": [ 00:29:43.231 { 00:29:43.231 "dma_device_id": "system", 00:29:43.231 "dma_device_type": 1 00:29:43.231 }, 00:29:43.231 { 00:29:43.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:43.231 "dma_device_type": 2 00:29:43.231 } 00:29:43.231 ], 00:29:43.231 "driver_specific": {} 00:29:43.231 } 00:29:43.231 ] 00:29:43.490 12:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:43.490 12:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:43.490 12:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:43.490 12:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:29:43.748 BaseBdev4 00:29:43.748 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:29:43.748 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:29:43.748 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:43.748 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:43.748 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:43.748 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:43.748 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:44.006 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:29:44.006 [ 00:29:44.006 { 00:29:44.006 "name": "BaseBdev4", 00:29:44.006 "aliases": [ 00:29:44.006 "4010d6f5-c9cd-472d-a95a-104adfba7290" 00:29:44.006 ], 00:29:44.006 "product_name": "Malloc disk", 00:29:44.006 "block_size": 512, 
00:29:44.006 "num_blocks": 65536, 00:29:44.006 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:44.006 "assigned_rate_limits": { 00:29:44.006 "rw_ios_per_sec": 0, 00:29:44.006 "rw_mbytes_per_sec": 0, 00:29:44.006 "r_mbytes_per_sec": 0, 00:29:44.006 "w_mbytes_per_sec": 0 00:29:44.006 }, 00:29:44.006 "claimed": false, 00:29:44.006 "zoned": false, 00:29:44.006 "supported_io_types": { 00:29:44.006 "read": true, 00:29:44.006 "write": true, 00:29:44.006 "unmap": true, 00:29:44.006 "write_zeroes": true, 00:29:44.006 "flush": true, 00:29:44.006 "reset": true, 00:29:44.006 "compare": false, 00:29:44.006 "compare_and_write": false, 00:29:44.006 "abort": true, 00:29:44.006 "nvme_admin": false, 00:29:44.006 "nvme_io": false 00:29:44.006 }, 00:29:44.006 "memory_domains": [ 00:29:44.006 { 00:29:44.006 "dma_device_id": "system", 00:29:44.006 "dma_device_type": 1 00:29:44.006 }, 00:29:44.006 { 00:29:44.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.006 "dma_device_type": 2 00:29:44.006 } 00:29:44.006 ], 00:29:44.006 "driver_specific": {} 00:29:44.006 } 00:29:44.006 ] 00:29:44.006 12:32:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:44.006 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:29:44.006 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:29:44.006 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:29:44.265 [2024-06-07 12:32:07.842830] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:44.265 [2024-06-07 12:32:07.842946] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:44.265 [2024-06-07 12:32:07.842979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:44.265 [2024-06-07 12:32:07.845199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:44.265 [2024-06-07 12:32:07.845264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.265 12:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:44.831 12:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:44.831 "name": "Existed_Raid", 00:29:44.831 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:44.831 "strip_size_kb": 64, 00:29:44.832 "state": "configuring", 00:29:44.832 "raid_level": "raid0", 00:29:44.832 "superblock": true, 00:29:44.832 "num_base_bdevs": 4, 00:29:44.832 "num_base_bdevs_discovered": 3, 00:29:44.832 "num_base_bdevs_operational": 4, 00:29:44.832 "base_bdevs_list": [ 00:29:44.832 { 00:29:44.832 "name": "BaseBdev1", 00:29:44.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:44.832 "is_configured": false, 00:29:44.832 "data_offset": 0, 00:29:44.832 "data_size": 0 00:29:44.832 }, 00:29:44.832 { 00:29:44.832 "name": "BaseBdev2", 00:29:44.832 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:44.832 "is_configured": true, 00:29:44.832 "data_offset": 2048, 00:29:44.832 "data_size": 63488 00:29:44.832 }, 00:29:44.832 { 00:29:44.832 "name": "BaseBdev3", 00:29:44.832 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:44.832 "is_configured": true, 00:29:44.832 "data_offset": 2048, 00:29:44.832 "data_size": 63488 00:29:44.832 }, 00:29:44.832 { 00:29:44.832 "name": "BaseBdev4", 00:29:44.832 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:44.832 "is_configured": true, 00:29:44.832 "data_offset": 2048, 00:29:44.832 "data_size": 63488 00:29:44.832 } 00:29:44.832 ] 00:29:44.832 }' 00:29:44.832 12:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:44.832 12:32:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:45.399 12:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:29:45.399 [2024-06-07 12:32:09.030926] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.728 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:45.728 
12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.988 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.988 "name": "Existed_Raid", 00:29:45.988 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:45.988 "strip_size_kb": 64, 00:29:45.988 "state": "configuring", 00:29:45.988 "raid_level": "raid0", 00:29:45.988 "superblock": true, 00:29:45.988 "num_base_bdevs": 4, 00:29:45.988 "num_base_bdevs_discovered": 2, 00:29:45.988 "num_base_bdevs_operational": 4, 00:29:45.988 "base_bdevs_list": [ 00:29:45.988 { 00:29:45.988 "name": "BaseBdev1", 00:29:45.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.988 "is_configured": false, 00:29:45.988 "data_offset": 0, 00:29:45.988 "data_size": 0 00:29:45.988 }, 00:29:45.988 { 00:29:45.988 "name": null, 00:29:45.988 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:45.988 "is_configured": false, 00:29:45.988 "data_offset": 2048, 00:29:45.988 "data_size": 63488 00:29:45.988 }, 00:29:45.988 { 00:29:45.988 "name": "BaseBdev3", 00:29:45.988 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:45.988 "is_configured": true, 00:29:45.988 "data_offset": 2048, 00:29:45.988 "data_size": 63488 00:29:45.988 }, 00:29:45.988 { 00:29:45.988 "name": "BaseBdev4", 00:29:45.988 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:45.988 "is_configured": true, 00:29:45.988 "data_offset": 2048, 00:29:45.988 "data_size": 63488 00:29:45.988 } 00:29:45.988 ] 00:29:45.988 }' 00:29:45.988 12:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.988 12:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:46.554 12:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.554 12:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:29:46.812 12:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:29:46.812 12:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:29:47.070 [2024-06-07 12:32:10.625014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:47.070 BaseBdev1 00:29:47.070 12:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:29:47.070 12:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:29:47.070 12:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:47.070 12:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:47.070 12:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:47.070 12:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:47.070 12:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:47.329 12:32:10 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:47.587 [ 00:29:47.587 { 00:29:47.587 "name": "BaseBdev1", 00:29:47.587 "aliases": [ 00:29:47.587 "907f27e1-c9d0-495d-b830-1b6291c5133a" 00:29:47.587 ], 00:29:47.587 "product_name": "Malloc disk", 00:29:47.587 "block_size": 512, 00:29:47.587 "num_blocks": 65536, 00:29:47.587 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:47.587 "assigned_rate_limits": { 00:29:47.587 "rw_ios_per_sec": 0, 00:29:47.587 "rw_mbytes_per_sec": 0, 00:29:47.587 "r_mbytes_per_sec": 0, 00:29:47.587 "w_mbytes_per_sec": 0 00:29:47.587 }, 00:29:47.587 "claimed": true, 00:29:47.587 "claim_type": "exclusive_write", 00:29:47.587 "zoned": false, 00:29:47.587 "supported_io_types": { 00:29:47.587 "read": true, 00:29:47.587 "write": true, 00:29:47.587 "unmap": true, 00:29:47.587 "write_zeroes": true, 00:29:47.587 "flush": true, 00:29:47.587 "reset": true, 00:29:47.587 "compare": false, 00:29:47.587 "compare_and_write": false, 00:29:47.587 "abort": true, 00:29:47.587 "nvme_admin": false, 00:29:47.587 "nvme_io": false 00:29:47.587 }, 00:29:47.587 "memory_domains": [ 00:29:47.587 { 00:29:47.587 "dma_device_id": "system", 00:29:47.587 "dma_device_type": 1 00:29:47.587 }, 00:29:47.587 { 00:29:47.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:47.587 "dma_device_type": 2 00:29:47.587 } 00:29:47.587 ], 00:29:47.587 "driver_specific": {} 00:29:47.587 } 00:29:47.587 ] 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:47.845 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:47.846 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:47.846 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:47.846 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:48.127 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:48.127 "name": "Existed_Raid", 00:29:48.127 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:48.127 "strip_size_kb": 64, 00:29:48.127 "state": "configuring", 00:29:48.127 "raid_level": "raid0", 00:29:48.127 "superblock": true, 00:29:48.127 "num_base_bdevs": 4, 00:29:48.127 "num_base_bdevs_discovered": 3, 
00:29:48.127 "num_base_bdevs_operational": 4, 00:29:48.127 "base_bdevs_list": [ 00:29:48.127 { 00:29:48.127 "name": "BaseBdev1", 00:29:48.127 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:48.127 "is_configured": true, 00:29:48.127 "data_offset": 2048, 00:29:48.127 "data_size": 63488 00:29:48.127 }, 00:29:48.127 { 00:29:48.127 "name": null, 00:29:48.127 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:48.127 "is_configured": false, 00:29:48.127 "data_offset": 2048, 00:29:48.127 "data_size": 63488 00:29:48.127 }, 00:29:48.127 { 00:29:48.127 "name": "BaseBdev3", 00:29:48.127 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:48.127 "is_configured": true, 00:29:48.127 "data_offset": 2048, 00:29:48.127 "data_size": 63488 00:29:48.127 }, 00:29:48.127 { 00:29:48.127 "name": "BaseBdev4", 00:29:48.127 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:48.127 "is_configured": true, 00:29:48.127 "data_offset": 2048, 00:29:48.128 "data_size": 63488 00:29:48.128 } 00:29:48.128 ] 00:29:48.128 }' 00:29:48.128 12:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:48.128 12:32:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:48.695 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:29:48.695 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.953 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:29:48.953 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:29:49.212 [2024-06-07 12:32:12.777770] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:49.212 12:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:49.471 12:32:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:29:49.471 "name": "Existed_Raid", 00:29:49.471 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:49.471 "strip_size_kb": 64, 00:29:49.471 "state": "configuring", 00:29:49.471 "raid_level": "raid0", 00:29:49.471 "superblock": true, 00:29:49.471 "num_base_bdevs": 4, 00:29:49.471 "num_base_bdevs_discovered": 2, 00:29:49.471 "num_base_bdevs_operational": 4, 00:29:49.471 "base_bdevs_list": [ 00:29:49.471 { 00:29:49.471 "name": "BaseBdev1", 00:29:49.471 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:49.471 "is_configured": true, 00:29:49.471 "data_offset": 2048, 00:29:49.471 "data_size": 63488 00:29:49.471 }, 00:29:49.471 { 00:29:49.471 "name": null, 00:29:49.471 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:49.471 "is_configured": false, 00:29:49.471 "data_offset": 2048, 00:29:49.471 "data_size": 63488 00:29:49.471 }, 00:29:49.471 { 00:29:49.471 "name": null, 00:29:49.471 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:49.471 "is_configured": false, 00:29:49.471 "data_offset": 2048, 00:29:49.471 "data_size": 63488 00:29:49.471 }, 00:29:49.471 { 00:29:49.471 "name": "BaseBdev4", 00:29:49.471 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:49.471 "is_configured": true, 00:29:49.471 "data_offset": 2048, 00:29:49.471 "data_size": 63488 00:29:49.471 } 00:29:49.471 ] 00:29:49.471 }' 00:29:49.471 12:32:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:49.471 12:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:50.404 12:32:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.404 12:32:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:29:50.663 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:29:50.663 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:29:50.921 [2024-06-07 12:32:14.414165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:50.921 12:32:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.921 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:51.180 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:51.180 "name": "Existed_Raid", 00:29:51.180 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:51.180 "strip_size_kb": 64, 00:29:51.180 "state": "configuring", 00:29:51.180 "raid_level": "raid0", 00:29:51.180 "superblock": true, 00:29:51.180 "num_base_bdevs": 4, 00:29:51.180 "num_base_bdevs_discovered": 3, 00:29:51.180 "num_base_bdevs_operational": 4, 00:29:51.180 "base_bdevs_list": [ 00:29:51.180 { 00:29:51.180 "name": "BaseBdev1", 00:29:51.180 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:51.180 "is_configured": true, 00:29:51.180 "data_offset": 2048, 00:29:51.180 "data_size": 63488 00:29:51.180 }, 00:29:51.180 { 00:29:51.180 "name": null, 00:29:51.180 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:51.180 "is_configured": false, 00:29:51.180 "data_offset": 2048, 00:29:51.180 "data_size": 63488 00:29:51.180 }, 00:29:51.180 { 00:29:51.180 "name": "BaseBdev3", 00:29:51.180 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:51.180 "is_configured": true, 00:29:51.180 "data_offset": 2048, 00:29:51.180 "data_size": 63488 00:29:51.180 }, 00:29:51.180 { 00:29:51.180 "name": "BaseBdev4", 00:29:51.180 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:51.180 "is_configured": true, 00:29:51.180 "data_offset": 2048, 00:29:51.180 "data_size": 63488 00:29:51.180 } 00:29:51.180 ] 00:29:51.180 }' 00:29:51.180 12:32:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:51.180 12:32:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:52.115 12:32:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.115 12:32:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:29:52.115 12:32:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:29:52.115 12:32:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:52.683 [2024-06-07 12:32:16.028528] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:52.683 12:32:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:52.683 "name": "Existed_Raid", 00:29:52.683 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:52.683 "strip_size_kb": 64, 00:29:52.683 "state": "configuring", 00:29:52.683 "raid_level": "raid0", 00:29:52.683 "superblock": true, 00:29:52.683 "num_base_bdevs": 4, 00:29:52.683 "num_base_bdevs_discovered": 2, 00:29:52.683 "num_base_bdevs_operational": 4, 00:29:52.683 "base_bdevs_list": [ 00:29:52.683 { 00:29:52.683 "name": null, 00:29:52.683 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:52.683 "is_configured": false, 00:29:52.683 "data_offset": 2048, 00:29:52.683 "data_size": 63488 00:29:52.683 }, 00:29:52.683 { 00:29:52.683 "name": null, 00:29:52.683 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:52.683 "is_configured": false, 00:29:52.683 "data_offset": 2048, 00:29:52.683 "data_size": 63488 00:29:52.683 }, 00:29:52.683 { 00:29:52.683 "name": "BaseBdev3", 00:29:52.683 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:52.683 "is_configured": true, 00:29:52.683 "data_offset": 2048, 00:29:52.683 "data_size": 63488 00:29:52.683 }, 00:29:52.683 { 00:29:52.683 "name": "BaseBdev4", 00:29:52.683 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:52.683 "is_configured": true, 00:29:52.683 "data_offset": 2048, 00:29:52.683 "data_size": 63488 00:29:52.683 } 00:29:52.683 ] 00:29:52.683 }' 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:52.683 12:32:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:53.620 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:29:53.620 12:32:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.878 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:29:53.879 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:29:54.138 [2024-06-07 12:32:17.528733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
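The bdev_raid.sh@116-128 trace repeated throughout this test is the script's verify_raid_bdev_state helper: it dumps all raid bdevs over RPC, picks out the bdev under test with jq, and compares the expected fields. A minimal sketch of that check, reconstructed from the commands visible in the trace (the RPC call, socket path, and jq filter are taken verbatim from the log; the exact comparisons are an assumption):

rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
verify_raid_bdev_state() {
    local name=$1 expected_state=$2 raid_level=$3 strip_size=$4 num_operational=$5
    local info
    # bdev_raid_get_bdevs all returns a JSON array; keep only the bdev under test.
    info=$($rpc bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
    [[ $(jq -r .state <<<"$info") == "$expected_state" ]] &&
    [[ $(jq -r .raid_level <<<"$info") == "$raid_level" ]] &&
    [[ $(jq -r .strip_size_kb <<<"$info") == "$strip_size" ]] &&
    [[ $(jq -r .num_base_bdevs_operational <<<"$info") == "$num_operational" ]]
}

At this point the raid is expected to stay in "configuring": with two slots emptied, num_base_bdevs_discovered (2) is still below num_base_bdevs_operational (4), so the array cannot come online yet.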
00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.138 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:54.397 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:54.397 "name": "Existed_Raid", 00:29:54.397 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:54.397 "strip_size_kb": 64, 00:29:54.397 "state": "configuring", 00:29:54.397 "raid_level": "raid0", 00:29:54.397 "superblock": true, 00:29:54.397 "num_base_bdevs": 4, 00:29:54.397 "num_base_bdevs_discovered": 3, 00:29:54.397 "num_base_bdevs_operational": 4, 00:29:54.397 "base_bdevs_list": [ 00:29:54.397 { 00:29:54.397 "name": null, 00:29:54.397 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:54.397 "is_configured": false, 00:29:54.397 "data_offset": 2048, 00:29:54.397 "data_size": 63488 00:29:54.397 }, 00:29:54.397 { 00:29:54.397 "name": "BaseBdev2", 00:29:54.397 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:54.397 "is_configured": true, 00:29:54.397 "data_offset": 2048, 00:29:54.397 "data_size": 63488 00:29:54.397 }, 00:29:54.397 { 00:29:54.397 "name": "BaseBdev3", 00:29:54.397 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:54.397 "is_configured": true, 00:29:54.397 "data_offset": 2048, 00:29:54.397 "data_size": 63488 00:29:54.397 }, 00:29:54.397 { 00:29:54.397 "name": "BaseBdev4", 00:29:54.397 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:54.397 "is_configured": true, 00:29:54.397 "data_offset": 2048, 00:29:54.397 "data_size": 63488 00:29:54.397 } 00:29:54.397 ] 00:29:54.397 }' 00:29:54.397 12:32:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:54.397 12:32:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:54.965 12:32:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.965 12:32:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:29:55.223 12:32:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:29:55.223 12:32:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:29:55.223 12:32:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:55.481 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b NewBaseBdev -u 907f27e1-c9d0-495d-b830-1b6291c5133a 00:29:55.787 [2024-06-07 12:32:19.383550] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:29:55.787 [2024-06-07 12:32:19.384059] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:29:55.787 NewBaseBdev 00:29:55.787 [2024-06-07 12:32:19.385809] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:29:55.787 [2024-06-07 12:32:19.385965] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:29:55.787 [2024-06-07 12:32:19.386303] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:29:55.787 [2024-06-07 12:32:19.386423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000008180 00:29:55.787 [2024-06-07 12:32:19.386582] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:55.787 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:29:55.787 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:29:55.787 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:55.787 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:29:55.787 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:55.787 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:55.787 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:56.355 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:29:56.355 [ 00:29:56.355 { 00:29:56.355 "name": "NewBaseBdev", 00:29:56.355 "aliases": [ 00:29:56.355 "907f27e1-c9d0-495d-b830-1b6291c5133a" 00:29:56.355 ], 00:29:56.355 "product_name": "Malloc disk", 00:29:56.355 "block_size": 512, 00:29:56.355 "num_blocks": 65536, 00:29:56.355 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:56.355 "assigned_rate_limits": { 00:29:56.355 "rw_ios_per_sec": 0, 00:29:56.355 "rw_mbytes_per_sec": 0, 00:29:56.355 "r_mbytes_per_sec": 0, 00:29:56.355 "w_mbytes_per_sec": 0 00:29:56.355 }, 00:29:56.355 "claimed": true, 00:29:56.355 "claim_type": "exclusive_write", 00:29:56.355 "zoned": false, 00:29:56.355 "supported_io_types": { 00:29:56.355 "read": true, 00:29:56.355 "write": true, 00:29:56.355 "unmap": true, 00:29:56.355 "write_zeroes": true, 00:29:56.355 "flush": true, 00:29:56.355 "reset": true, 00:29:56.355 "compare": false, 00:29:56.355 "compare_and_write": false, 00:29:56.355 "abort": true, 00:29:56.355 "nvme_admin": false, 00:29:56.355 "nvme_io": false 00:29:56.355 }, 00:29:56.355 "memory_domains": [ 00:29:56.355 { 00:29:56.355 "dma_device_id": "system", 00:29:56.355 "dma_device_type": 1 00:29:56.355 }, 00:29:56.355 { 00:29:56.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:56.355 "dma_device_type": 2 00:29:56.355 } 00:29:56.355 ], 00:29:56.356 "driver_specific": {} 00:29:56.356 } 00:29:56.356 ] 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 
-- # return 0 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:56.356 12:32:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:56.922 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:56.922 "name": "Existed_Raid", 00:29:56.922 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:56.922 "strip_size_kb": 64, 00:29:56.922 "state": "online", 00:29:56.922 "raid_level": "raid0", 00:29:56.922 "superblock": true, 00:29:56.922 "num_base_bdevs": 4, 00:29:56.922 "num_base_bdevs_discovered": 4, 00:29:56.922 "num_base_bdevs_operational": 4, 00:29:56.922 "base_bdevs_list": [ 00:29:56.922 { 00:29:56.922 "name": "NewBaseBdev", 00:29:56.922 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:56.922 "is_configured": true, 00:29:56.922 "data_offset": 2048, 00:29:56.922 "data_size": 63488 00:29:56.922 }, 00:29:56.922 { 00:29:56.922 "name": "BaseBdev2", 00:29:56.922 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:56.922 "is_configured": true, 00:29:56.922 "data_offset": 2048, 00:29:56.922 "data_size": 63488 00:29:56.922 }, 00:29:56.922 { 00:29:56.922 "name": "BaseBdev3", 00:29:56.922 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:56.922 "is_configured": true, 00:29:56.922 "data_offset": 2048, 00:29:56.922 "data_size": 63488 00:29:56.922 }, 00:29:56.922 { 00:29:56.922 "name": "BaseBdev4", 00:29:56.922 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:56.922 "is_configured": true, 00:29:56.922 "data_offset": 2048, 00:29:56.922 "data_size": 63488 00:29:56.922 } 00:29:56.922 ] 00:29:56.922 }' 00:29:56.922 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:56.922 12:32:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:57.490 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:29:57.490 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:57.490 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:57.490 12:32:20 
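The sequence above is the recovery path: the malloc bdev behind slot 0 was deleted outright, so the test recreates it under a new name (NewBaseBdev) but with the original UUID 907f27e1-c9d0-495d-b830-1b6291c5133a. Matching the UUID the raid still remembers is what lets it reclaim the slot and move from "configuring" to "online" once all four base bdevs are discovered. Condensed to the commands that appear in the trace (the uuid variable is illustrative shorthand):

rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Read back the UUID the raid records for the missing slot.
uuid=$($rpc bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
# Recreate a 32 MiB, 512 B-block malloc bdev carrying that exact UUID.
$rpc bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"
# Let the examine path run so the raid can claim the new bdev, then
# poll (2000 ms timeout) until it is visible, as waitforbdev does.
$rpc bdev_wait_for_examine
$rpc bdev_get_bdevs -b NewBaseBdev -t 2000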
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:57.490 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:57.490 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:29:57.490 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:57.490 12:32:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:57.748 [2024-06-07 12:32:21.251549] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:57.748 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:57.748 "name": "Existed_Raid", 00:29:57.748 "aliases": [ 00:29:57.748 "d63f368b-4708-4e6c-9545-cfba434ea1e7" 00:29:57.748 ], 00:29:57.748 "product_name": "Raid Volume", 00:29:57.748 "block_size": 512, 00:29:57.748 "num_blocks": 253952, 00:29:57.748 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:57.748 "assigned_rate_limits": { 00:29:57.748 "rw_ios_per_sec": 0, 00:29:57.748 "rw_mbytes_per_sec": 0, 00:29:57.748 "r_mbytes_per_sec": 0, 00:29:57.748 "w_mbytes_per_sec": 0 00:29:57.748 }, 00:29:57.748 "claimed": false, 00:29:57.748 "zoned": false, 00:29:57.748 "supported_io_types": { 00:29:57.748 "read": true, 00:29:57.748 "write": true, 00:29:57.748 "unmap": true, 00:29:57.748 "write_zeroes": true, 00:29:57.748 "flush": true, 00:29:57.748 "reset": true, 00:29:57.748 "compare": false, 00:29:57.748 "compare_and_write": false, 00:29:57.748 "abort": false, 00:29:57.748 "nvme_admin": false, 00:29:57.748 "nvme_io": false 00:29:57.748 }, 00:29:57.748 "memory_domains": [ 00:29:57.748 { 00:29:57.748 "dma_device_id": "system", 00:29:57.748 "dma_device_type": 1 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:57.748 "dma_device_type": 2 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "dma_device_id": "system", 00:29:57.748 "dma_device_type": 1 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:57.748 "dma_device_type": 2 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "dma_device_id": "system", 00:29:57.748 "dma_device_type": 1 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:57.748 "dma_device_type": 2 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "dma_device_id": "system", 00:29:57.748 "dma_device_type": 1 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:57.748 "dma_device_type": 2 00:29:57.748 } 00:29:57.748 ], 00:29:57.748 "driver_specific": { 00:29:57.748 "raid": { 00:29:57.748 "uuid": "d63f368b-4708-4e6c-9545-cfba434ea1e7", 00:29:57.748 "strip_size_kb": 64, 00:29:57.748 "state": "online", 00:29:57.748 "raid_level": "raid0", 00:29:57.748 "superblock": true, 00:29:57.748 "num_base_bdevs": 4, 00:29:57.748 "num_base_bdevs_discovered": 4, 00:29:57.748 "num_base_bdevs_operational": 4, 00:29:57.748 "base_bdevs_list": [ 00:29:57.748 { 00:29:57.748 "name": "NewBaseBdev", 00:29:57.748 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:57.748 "is_configured": true, 00:29:57.748 "data_offset": 2048, 00:29:57.748 "data_size": 63488 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "name": "BaseBdev2", 00:29:57.748 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:57.748 "is_configured": true, 00:29:57.748 "data_offset": 2048, 
00:29:57.748 "data_size": 63488 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "name": "BaseBdev3", 00:29:57.748 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:57.748 "is_configured": true, 00:29:57.748 "data_offset": 2048, 00:29:57.748 "data_size": 63488 00:29:57.748 }, 00:29:57.748 { 00:29:57.748 "name": "BaseBdev4", 00:29:57.748 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:29:57.748 "is_configured": true, 00:29:57.748 "data_offset": 2048, 00:29:57.748 "data_size": 63488 00:29:57.748 } 00:29:57.748 ] 00:29:57.748 } 00:29:57.748 } 00:29:57.748 }' 00:29:57.748 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:57.748 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:29:57.748 BaseBdev2 00:29:57.748 BaseBdev3 00:29:57.748 BaseBdev4' 00:29:57.748 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:57.748 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:57.748 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:29:58.314 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:58.314 "name": "NewBaseBdev", 00:29:58.314 "aliases": [ 00:29:58.314 "907f27e1-c9d0-495d-b830-1b6291c5133a" 00:29:58.314 ], 00:29:58.314 "product_name": "Malloc disk", 00:29:58.314 "block_size": 512, 00:29:58.314 "num_blocks": 65536, 00:29:58.314 "uuid": "907f27e1-c9d0-495d-b830-1b6291c5133a", 00:29:58.314 "assigned_rate_limits": { 00:29:58.314 "rw_ios_per_sec": 0, 00:29:58.314 "rw_mbytes_per_sec": 0, 00:29:58.314 "r_mbytes_per_sec": 0, 00:29:58.314 "w_mbytes_per_sec": 0 00:29:58.314 }, 00:29:58.314 "claimed": true, 00:29:58.314 "claim_type": "exclusive_write", 00:29:58.314 "zoned": false, 00:29:58.314 "supported_io_types": { 00:29:58.314 "read": true, 00:29:58.314 "write": true, 00:29:58.314 "unmap": true, 00:29:58.314 "write_zeroes": true, 00:29:58.314 "flush": true, 00:29:58.314 "reset": true, 00:29:58.314 "compare": false, 00:29:58.314 "compare_and_write": false, 00:29:58.314 "abort": true, 00:29:58.314 "nvme_admin": false, 00:29:58.314 "nvme_io": false 00:29:58.314 }, 00:29:58.314 "memory_domains": [ 00:29:58.314 { 00:29:58.314 "dma_device_id": "system", 00:29:58.314 "dma_device_type": 1 00:29:58.314 }, 00:29:58.314 { 00:29:58.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:58.314 "dma_device_type": 2 00:29:58.314 } 00:29:58.314 ], 00:29:58.314 "driver_specific": {} 00:29:58.314 }' 00:29:58.314 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:58.314 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:58.314 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:58.314 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:58.314 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:58.315 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:58.315 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:58.315 12:32:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:58.315 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:58.315 12:32:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:58.573 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:58.573 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:58.573 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:58.573 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:58.573 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:58.832 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:58.832 "name": "BaseBdev2", 00:29:58.832 "aliases": [ 00:29:58.832 "bee1a0f2-887b-434d-b593-81b7c7d8457f" 00:29:58.832 ], 00:29:58.832 "product_name": "Malloc disk", 00:29:58.832 "block_size": 512, 00:29:58.832 "num_blocks": 65536, 00:29:58.832 "uuid": "bee1a0f2-887b-434d-b593-81b7c7d8457f", 00:29:58.832 "assigned_rate_limits": { 00:29:58.832 "rw_ios_per_sec": 0, 00:29:58.832 "rw_mbytes_per_sec": 0, 00:29:58.832 "r_mbytes_per_sec": 0, 00:29:58.832 "w_mbytes_per_sec": 0 00:29:58.832 }, 00:29:58.832 "claimed": true, 00:29:58.832 "claim_type": "exclusive_write", 00:29:58.832 "zoned": false, 00:29:58.832 "supported_io_types": { 00:29:58.832 "read": true, 00:29:58.832 "write": true, 00:29:58.832 "unmap": true, 00:29:58.832 "write_zeroes": true, 00:29:58.832 "flush": true, 00:29:58.832 "reset": true, 00:29:58.832 "compare": false, 00:29:58.832 "compare_and_write": false, 00:29:58.832 "abort": true, 00:29:58.832 "nvme_admin": false, 00:29:58.832 "nvme_io": false 00:29:58.832 }, 00:29:58.832 "memory_domains": [ 00:29:58.832 { 00:29:58.832 "dma_device_id": "system", 00:29:58.832 "dma_device_type": 1 00:29:58.832 }, 00:29:58.832 { 00:29:58.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:58.832 "dma_device_type": 2 00:29:58.832 } 00:29:58.832 ], 00:29:58.832 "driver_specific": {} 00:29:58.832 }' 00:29:58.832 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:58.832 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:59.090 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:59.349 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:59.349 12:32:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:59.349 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:59.349 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:29:59.349 12:32:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:59.605 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:59.605 "name": "BaseBdev3", 00:29:59.605 "aliases": [ 00:29:59.605 "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00" 00:29:59.605 ], 00:29:59.605 "product_name": "Malloc disk", 00:29:59.605 "block_size": 512, 00:29:59.605 "num_blocks": 65536, 00:29:59.605 "uuid": "a0fdaaa5-87ed-4262-b66a-3ec6ec45cd00", 00:29:59.605 "assigned_rate_limits": { 00:29:59.605 "rw_ios_per_sec": 0, 00:29:59.605 "rw_mbytes_per_sec": 0, 00:29:59.605 "r_mbytes_per_sec": 0, 00:29:59.605 "w_mbytes_per_sec": 0 00:29:59.605 }, 00:29:59.605 "claimed": true, 00:29:59.605 "claim_type": "exclusive_write", 00:29:59.605 "zoned": false, 00:29:59.605 "supported_io_types": { 00:29:59.605 "read": true, 00:29:59.605 "write": true, 00:29:59.605 "unmap": true, 00:29:59.605 "write_zeroes": true, 00:29:59.605 "flush": true, 00:29:59.605 "reset": true, 00:29:59.605 "compare": false, 00:29:59.605 "compare_and_write": false, 00:29:59.605 "abort": true, 00:29:59.605 "nvme_admin": false, 00:29:59.605 "nvme_io": false 00:29:59.605 }, 00:29:59.605 "memory_domains": [ 00:29:59.605 { 00:29:59.605 "dma_device_id": "system", 00:29:59.605 "dma_device_type": 1 00:29:59.605 }, 00:29:59.605 { 00:29:59.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:59.605 "dma_device_type": 2 00:29:59.605 } 00:29:59.605 ], 00:29:59.605 "driver_specific": {} 00:29:59.605 }' 00:29:59.605 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:59.605 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:59.605 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:29:59.605 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:59.605 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:59.605 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:59.863 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:29:59.863 12:32:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:00.121 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:00.121 "name": "BaseBdev4", 00:30:00.121 "aliases": [ 00:30:00.121 "4010d6f5-c9cd-472d-a95a-104adfba7290" 00:30:00.121 ], 00:30:00.121 "product_name": "Malloc disk", 00:30:00.121 "block_size": 512, 00:30:00.121 "num_blocks": 65536, 00:30:00.121 "uuid": "4010d6f5-c9cd-472d-a95a-104adfba7290", 00:30:00.121 "assigned_rate_limits": { 00:30:00.121 "rw_ios_per_sec": 0, 00:30:00.121 "rw_mbytes_per_sec": 0, 00:30:00.121 "r_mbytes_per_sec": 0, 00:30:00.121 "w_mbytes_per_sec": 0 00:30:00.121 }, 00:30:00.121 "claimed": true, 00:30:00.121 "claim_type": "exclusive_write", 00:30:00.121 "zoned": false, 00:30:00.121 "supported_io_types": { 00:30:00.121 "read": true, 00:30:00.121 "write": true, 00:30:00.121 "unmap": true, 00:30:00.121 "write_zeroes": true, 00:30:00.121 "flush": true, 00:30:00.121 "reset": true, 00:30:00.121 "compare": false, 00:30:00.121 "compare_and_write": false, 00:30:00.121 "abort": true, 00:30:00.121 "nvme_admin": false, 00:30:00.121 "nvme_io": false 00:30:00.121 }, 00:30:00.121 "memory_domains": [ 00:30:00.121 { 00:30:00.121 "dma_device_id": "system", 00:30:00.121 "dma_device_type": 1 00:30:00.121 }, 00:30:00.121 { 00:30:00.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:00.121 "dma_device_type": 2 00:30:00.121 } 00:30:00.121 ], 00:30:00.121 "driver_specific": {} 00:30:00.121 }' 00:30:00.121 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:00.380 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:00.380 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:00.380 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:00.380 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:00.380 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:00.380 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:00.380 12:32:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:00.380 12:32:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:00.380 12:32:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:00.638 12:32:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:00.638 12:32:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:00.638 12:32:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:00.897 [2024-06-07 12:32:24.391907] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:00.897 [2024-06-07 12:32:24.392208] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:00.897 [2024-06-07 12:32:24.392430] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:00.897 [2024-06-07 12:32:24.392599] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:00.897 [2024-06-07 12:32:24.392730] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name Existed_Raid, state offline 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 212596 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 212596 ']' 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 212596 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 212596 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 212596' 00:30:00.897 killing process with pid 212596 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 212596 00:30:00.897 [2024-06-07 12:32:24.447366] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:00.897 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 212596 00:30:00.897 [2024-06-07 12:32:24.529020] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:01.464 12:32:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:30:01.464 00:30:01.464 real 0m39.373s 00:30:01.464 user 1m12.361s 00:30:01.464 sys 0m6.036s 00:30:01.464 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:01.464 12:32:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:01.464 ************************************ 00:30:01.464 END TEST raid_state_function_test_sb 00:30:01.464 ************************************ 00:30:01.464 12:32:24 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:30:01.464 12:32:24 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:30:01.464 12:32:24 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:01.464 12:32:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:01.465 ************************************ 00:30:01.465 START TEST raid_superblock_test 00:30:01.465 ************************************ 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 4 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # 
base_bdevs_pt_uuid=() 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=213744 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 213744 /var/tmp/spdk-raid.sock 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 213744 ']' 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:01.465 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:01.465 12:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:01.465 [2024-06-07 12:32:25.002076] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
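raid_superblock_test starts its own bdev_svc application on a private RPC socket rather than reusing state from the previous test. Stripped down, the startup visible above amounts to the following (waitforlisten is the shared autotest helper that polls until the socket accepts RPCs):

# Launch the bdev service app with raid debug logging on a dedicated socket.
/home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc \
    -r /var/tmp/spdk-raid.sock -L bdev_raid &
raid_pid=$!
# Block until the app (pid 213744 in this run) is listening on that socket.
waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock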
00:30:01.465 [2024-06-07 12:32:25.002501] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid213744 ] 00:30:01.723 [2024-06-07 12:32:25.139327] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:01.723 [2024-06-07 12:32:25.233313] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:01.723 [2024-06-07 12:32:25.320754] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:01.982 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:30:02.240 malloc1 00:30:02.240 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:02.498 [2024-06-07 12:32:25.962106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:02.498 [2024-06-07 12:32:25.962709] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:02.498 [2024-06-07 12:32:25.962952] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:30:02.498 [2024-06-07 12:32:25.963253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:02.498 [2024-06-07 12:32:25.965934] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:02.498 [2024-06-07 12:32:25.966196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:02.498 pt1 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:02.498 12:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:30:02.756 malloc2 00:30:02.756 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:02.756 [2024-06-07 12:32:26.398275] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:02.756 [2024-06-07 12:32:26.398950] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:02.756 [2024-06-07 12:32:26.399170] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:30:02.756 [2024-06-07 12:32:26.399384] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:02.756 [2024-06-07 12:32:26.401657] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:02.756 [2024-06-07 12:32:26.401865] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:03.014 pt2 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:03.014 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:30:03.275 malloc3 00:30:03.275 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:30:03.275 [2024-06-07 12:32:26.911526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:30:03.275 [2024-06-07 12:32:26.911919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:03.275 [2024-06-07 12:32:26.912137] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:30:03.275 [2024-06-07 12:32:26.912390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:03.275 [2024-06-07 12:32:26.915260] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:03.275 [2024-06-07 12:32:26.915475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:30:03.275 pt3 00:30:03.533 12:32:26 
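Each base device here is built in two layers: a malloc bdev for storage, wrapped in a passthru bdev (ptN) that is given a fixed, human-readable UUID. The superblock identifies base bdevs by UUID, so pinning them presumably keeps the later discovery assertions deterministic. One iteration of the @415-425 loop above, as it ran for pt2:

rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
i=2
# 32 MiB of RAM with a 512 B block size backs each base device.
$rpc bdev_malloc_create 32 512 -b "malloc$i"
# Wrap it in a passthru bdev with a predictable UUID for slot $i.
$rpc bdev_passthru_create -b "malloc$i" -p "pt$i" \
    -u "00000000-0000-0000-0000-00000000000$i"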
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:03.533 12:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:30:03.791 malloc4 00:30:03.791 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:30:03.791 [2024-06-07 12:32:27.415755] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:30:03.791 [2024-06-07 12:32:27.416101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:03.791 [2024-06-07 12:32:27.416215] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007e80 00:30:03.791 [2024-06-07 12:32:27.416450] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:03.791 [2024-06-07 12:32:27.418961] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:03.791 [2024-06-07 12:32:27.419167] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:30:03.791 pt4 00:30:04.050 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:04.050 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:04.050 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:30:04.308 [2024-06-07 12:32:27.715903] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:04.308 [2024-06-07 12:32:27.718431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:04.308 [2024-06-07 12:32:27.718684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:30:04.308 [2024-06-07 12:32:27.718843] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:30:04.308 [2024-06-07 12:32:27.719098] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008480 00:30:04.308 [2024-06-07 12:32:27.719212] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:30:04.308 [2024-06-07 12:32:27.719467] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:30:04.308 [2024-06-07 12:32:27.719996] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008480 00:30:04.308 [2024-06-07 12:32:27.720124] bdev_raid.c:1725:raid_bdev_configure_cont: 
*DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008480 00:30:04.308 [2024-06-07 12:32:27.720428] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.308 12:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:04.566 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:04.566 "name": "raid_bdev1", 00:30:04.566 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:04.566 "strip_size_kb": 64, 00:30:04.566 "state": "online", 00:30:04.566 "raid_level": "raid0", 00:30:04.566 "superblock": true, 00:30:04.566 "num_base_bdevs": 4, 00:30:04.566 "num_base_bdevs_discovered": 4, 00:30:04.566 "num_base_bdevs_operational": 4, 00:30:04.566 "base_bdevs_list": [ 00:30:04.566 { 00:30:04.566 "name": "pt1", 00:30:04.566 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:04.566 "is_configured": true, 00:30:04.566 "data_offset": 2048, 00:30:04.566 "data_size": 63488 00:30:04.566 }, 00:30:04.566 { 00:30:04.566 "name": "pt2", 00:30:04.566 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:04.566 "is_configured": true, 00:30:04.566 "data_offset": 2048, 00:30:04.566 "data_size": 63488 00:30:04.566 }, 00:30:04.566 { 00:30:04.566 "name": "pt3", 00:30:04.566 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:04.566 "is_configured": true, 00:30:04.566 "data_offset": 2048, 00:30:04.566 "data_size": 63488 00:30:04.566 }, 00:30:04.566 { 00:30:04.566 "name": "pt4", 00:30:04.566 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:04.566 "is_configured": true, 00:30:04.566 "data_offset": 2048, 00:30:04.566 "data_size": 63488 00:30:04.566 } 00:30:04.566 ] 00:30:04.566 }' 00:30:04.566 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:04.566 12:32:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:05.132 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:30:05.132 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:05.132 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:05.132 
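The bdev_raid_create call above is the step under test: it assembles the four passthru bdevs into a raid0 with a 64 KiB strip, and the -s flag enables the on-disk superblock that this test (and the _sb state-function variant before it) exercises. As invoked in the trace:

rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# -z 64: strip size in KiB; -r raid0: raid level; -s: write a superblock to each base bdev.
$rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

The resulting geometry is consistent with the dumps that follow: the superblock reserves the first 2048 blocks of each 65536-block base bdev (data_offset 2048, data_size 63488), and 4 x 63488 = 253952 matches the raid volume's num_blocks.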
12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:05.132 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:05.132 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:30:05.132 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:05.132 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:05.398 [2024-06-07 12:32:28.928816] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:05.398 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:05.398 "name": "raid_bdev1", 00:30:05.398 "aliases": [ 00:30:05.398 "677dfde6-5b45-4cde-b3cb-25c71c132486" 00:30:05.398 ], 00:30:05.398 "product_name": "Raid Volume", 00:30:05.398 "block_size": 512, 00:30:05.398 "num_blocks": 253952, 00:30:05.398 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:05.398 "assigned_rate_limits": { 00:30:05.398 "rw_ios_per_sec": 0, 00:30:05.398 "rw_mbytes_per_sec": 0, 00:30:05.398 "r_mbytes_per_sec": 0, 00:30:05.398 "w_mbytes_per_sec": 0 00:30:05.398 }, 00:30:05.398 "claimed": false, 00:30:05.398 "zoned": false, 00:30:05.398 "supported_io_types": { 00:30:05.398 "read": true, 00:30:05.398 "write": true, 00:30:05.398 "unmap": true, 00:30:05.398 "write_zeroes": true, 00:30:05.398 "flush": true, 00:30:05.398 "reset": true, 00:30:05.398 "compare": false, 00:30:05.398 "compare_and_write": false, 00:30:05.398 "abort": false, 00:30:05.398 "nvme_admin": false, 00:30:05.398 "nvme_io": false 00:30:05.398 }, 00:30:05.398 "memory_domains": [ 00:30:05.398 { 00:30:05.398 "dma_device_id": "system", 00:30:05.398 "dma_device_type": 1 00:30:05.398 }, 00:30:05.398 { 00:30:05.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.398 "dma_device_type": 2 00:30:05.398 }, 00:30:05.399 { 00:30:05.399 "dma_device_id": "system", 00:30:05.399 "dma_device_type": 1 00:30:05.399 }, 00:30:05.399 { 00:30:05.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.399 "dma_device_type": 2 00:30:05.399 }, 00:30:05.399 { 00:30:05.399 "dma_device_id": "system", 00:30:05.399 "dma_device_type": 1 00:30:05.399 }, 00:30:05.399 { 00:30:05.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.399 "dma_device_type": 2 00:30:05.399 }, 00:30:05.399 { 00:30:05.399 "dma_device_id": "system", 00:30:05.399 "dma_device_type": 1 00:30:05.399 }, 00:30:05.399 { 00:30:05.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.399 "dma_device_type": 2 00:30:05.399 } 00:30:05.399 ], 00:30:05.399 "driver_specific": { 00:30:05.399 "raid": { 00:30:05.399 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:05.399 "strip_size_kb": 64, 00:30:05.399 "state": "online", 00:30:05.399 "raid_level": "raid0", 00:30:05.399 "superblock": true, 00:30:05.399 "num_base_bdevs": 4, 00:30:05.399 "num_base_bdevs_discovered": 4, 00:30:05.399 "num_base_bdevs_operational": 4, 00:30:05.399 "base_bdevs_list": [ 00:30:05.399 { 00:30:05.399 "name": "pt1", 00:30:05.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:05.399 "is_configured": true, 00:30:05.399 "data_offset": 2048, 00:30:05.399 "data_size": 63488 00:30:05.399 }, 00:30:05.399 { 00:30:05.399 "name": "pt2", 00:30:05.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:05.399 "is_configured": true, 00:30:05.399 "data_offset": 2048, 00:30:05.399 "data_size": 63488 00:30:05.399 }, 00:30:05.399 
{ 00:30:05.399 "name": "pt3", 00:30:05.399 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:05.399 "is_configured": true, 00:30:05.399 "data_offset": 2048, 00:30:05.399 "data_size": 63488 00:30:05.399 }, 00:30:05.399 { 00:30:05.399 "name": "pt4", 00:30:05.399 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:05.399 "is_configured": true, 00:30:05.399 "data_offset": 2048, 00:30:05.399 "data_size": 63488 00:30:05.399 } 00:30:05.399 ] 00:30:05.399 } 00:30:05.399 } 00:30:05.399 }' 00:30:05.399 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:05.399 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:05.399 pt2 00:30:05.399 pt3 00:30:05.399 pt4' 00:30:05.399 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:05.399 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:05.399 12:32:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:05.658 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:05.658 "name": "pt1", 00:30:05.658 "aliases": [ 00:30:05.658 "00000000-0000-0000-0000-000000000001" 00:30:05.658 ], 00:30:05.658 "product_name": "passthru", 00:30:05.658 "block_size": 512, 00:30:05.658 "num_blocks": 65536, 00:30:05.658 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:05.658 "assigned_rate_limits": { 00:30:05.658 "rw_ios_per_sec": 0, 00:30:05.658 "rw_mbytes_per_sec": 0, 00:30:05.658 "r_mbytes_per_sec": 0, 00:30:05.658 "w_mbytes_per_sec": 0 00:30:05.658 }, 00:30:05.658 "claimed": true, 00:30:05.658 "claim_type": "exclusive_write", 00:30:05.658 "zoned": false, 00:30:05.658 "supported_io_types": { 00:30:05.658 "read": true, 00:30:05.658 "write": true, 00:30:05.658 "unmap": true, 00:30:05.658 "write_zeroes": true, 00:30:05.658 "flush": true, 00:30:05.658 "reset": true, 00:30:05.658 "compare": false, 00:30:05.658 "compare_and_write": false, 00:30:05.658 "abort": true, 00:30:05.658 "nvme_admin": false, 00:30:05.658 "nvme_io": false 00:30:05.658 }, 00:30:05.658 "memory_domains": [ 00:30:05.658 { 00:30:05.658 "dma_device_id": "system", 00:30:05.658 "dma_device_type": 1 00:30:05.658 }, 00:30:05.658 { 00:30:05.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.658 "dma_device_type": 2 00:30:05.658 } 00:30:05.658 ], 00:30:05.658 "driver_specific": { 00:30:05.658 "passthru": { 00:30:05.658 "name": "pt1", 00:30:05.658 "base_bdev_name": "malloc1" 00:30:05.658 } 00:30:05.658 } 00:30:05.658 }' 00:30:05.658 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:05.915 12:32:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:05.915 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.175 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.175 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:06.175 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:06.175 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:06.175 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:06.434 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:06.434 "name": "pt2", 00:30:06.434 "aliases": [ 00:30:06.434 "00000000-0000-0000-0000-000000000002" 00:30:06.435 ], 00:30:06.435 "product_name": "passthru", 00:30:06.435 "block_size": 512, 00:30:06.435 "num_blocks": 65536, 00:30:06.435 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:06.435 "assigned_rate_limits": { 00:30:06.435 "rw_ios_per_sec": 0, 00:30:06.435 "rw_mbytes_per_sec": 0, 00:30:06.435 "r_mbytes_per_sec": 0, 00:30:06.435 "w_mbytes_per_sec": 0 00:30:06.435 }, 00:30:06.435 "claimed": true, 00:30:06.435 "claim_type": "exclusive_write", 00:30:06.435 "zoned": false, 00:30:06.435 "supported_io_types": { 00:30:06.435 "read": true, 00:30:06.435 "write": true, 00:30:06.435 "unmap": true, 00:30:06.435 "write_zeroes": true, 00:30:06.435 "flush": true, 00:30:06.435 "reset": true, 00:30:06.435 "compare": false, 00:30:06.435 "compare_and_write": false, 00:30:06.435 "abort": true, 00:30:06.435 "nvme_admin": false, 00:30:06.435 "nvme_io": false 00:30:06.435 }, 00:30:06.435 "memory_domains": [ 00:30:06.435 { 00:30:06.435 "dma_device_id": "system", 00:30:06.435 "dma_device_type": 1 00:30:06.435 }, 00:30:06.435 { 00:30:06.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:06.435 "dma_device_type": 2 00:30:06.435 } 00:30:06.435 ], 00:30:06.435 "driver_specific": { 00:30:06.435 "passthru": { 00:30:06.435 "name": "pt2", 00:30:06.435 "base_bdev_name": "malloc2" 00:30:06.435 } 00:30:06.435 } 00:30:06.435 }' 00:30:06.435 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:06.435 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:06.435 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:06.435 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:06.435 12:32:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:06.435 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:06.435 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:06.435 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:06.694 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:06.694 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.694 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.694 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:06.694 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:30:06.694 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:06.694 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:30:06.952 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:06.952 "name": "pt3", 00:30:06.952 "aliases": [ 00:30:06.952 "00000000-0000-0000-0000-000000000003" 00:30:06.952 ], 00:30:06.952 "product_name": "passthru", 00:30:06.952 "block_size": 512, 00:30:06.952 "num_blocks": 65536, 00:30:06.952 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:06.952 "assigned_rate_limits": { 00:30:06.952 "rw_ios_per_sec": 0, 00:30:06.952 "rw_mbytes_per_sec": 0, 00:30:06.952 "r_mbytes_per_sec": 0, 00:30:06.952 "w_mbytes_per_sec": 0 00:30:06.952 }, 00:30:06.952 "claimed": true, 00:30:06.952 "claim_type": "exclusive_write", 00:30:06.952 "zoned": false, 00:30:06.952 "supported_io_types": { 00:30:06.952 "read": true, 00:30:06.952 "write": true, 00:30:06.952 "unmap": true, 00:30:06.952 "write_zeroes": true, 00:30:06.952 "flush": true, 00:30:06.952 "reset": true, 00:30:06.952 "compare": false, 00:30:06.952 "compare_and_write": false, 00:30:06.952 "abort": true, 00:30:06.952 "nvme_admin": false, 00:30:06.952 "nvme_io": false 00:30:06.952 }, 00:30:06.952 "memory_domains": [ 00:30:06.952 { 00:30:06.952 "dma_device_id": "system", 00:30:06.952 "dma_device_type": 1 00:30:06.952 }, 00:30:06.952 { 00:30:06.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:06.952 "dma_device_type": 2 00:30:06.952 } 00:30:06.952 ], 00:30:06.952 "driver_specific": { 00:30:06.952 "passthru": { 00:30:06.952 "name": "pt3", 00:30:06.952 "base_bdev_name": "malloc3" 00:30:06.952 } 00:30:06.952 } 00:30:06.952 }' 00:30:06.952 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:06.952 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:06.952 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:06.952 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:30:07.211 12:32:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:07.470 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:07.470 "name": "pt4", 00:30:07.470 "aliases": [ 
00:30:07.470 "00000000-0000-0000-0000-000000000004" 00:30:07.470 ], 00:30:07.470 "product_name": "passthru", 00:30:07.470 "block_size": 512, 00:30:07.470 "num_blocks": 65536, 00:30:07.470 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:07.470 "assigned_rate_limits": { 00:30:07.470 "rw_ios_per_sec": 0, 00:30:07.470 "rw_mbytes_per_sec": 0, 00:30:07.470 "r_mbytes_per_sec": 0, 00:30:07.470 "w_mbytes_per_sec": 0 00:30:07.470 }, 00:30:07.470 "claimed": true, 00:30:07.470 "claim_type": "exclusive_write", 00:30:07.470 "zoned": false, 00:30:07.470 "supported_io_types": { 00:30:07.470 "read": true, 00:30:07.470 "write": true, 00:30:07.470 "unmap": true, 00:30:07.470 "write_zeroes": true, 00:30:07.470 "flush": true, 00:30:07.470 "reset": true, 00:30:07.470 "compare": false, 00:30:07.470 "compare_and_write": false, 00:30:07.470 "abort": true, 00:30:07.470 "nvme_admin": false, 00:30:07.470 "nvme_io": false 00:30:07.470 }, 00:30:07.470 "memory_domains": [ 00:30:07.470 { 00:30:07.470 "dma_device_id": "system", 00:30:07.470 "dma_device_type": 1 00:30:07.470 }, 00:30:07.470 { 00:30:07.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:07.470 "dma_device_type": 2 00:30:07.470 } 00:30:07.470 ], 00:30:07.470 "driver_specific": { 00:30:07.470 "passthru": { 00:30:07.470 "name": "pt4", 00:30:07.470 "base_bdev_name": "malloc4" 00:30:07.470 } 00:30:07.470 } 00:30:07.470 }' 00:30:07.470 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:07.470 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:07.470 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:07.729 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:30:08.296 [2024-06-07 12:32:31.641143] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:08.296 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=677dfde6-5b45-4cde-b3cb-25c71c132486 00:30:08.296 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 677dfde6-5b45-4cde-b3cb-25c71c132486 ']' 00:30:08.296 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:08.296 [2024-06-07 12:32:31.900921] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:08.296 
[2024-06-07 12:32:31.901196] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:08.296 [2024-06-07 12:32:31.901419] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:08.296 [2024-06-07 12:32:31.901592] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:08.296 [2024-06-07 12:32:31.901676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008480 name raid_bdev1, state offline 00:30:08.296 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.296 12:32:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:30:08.555 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:30:08.555 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:30:08.555 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:08.555 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:08.813 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:08.813 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:09.072 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:09.072 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:30:09.330 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:09.330 12:32:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:30:09.588 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:30:09.588 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # 
type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:30:09.845 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:30:10.103 [2024-06-07 12:32:33.605143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:30:10.103 [2024-06-07 12:32:33.607474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:30:10.103 [2024-06-07 12:32:33.607677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:30:10.103 [2024-06-07 12:32:33.607794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:30:10.103 [2024-06-07 12:32:33.607935] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:30:10.103 [2024-06-07 12:32:33.608156] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:30:10.103 [2024-06-07 12:32:33.608325] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:30:10.103 [2024-06-07 12:32:33.608471] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:30:10.103 [2024-06-07 12:32:33.608579] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:10.103 [2024-06-07 12:32:33.608662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state configuring 00:30:10.103 request: 00:30:10.103 { 00:30:10.103 "name": "raid_bdev1", 00:30:10.103 "raid_level": "raid0", 00:30:10.103 "base_bdevs": [ 00:30:10.103 "malloc1", 00:30:10.103 "malloc2", 00:30:10.103 "malloc3", 00:30:10.103 "malloc4" 00:30:10.103 ], 00:30:10.103 "superblock": false, 00:30:10.103 "strip_size_kb": 64, 00:30:10.103 "method": "bdev_raid_create", 00:30:10.103 "req_id": 1 00:30:10.103 } 00:30:10.103 Got JSON-RPC error response 00:30:10.103 response: 00:30:10.103 { 00:30:10.103 "code": -17, 00:30:10.103 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:30:10.103 } 00:30:10.103 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:30:10.103 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:30:10.103 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:30:10.103 12:32:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:30:10.103 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.103 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:30:10.361 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:30:10.362 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:30:10.362 12:32:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:10.618 [2024-06-07 12:32:34.045314] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:10.618 [2024-06-07 12:32:34.045607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:10.618 [2024-06-07 12:32:34.045680] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:30:10.618 [2024-06-07 12:32:34.045795] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:10.618 [2024-06-07 12:32:34.048110] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:10.618 [2024-06-07 12:32:34.048314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:10.618 [2024-06-07 12:32:34.048486] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:10.618 [2024-06-07 12:32:34.048566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:10.618 pt1 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:10.618 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.875 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:10.875 "name": "raid_bdev1", 00:30:10.875 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:10.875 "strip_size_kb": 64, 00:30:10.875 "state": "configuring", 00:30:10.875 "raid_level": "raid0", 00:30:10.875 "superblock": true, 00:30:10.875 "num_base_bdevs": 4, 00:30:10.875 "num_base_bdevs_discovered": 1, 00:30:10.875 "num_base_bdevs_operational": 4, 00:30:10.875 "base_bdevs_list": [ 00:30:10.875 { 00:30:10.875 "name": "pt1", 00:30:10.875 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:30:10.875 "is_configured": true, 00:30:10.875 "data_offset": 2048, 00:30:10.875 "data_size": 63488 00:30:10.875 }, 00:30:10.875 { 00:30:10.875 "name": null, 00:30:10.875 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:10.875 "is_configured": false, 00:30:10.875 "data_offset": 2048, 00:30:10.875 "data_size": 63488 00:30:10.875 }, 00:30:10.875 { 00:30:10.875 "name": null, 00:30:10.875 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:10.875 "is_configured": false, 00:30:10.875 "data_offset": 2048, 00:30:10.875 "data_size": 63488 00:30:10.875 }, 00:30:10.875 { 00:30:10.875 "name": null, 00:30:10.875 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:10.875 "is_configured": false, 00:30:10.875 "data_offset": 2048, 00:30:10.875 "data_size": 63488 00:30:10.875 } 00:30:10.875 ] 00:30:10.875 }' 00:30:10.875 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:10.875 12:32:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:11.440 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:30:11.440 12:32:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:11.698 [2024-06-07 12:32:35.173546] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:11.698 [2024-06-07 12:32:35.173820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:11.698 [2024-06-07 12:32:35.173917] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009980 00:30:11.698 [2024-06-07 12:32:35.174054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:11.698 [2024-06-07 12:32:35.174542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:11.698 [2024-06-07 12:32:35.174720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:11.698 [2024-06-07 12:32:35.174937] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:11.698 [2024-06-07 12:32:35.175071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:11.698 pt2 00:30:11.698 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:11.956 [2024-06-07 12:32:35.465628] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.956 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.214 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:12.214 "name": "raid_bdev1", 00:30:12.214 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:12.214 "strip_size_kb": 64, 00:30:12.214 "state": "configuring", 00:30:12.214 "raid_level": "raid0", 00:30:12.214 "superblock": true, 00:30:12.214 "num_base_bdevs": 4, 00:30:12.214 "num_base_bdevs_discovered": 1, 00:30:12.214 "num_base_bdevs_operational": 4, 00:30:12.214 "base_bdevs_list": [ 00:30:12.214 { 00:30:12.214 "name": "pt1", 00:30:12.214 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:12.214 "is_configured": true, 00:30:12.214 "data_offset": 2048, 00:30:12.214 "data_size": 63488 00:30:12.214 }, 00:30:12.214 { 00:30:12.214 "name": null, 00:30:12.214 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:12.214 "is_configured": false, 00:30:12.214 "data_offset": 2048, 00:30:12.214 "data_size": 63488 00:30:12.214 }, 00:30:12.214 { 00:30:12.214 "name": null, 00:30:12.214 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:12.214 "is_configured": false, 00:30:12.214 "data_offset": 2048, 00:30:12.214 "data_size": 63488 00:30:12.214 }, 00:30:12.214 { 00:30:12.214 "name": null, 00:30:12.214 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:12.214 "is_configured": false, 00:30:12.214 "data_offset": 2048, 00:30:12.214 "data_size": 63488 00:30:12.214 } 00:30:12.214 ] 00:30:12.214 }' 00:30:12.214 12:32:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:12.214 12:32:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:12.810 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:30:12.810 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:12.810 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:13.108 [2024-06-07 12:32:36.613696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:13.108 [2024-06-07 12:32:36.614039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:13.108 [2024-06-07 12:32:36.614133] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009c80 00:30:13.108 [2024-06-07 12:32:36.614294] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:13.108 [2024-06-07 12:32:36.614768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:13.108 [2024-06-07 12:32:36.614964] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:13.108 [2024-06-07 12:32:36.615186] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:13.108 [2024-06-07 12:32:36.615326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:13.108 pt2 
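The state checks above reduce to one RPC plus a jq filter: bdev_raid_get_bdevs all lists every raid bdev, and the harness selects the raid_bdev1 entry and compares its fields against the expected values. A minimal standalone sketch of that pattern, assuming a target listening on /var/tmp/spdk-raid.sock (the rpc variable is shorthand introduced here; the field names match the JSON dumped above):

    rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Pull the one raid bdev of interest out of the full listing.
    info=$($rpc bdev_raid_get_bdevs all | jq '.[] | select(.name == "raid_bdev1")')
    # Assert the fields verify_raid_bdev_state checks at this point in the test:
    # only pt1 has been re-created, so 1 of 4 base bdevs is discovered.
    [[ $(jq -r '.state' <<< "$info") == "configuring" ]]
    [[ $(jq -r '.num_base_bdevs_discovered' <<< "$info") -eq 1 ]]
    [[ $(jq -r '.raid_level' <<< "$info") == "raid0" ]]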
00:30:13.108 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:30:13.108 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:13.108 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:30:13.366 [2024-06-07 12:32:36.837742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:30:13.366 [2024-06-07 12:32:36.838061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:13.366 [2024-06-07 12:32:36.838126] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009f80 00:30:13.366 [2024-06-07 12:32:36.838267] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:13.366 [2024-06-07 12:32:36.838676] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:13.366 [2024-06-07 12:32:36.838845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:30:13.366 [2024-06-07 12:32:36.839025] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:30:13.366 [2024-06-07 12:32:36.839087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:30:13.366 pt3 00:30:13.366 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:30:13.366 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:13.366 12:32:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:30:13.626 [2024-06-07 12:32:37.125762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:30:13.626 [2024-06-07 12:32:37.126070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:13.626 [2024-06-07 12:32:37.126140] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a280 00:30:13.626 [2024-06-07 12:32:37.126323] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:13.626 [2024-06-07 12:32:37.126739] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:13.626 [2024-06-07 12:32:37.126918] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:30:13.626 [2024-06-07 12:32:37.127085] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:30:13.626 [2024-06-07 12:32:37.127180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:30:13.626 [2024-06-07 12:32:37.127318] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:30:13.626 [2024-06-07 12:32:37.127409] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:30:13.626 [2024-06-07 12:32:37.127504] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002ae0 00:30:13.626 [2024-06-07 12:32:37.127773] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:30:13.626 [2024-06-07 12:32:37.127810] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:30:13.626 [2024-06-07 12:32:37.128043] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:13.626 pt4 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.626 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:13.885 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:13.885 "name": "raid_bdev1", 00:30:13.885 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:13.885 "strip_size_kb": 64, 00:30:13.885 "state": "online", 00:30:13.885 "raid_level": "raid0", 00:30:13.885 "superblock": true, 00:30:13.885 "num_base_bdevs": 4, 00:30:13.885 "num_base_bdevs_discovered": 4, 00:30:13.885 "num_base_bdevs_operational": 4, 00:30:13.885 "base_bdevs_list": [ 00:30:13.885 { 00:30:13.885 "name": "pt1", 00:30:13.885 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:13.885 "is_configured": true, 00:30:13.885 "data_offset": 2048, 00:30:13.885 "data_size": 63488 00:30:13.885 }, 00:30:13.885 { 00:30:13.885 "name": "pt2", 00:30:13.885 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:13.885 "is_configured": true, 00:30:13.885 "data_offset": 2048, 00:30:13.885 "data_size": 63488 00:30:13.885 }, 00:30:13.885 { 00:30:13.885 "name": "pt3", 00:30:13.885 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:13.885 "is_configured": true, 00:30:13.885 "data_offset": 2048, 00:30:13.885 "data_size": 63488 00:30:13.885 }, 00:30:13.885 { 00:30:13.885 "name": "pt4", 00:30:13.885 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:13.885 "is_configured": true, 00:30:13.885 "data_offset": 2048, 00:30:13.885 "data_size": 63488 00:30:13.885 } 00:30:13.885 ] 00:30:13.885 }' 00:30:13.885 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:13.885 12:32:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:14.450 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:30:14.450 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:14.450 12:32:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:14.450 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:14.450 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:14.450 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:30:14.450 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:14.450 12:32:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:14.708 [2024-06-07 12:32:38.194053] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:14.708 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:14.708 "name": "raid_bdev1", 00:30:14.708 "aliases": [ 00:30:14.708 "677dfde6-5b45-4cde-b3cb-25c71c132486" 00:30:14.708 ], 00:30:14.708 "product_name": "Raid Volume", 00:30:14.708 "block_size": 512, 00:30:14.708 "num_blocks": 253952, 00:30:14.708 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:14.708 "assigned_rate_limits": { 00:30:14.708 "rw_ios_per_sec": 0, 00:30:14.708 "rw_mbytes_per_sec": 0, 00:30:14.708 "r_mbytes_per_sec": 0, 00:30:14.708 "w_mbytes_per_sec": 0 00:30:14.708 }, 00:30:14.708 "claimed": false, 00:30:14.708 "zoned": false, 00:30:14.708 "supported_io_types": { 00:30:14.708 "read": true, 00:30:14.708 "write": true, 00:30:14.708 "unmap": true, 00:30:14.708 "write_zeroes": true, 00:30:14.708 "flush": true, 00:30:14.708 "reset": true, 00:30:14.708 "compare": false, 00:30:14.708 "compare_and_write": false, 00:30:14.708 "abort": false, 00:30:14.708 "nvme_admin": false, 00:30:14.708 "nvme_io": false 00:30:14.708 }, 00:30:14.708 "memory_domains": [ 00:30:14.708 { 00:30:14.708 "dma_device_id": "system", 00:30:14.708 "dma_device_type": 1 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:14.708 "dma_device_type": 2 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "dma_device_id": "system", 00:30:14.708 "dma_device_type": 1 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:14.708 "dma_device_type": 2 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "dma_device_id": "system", 00:30:14.708 "dma_device_type": 1 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:14.708 "dma_device_type": 2 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "dma_device_id": "system", 00:30:14.708 "dma_device_type": 1 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:14.708 "dma_device_type": 2 00:30:14.708 } 00:30:14.708 ], 00:30:14.708 "driver_specific": { 00:30:14.708 "raid": { 00:30:14.708 "uuid": "677dfde6-5b45-4cde-b3cb-25c71c132486", 00:30:14.708 "strip_size_kb": 64, 00:30:14.708 "state": "online", 00:30:14.708 "raid_level": "raid0", 00:30:14.708 "superblock": true, 00:30:14.708 "num_base_bdevs": 4, 00:30:14.708 "num_base_bdevs_discovered": 4, 00:30:14.708 "num_base_bdevs_operational": 4, 00:30:14.708 "base_bdevs_list": [ 00:30:14.708 { 00:30:14.708 "name": "pt1", 00:30:14.708 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:14.708 "is_configured": true, 00:30:14.708 "data_offset": 2048, 00:30:14.708 "data_size": 63488 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "name": "pt2", 00:30:14.708 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:14.708 "is_configured": true, 
00:30:14.708 "data_offset": 2048, 00:30:14.708 "data_size": 63488 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "name": "pt3", 00:30:14.708 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:14.708 "is_configured": true, 00:30:14.708 "data_offset": 2048, 00:30:14.708 "data_size": 63488 00:30:14.708 }, 00:30:14.708 { 00:30:14.708 "name": "pt4", 00:30:14.708 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:14.708 "is_configured": true, 00:30:14.708 "data_offset": 2048, 00:30:14.708 "data_size": 63488 00:30:14.708 } 00:30:14.708 ] 00:30:14.708 } 00:30:14.708 } 00:30:14.708 }' 00:30:14.708 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:14.708 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:14.708 pt2 00:30:14.708 pt3 00:30:14.708 pt4' 00:30:14.708 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:14.708 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:14.708 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:14.966 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:14.966 "name": "pt1", 00:30:14.966 "aliases": [ 00:30:14.966 "00000000-0000-0000-0000-000000000001" 00:30:14.966 ], 00:30:14.966 "product_name": "passthru", 00:30:14.966 "block_size": 512, 00:30:14.966 "num_blocks": 65536, 00:30:14.966 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:14.966 "assigned_rate_limits": { 00:30:14.966 "rw_ios_per_sec": 0, 00:30:14.966 "rw_mbytes_per_sec": 0, 00:30:14.966 "r_mbytes_per_sec": 0, 00:30:14.966 "w_mbytes_per_sec": 0 00:30:14.966 }, 00:30:14.966 "claimed": true, 00:30:14.966 "claim_type": "exclusive_write", 00:30:14.966 "zoned": false, 00:30:14.966 "supported_io_types": { 00:30:14.966 "read": true, 00:30:14.966 "write": true, 00:30:14.966 "unmap": true, 00:30:14.966 "write_zeroes": true, 00:30:14.966 "flush": true, 00:30:14.966 "reset": true, 00:30:14.966 "compare": false, 00:30:14.966 "compare_and_write": false, 00:30:14.966 "abort": true, 00:30:14.966 "nvme_admin": false, 00:30:14.966 "nvme_io": false 00:30:14.966 }, 00:30:14.966 "memory_domains": [ 00:30:14.966 { 00:30:14.966 "dma_device_id": "system", 00:30:14.966 "dma_device_type": 1 00:30:14.966 }, 00:30:14.966 { 00:30:14.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:14.966 "dma_device_type": 2 00:30:14.966 } 00:30:14.966 ], 00:30:14.966 "driver_specific": { 00:30:14.966 "passthru": { 00:30:14.966 "name": "pt1", 00:30:14.966 "base_bdev_name": "malloc1" 00:30:14.966 } 00:30:14.966 } 00:30:14.966 }' 00:30:14.966 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:14.966 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:15.223 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:15.223 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:15.223 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:15.223 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:15.223 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:15.224 12:32:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:15.224 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:15.224 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:15.224 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:15.481 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:15.481 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:15.481 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:15.481 12:32:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:15.481 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:15.481 "name": "pt2", 00:30:15.481 "aliases": [ 00:30:15.481 "00000000-0000-0000-0000-000000000002" 00:30:15.481 ], 00:30:15.481 "product_name": "passthru", 00:30:15.481 "block_size": 512, 00:30:15.481 "num_blocks": 65536, 00:30:15.481 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:15.481 "assigned_rate_limits": { 00:30:15.481 "rw_ios_per_sec": 0, 00:30:15.481 "rw_mbytes_per_sec": 0, 00:30:15.481 "r_mbytes_per_sec": 0, 00:30:15.481 "w_mbytes_per_sec": 0 00:30:15.481 }, 00:30:15.481 "claimed": true, 00:30:15.481 "claim_type": "exclusive_write", 00:30:15.481 "zoned": false, 00:30:15.481 "supported_io_types": { 00:30:15.481 "read": true, 00:30:15.481 "write": true, 00:30:15.481 "unmap": true, 00:30:15.481 "write_zeroes": true, 00:30:15.481 "flush": true, 00:30:15.481 "reset": true, 00:30:15.481 "compare": false, 00:30:15.481 "compare_and_write": false, 00:30:15.481 "abort": true, 00:30:15.481 "nvme_admin": false, 00:30:15.481 "nvme_io": false 00:30:15.481 }, 00:30:15.481 "memory_domains": [ 00:30:15.481 { 00:30:15.481 "dma_device_id": "system", 00:30:15.481 "dma_device_type": 1 00:30:15.481 }, 00:30:15.481 { 00:30:15.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:15.481 "dma_device_type": 2 00:30:15.481 } 00:30:15.481 ], 00:30:15.481 "driver_specific": { 00:30:15.481 "passthru": { 00:30:15.481 "name": "pt2", 00:30:15.481 "base_bdev_name": "malloc2" 00:30:15.481 } 00:30:15.481 } 00:30:15.481 }' 00:30:15.481 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:15.738 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:15.995 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:30:15.995 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:15.995 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:30:15.995 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:16.272 "name": "pt3", 00:30:16.272 "aliases": [ 00:30:16.272 "00000000-0000-0000-0000-000000000003" 00:30:16.272 ], 00:30:16.272 "product_name": "passthru", 00:30:16.272 "block_size": 512, 00:30:16.272 "num_blocks": 65536, 00:30:16.272 "uuid": "00000000-0000-0000-0000-000000000003", 00:30:16.272 "assigned_rate_limits": { 00:30:16.272 "rw_ios_per_sec": 0, 00:30:16.272 "rw_mbytes_per_sec": 0, 00:30:16.272 "r_mbytes_per_sec": 0, 00:30:16.272 "w_mbytes_per_sec": 0 00:30:16.272 }, 00:30:16.272 "claimed": true, 00:30:16.272 "claim_type": "exclusive_write", 00:30:16.272 "zoned": false, 00:30:16.272 "supported_io_types": { 00:30:16.272 "read": true, 00:30:16.272 "write": true, 00:30:16.272 "unmap": true, 00:30:16.272 "write_zeroes": true, 00:30:16.272 "flush": true, 00:30:16.272 "reset": true, 00:30:16.272 "compare": false, 00:30:16.272 "compare_and_write": false, 00:30:16.272 "abort": true, 00:30:16.272 "nvme_admin": false, 00:30:16.272 "nvme_io": false 00:30:16.272 }, 00:30:16.272 "memory_domains": [ 00:30:16.272 { 00:30:16.272 "dma_device_id": "system", 00:30:16.272 "dma_device_type": 1 00:30:16.272 }, 00:30:16.272 { 00:30:16.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:16.272 "dma_device_type": 2 00:30:16.272 } 00:30:16.272 ], 00:30:16.272 "driver_specific": { 00:30:16.272 "passthru": { 00:30:16.272 "name": "pt3", 00:30:16.272 "base_bdev_name": "malloc3" 00:30:16.272 } 00:30:16.272 } 00:30:16.272 }' 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:16.272 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:16.530 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:16.530 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:16.530 12:32:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:16.530 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:16.530 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:16.530 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:30:16.530 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:16.787 12:32:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:16.787 "name": "pt4", 00:30:16.787 "aliases": [ 00:30:16.787 "00000000-0000-0000-0000-000000000004" 00:30:16.787 ], 00:30:16.787 "product_name": "passthru", 00:30:16.787 "block_size": 512, 00:30:16.787 "num_blocks": 65536, 00:30:16.787 "uuid": "00000000-0000-0000-0000-000000000004", 00:30:16.787 "assigned_rate_limits": { 00:30:16.787 "rw_ios_per_sec": 0, 00:30:16.787 "rw_mbytes_per_sec": 0, 00:30:16.787 "r_mbytes_per_sec": 0, 00:30:16.787 "w_mbytes_per_sec": 0 00:30:16.787 }, 00:30:16.787 "claimed": true, 00:30:16.787 "claim_type": "exclusive_write", 00:30:16.787 "zoned": false, 00:30:16.787 "supported_io_types": { 00:30:16.787 "read": true, 00:30:16.787 "write": true, 00:30:16.787 "unmap": true, 00:30:16.787 "write_zeroes": true, 00:30:16.787 "flush": true, 00:30:16.787 "reset": true, 00:30:16.787 "compare": false, 00:30:16.787 "compare_and_write": false, 00:30:16.787 "abort": true, 00:30:16.787 "nvme_admin": false, 00:30:16.787 "nvme_io": false 00:30:16.787 }, 00:30:16.787 "memory_domains": [ 00:30:16.787 { 00:30:16.787 "dma_device_id": "system", 00:30:16.787 "dma_device_type": 1 00:30:16.787 }, 00:30:16.787 { 00:30:16.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:16.787 "dma_device_type": 2 00:30:16.787 } 00:30:16.787 ], 00:30:16.787 "driver_specific": { 00:30:16.787 "passthru": { 00:30:16.787 "name": "pt4", 00:30:16.787 "base_bdev_name": "malloc4" 00:30:16.787 } 00:30:16.787 } 00:30:16.787 }' 00:30:16.787 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.788 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:16.788 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:16.788 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:16.788 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:17.045 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:30:17.304 [2024-06-07 12:32:40.886408] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 677dfde6-5b45-4cde-b3cb-25c71c132486 '!=' 677dfde6-5b45-4cde-b3cb-25c71c132486 ']' 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:17.304 12:32:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 213744 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 213744 ']' 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 213744 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 213744 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 213744' 00:30:17.304 killing process with pid 213744 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 213744 00:30:17.304 [2024-06-07 12:32:40.944957] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:17.304 12:32:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 213744 00:30:17.304 [2024-06-07 12:32:40.945259] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:17.304 [2024-06-07 12:32:40.945420] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:17.304 [2024-06-07 12:32:40.945510] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:30:17.562 [2024-06-07 12:32:41.034346] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:17.820 12:32:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:30:17.820 00:30:17.820 real 0m16.417s 00:30:17.820 user 0m29.772s 00:30:17.820 sys 0m2.797s 00:30:17.820 12:32:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:17.820 12:32:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:30:17.820 ************************************ 00:30:17.820 END TEST raid_superblock_test 00:30:17.820 ************************************ 00:30:17.820 12:32:41 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:30:17.820 12:32:41 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:30:17.820 12:32:41 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:17.820 12:32:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:17.820 ************************************ 00:30:17.820 START TEST raid_read_error_test 00:30:17.820 ************************************ 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 read 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= 
num_base_bdevs )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev4 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:30:17.820 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.y27xxU92Vw 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=214280 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 214280 /var/tmp/spdk-raid.sock 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 214280 ']' 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:30:18.079 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:18.079 12:32:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:18.079 [2024-06-07 12:32:41.502418] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:30:18.079 [2024-06-07 12:32:41.503585] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid214280 ] 00:30:18.079 [2024-06-07 12:32:41.652070] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:18.337 [2024-06-07 12:32:41.744445] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:18.337 [2024-06-07 12:32:41.825951] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:18.904 12:32:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:18.904 12:32:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:30:18.904 12:32:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:18.904 12:32:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:30:19.162 BaseBdev1_malloc 00:30:19.162 12:32:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:30:19.421 true 00:30:19.421 12:32:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:30:19.679 [2024-06-07 12:32:43.223890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:30:19.679 [2024-06-07 12:32:43.224316] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:19.679 [2024-06-07 12:32:43.224567] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:30:19.679 [2024-06-07 12:32:43.224753] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:19.679 [2024-06-07 12:32:43.227618] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:19.679 [2024-06-07 12:32:43.227835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:19.679 BaseBdev1 00:30:19.679 12:32:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:19.679 12:32:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:30:19.936 BaseBdev2_malloc 00:30:19.936 12:32:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:30:20.194 true 00:30:20.194 12:32:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:30:20.455 
[2024-06-07 12:32:43.928260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:30:20.455 [2024-06-07 12:32:43.928554] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:20.455 [2024-06-07 12:32:43.928661] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:30:20.455 [2024-06-07 12:32:43.928938] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:20.455 [2024-06-07 12:32:43.931228] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:20.455 [2024-06-07 12:32:43.931403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:20.455 BaseBdev2 00:30:20.455 12:32:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:20.455 12:32:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:30:20.725 BaseBdev3_malloc 00:30:20.725 12:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:30:20.983 true 00:30:20.983 12:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:30:21.242 [2024-06-07 12:32:44.690904] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:30:21.242 [2024-06-07 12:32:44.691247] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:21.242 [2024-06-07 12:32:44.691335] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:30:21.242 [2024-06-07 12:32:44.691636] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:21.242 [2024-06-07 12:32:44.694144] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:21.242 [2024-06-07 12:32:44.694356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:30:21.242 BaseBdev3 00:30:21.242 12:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:21.242 12:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:30:21.501 BaseBdev4_malloc 00:30:21.501 12:32:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:30:21.762 true 00:30:21.762 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:30:22.020 [2024-06-07 12:32:45.446786] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:30:22.020 [2024-06-07 12:32:45.447127] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:22.020 [2024-06-07 12:32:45.447207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:30:22.020 [2024-06-07 12:32:45.447445] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:22.020 
[2024-06-07 12:32:45.449834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:22.020 [2024-06-07 12:32:45.450023] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:30:22.020 BaseBdev4 00:30:22.020 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:30:22.279 [2024-06-07 12:32:45.670937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:22.279 [2024-06-07 12:32:45.673330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:22.279 [2024-06-07 12:32:45.673543] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:22.279 [2024-06-07 12:32:45.673631] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:30:22.279 [2024-06-07 12:32:45.673886] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009080 00:30:22.279 [2024-06-07 12:32:45.673930] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:30:22.279 [2024-06-07 12:32:45.674168] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:30:22.279 [2024-06-07 12:32:45.674610] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009080 00:30:22.279 [2024-06-07 12:32:45.674723] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009080 00:30:22.279 [2024-06-07 12:32:45.675005] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:22.279 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:22.538 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:22.538 "name": "raid_bdev1", 00:30:22.538 "uuid": "c38d860f-d8bf-475b-9191-0b846e43495f", 00:30:22.538 "strip_size_kb": 64, 00:30:22.538 "state": "online", 00:30:22.538 "raid_level": "raid0", 00:30:22.538 "superblock": true, 00:30:22.538 "num_base_bdevs": 
4, 00:30:22.538 "num_base_bdevs_discovered": 4, 00:30:22.538 "num_base_bdevs_operational": 4, 00:30:22.538 "base_bdevs_list": [ 00:30:22.538 { 00:30:22.538 "name": "BaseBdev1", 00:30:22.538 "uuid": "5d4c4bc4-903c-5617-ba0e-7c129606cd0a", 00:30:22.538 "is_configured": true, 00:30:22.538 "data_offset": 2048, 00:30:22.538 "data_size": 63488 00:30:22.538 }, 00:30:22.538 { 00:30:22.538 "name": "BaseBdev2", 00:30:22.538 "uuid": "0eb755d4-ad6e-5714-8093-35f91953c2b6", 00:30:22.538 "is_configured": true, 00:30:22.538 "data_offset": 2048, 00:30:22.538 "data_size": 63488 00:30:22.538 }, 00:30:22.538 { 00:30:22.538 "name": "BaseBdev3", 00:30:22.538 "uuid": "01f0b4b4-610b-5863-9b8b-db6534a4c7eb", 00:30:22.538 "is_configured": true, 00:30:22.538 "data_offset": 2048, 00:30:22.538 "data_size": 63488 00:30:22.538 }, 00:30:22.538 { 00:30:22.538 "name": "BaseBdev4", 00:30:22.538 "uuid": "f7eebc27-b7ff-547a-ac61-746286961dcc", 00:30:22.538 "is_configured": true, 00:30:22.538 "data_offset": 2048, 00:30:22.538 "data_size": 63488 00:30:22.538 } 00:30:22.538 ] 00:30:22.538 }' 00:30:22.538 12:32:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:22.538 12:32:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:23.105 12:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:30:23.105 12:32:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:30:23.105 [2024-06-07 12:32:46.735451] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:30:24.038 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:24.297 12:32:47 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:24.863 12:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:24.863 "name": "raid_bdev1", 00:30:24.863 "uuid": "c38d860f-d8bf-475b-9191-0b846e43495f", 00:30:24.863 "strip_size_kb": 64, 00:30:24.863 "state": "online", 00:30:24.863 "raid_level": "raid0", 00:30:24.863 "superblock": true, 00:30:24.863 "num_base_bdevs": 4, 00:30:24.863 "num_base_bdevs_discovered": 4, 00:30:24.863 "num_base_bdevs_operational": 4, 00:30:24.863 "base_bdevs_list": [ 00:30:24.863 { 00:30:24.863 "name": "BaseBdev1", 00:30:24.863 "uuid": "5d4c4bc4-903c-5617-ba0e-7c129606cd0a", 00:30:24.863 "is_configured": true, 00:30:24.863 "data_offset": 2048, 00:30:24.863 "data_size": 63488 00:30:24.863 }, 00:30:24.863 { 00:30:24.863 "name": "BaseBdev2", 00:30:24.864 "uuid": "0eb755d4-ad6e-5714-8093-35f91953c2b6", 00:30:24.864 "is_configured": true, 00:30:24.864 "data_offset": 2048, 00:30:24.864 "data_size": 63488 00:30:24.864 }, 00:30:24.864 { 00:30:24.864 "name": "BaseBdev3", 00:30:24.864 "uuid": "01f0b4b4-610b-5863-9b8b-db6534a4c7eb", 00:30:24.864 "is_configured": true, 00:30:24.864 "data_offset": 2048, 00:30:24.864 "data_size": 63488 00:30:24.864 }, 00:30:24.864 { 00:30:24.864 "name": "BaseBdev4", 00:30:24.864 "uuid": "f7eebc27-b7ff-547a-ac61-746286961dcc", 00:30:24.864 "is_configured": true, 00:30:24.864 "data_offset": 2048, 00:30:24.864 "data_size": 63488 00:30:24.864 } 00:30:24.864 ] 00:30:24.864 }' 00:30:24.864 12:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:24.864 12:32:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:25.428 12:32:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:25.429 [2024-06-07 12:32:49.042974] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:25.429 [2024-06-07 12:32:49.043271] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:25.429 [2024-06-07 12:32:49.044663] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:25.429 [2024-06-07 12:32:49.044856] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:25.429 [2024-06-07 12:32:49.045002] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:25.429 [2024-06-07 12:32:49.045094] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009080 name raid_bdev1, state offline 00:30:25.429 0 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 214280 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 214280 ']' 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 214280 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 214280 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = 
sudo ']' 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 214280' 00:30:25.687 killing process with pid 214280 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 214280 00:30:25.687 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 214280 00:30:25.687 [2024-06-07 12:32:49.102944] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:25.687 [2024-06-07 12:32:49.171963] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.y27xxU92Vw 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.43 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.43 != \0\.\0\0 ]] 00:30:25.946 00:30:25.946 real 0m8.120s 00:30:25.946 user 0m12.905s 00:30:25.946 sys 0m1.281s 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:25.946 12:32:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:25.946 ************************************ 00:30:25.946 END TEST raid_read_error_test 00:30:25.946 ************************************ 00:30:26.205 12:32:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:30:26.205 12:32:49 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:30:26.205 12:32:49 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:26.205 12:32:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:26.205 ************************************ 00:30:26.205 START TEST raid_write_error_test 00:30:26.205 ************************************ 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 write 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:26.205 12:32:49 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev4 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.NTqqbAo8iH 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=214483 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 214483 /var/tmp/spdk-raid.sock 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 214483 ']' 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:26.205 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:26.205 12:32:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:26.205 [2024-06-07 12:32:49.686499] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:30:26.205 [2024-06-07 12:32:49.687423] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid214483 ] 00:30:26.205 [2024-06-07 12:32:49.831096] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:26.464 [2024-06-07 12:32:49.926478] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:26.464 [2024-06-07 12:32:50.013024] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:27.031 12:32:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:27.031 12:32:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:30:27.031 12:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:27.031 12:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:30:27.289 BaseBdev1_malloc 00:30:27.289 12:32:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:30:27.547 true 00:30:27.548 12:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:30:27.806 [2024-06-07 12:32:51.443020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:30:27.806 [2024-06-07 12:32:51.443437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:27.806 [2024-06-07 12:32:51.443574] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:30:27.806 [2024-06-07 12:32:51.443992] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:27.806 [2024-06-07 12:32:51.446849] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:27.806 [2024-06-07 12:32:51.447052] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:27.806 BaseBdev1 00:30:28.064 12:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:28.064 12:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:30:28.322 BaseBdev2_malloc 00:30:28.322 12:32:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:30:28.580 true 00:30:28.580 12:32:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:30:28.838 [2024-06-07 12:32:52.390948] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:30:28.838 [2024-06-07 12:32:52.391240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:28.838 [2024-06-07 12:32:52.391340] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:30:28.838 [2024-06-07 12:32:52.391581] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:28.838 [2024-06-07 12:32:52.394036] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:28.838 [2024-06-07 12:32:52.394218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:28.838 BaseBdev2 00:30:28.838 12:32:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:28.838 12:32:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:30:29.097 BaseBdev3_malloc 00:30:29.355 12:32:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:30:29.355 true 00:30:29.355 12:32:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:30:29.613 [2024-06-07 12:32:53.225144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:30:29.613 [2024-06-07 12:32:53.225585] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:29.613 [2024-06-07 12:32:53.225681] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:30:29.613 [2024-06-07 12:32:53.225864] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:29.613 [2024-06-07 12:32:53.228392] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:29.613 [2024-06-07 12:32:53.228613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:30:29.613 BaseBdev3 00:30:29.613 12:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:30:29.613 12:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:30:29.871 BaseBdev4_malloc 00:30:29.871 12:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:30:30.130 true 00:30:30.130 12:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:30:30.388 [2024-06-07 12:32:53.892701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:30:30.388 [2024-06-07 12:32:53.893023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:30.388 [2024-06-07 12:32:53.893117] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:30:30.388 [2024-06-07 12:32:53.893371] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:30.388 [2024-06-07 12:32:53.895896] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:30.388 [2024-06-07 12:32:53.896095] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:30:30.388 BaseBdev4 00:30:30.388 12:32:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:30:30.647 [2024-06-07 12:32:54.148934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:30.647 [2024-06-07 12:32:54.151151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:30.647 [2024-06-07 12:32:54.151367] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:30.647 [2024-06-07 12:32:54.151450] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:30:30.647 [2024-06-07 12:32:54.151713] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009080 00:30:30.647 [2024-06-07 12:32:54.151813] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:30:30.647 [2024-06-07 12:32:54.152006] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:30:30.647 [2024-06-07 12:32:54.152422] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009080 00:30:30.647 [2024-06-07 12:32:54.152534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009080 00:30:30.647 [2024-06-07 12:32:54.152792] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:30.647 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:30.907 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:30.907 "name": "raid_bdev1", 00:30:30.907 "uuid": "86f2d0fa-cc43-472b-b2b1-a638ee4fd29b", 00:30:30.907 "strip_size_kb": 64, 00:30:30.907 "state": "online", 00:30:30.907 "raid_level": "raid0", 00:30:30.907 "superblock": true, 00:30:30.907 "num_base_bdevs": 4, 00:30:30.907 "num_base_bdevs_discovered": 4, 00:30:30.907 "num_base_bdevs_operational": 4, 00:30:30.907 "base_bdevs_list": [ 00:30:30.907 { 00:30:30.907 "name": "BaseBdev1", 00:30:30.907 "uuid": "a9d25998-de58-5026-8388-91f893c5b2be", 00:30:30.907 "is_configured": true, 00:30:30.907 "data_offset": 2048, 00:30:30.907 "data_size": 63488 00:30:30.907 }, 00:30:30.907 { 
00:30:30.907 "name": "BaseBdev2", 00:30:30.907 "uuid": "edf756c9-1982-5339-954d-fa7077ccb833", 00:30:30.907 "is_configured": true, 00:30:30.907 "data_offset": 2048, 00:30:30.907 "data_size": 63488 00:30:30.907 }, 00:30:30.907 { 00:30:30.907 "name": "BaseBdev3", 00:30:30.907 "uuid": "1804f156-6140-5e1d-a4db-381556439ccc", 00:30:30.907 "is_configured": true, 00:30:30.907 "data_offset": 2048, 00:30:30.907 "data_size": 63488 00:30:30.907 }, 00:30:30.907 { 00:30:30.907 "name": "BaseBdev4", 00:30:30.907 "uuid": "0098811e-1bd4-53ad-94a8-96a699efa22b", 00:30:30.907 "is_configured": true, 00:30:30.907 "data_offset": 2048, 00:30:30.907 "data_size": 63488 00:30:30.907 } 00:30:30.907 ] 00:30:30.907 }' 00:30:30.907 12:32:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:30.907 12:32:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:31.845 12:32:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:30:31.845 12:32:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:30:31.845 [2024-06-07 12:32:55.257364] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:32.783 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:33.042 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:33.042 "name": "raid_bdev1", 00:30:33.042 "uuid": "86f2d0fa-cc43-472b-b2b1-a638ee4fd29b", 00:30:33.042 "strip_size_kb": 64, 00:30:33.042 "state": "online", 00:30:33.042 
"raid_level": "raid0", 00:30:33.042 "superblock": true, 00:30:33.042 "num_base_bdevs": 4, 00:30:33.042 "num_base_bdevs_discovered": 4, 00:30:33.042 "num_base_bdevs_operational": 4, 00:30:33.042 "base_bdevs_list": [ 00:30:33.042 { 00:30:33.042 "name": "BaseBdev1", 00:30:33.042 "uuid": "a9d25998-de58-5026-8388-91f893c5b2be", 00:30:33.042 "is_configured": true, 00:30:33.042 "data_offset": 2048, 00:30:33.042 "data_size": 63488 00:30:33.042 }, 00:30:33.042 { 00:30:33.042 "name": "BaseBdev2", 00:30:33.042 "uuid": "edf756c9-1982-5339-954d-fa7077ccb833", 00:30:33.042 "is_configured": true, 00:30:33.042 "data_offset": 2048, 00:30:33.042 "data_size": 63488 00:30:33.042 }, 00:30:33.042 { 00:30:33.042 "name": "BaseBdev3", 00:30:33.042 "uuid": "1804f156-6140-5e1d-a4db-381556439ccc", 00:30:33.042 "is_configured": true, 00:30:33.042 "data_offset": 2048, 00:30:33.042 "data_size": 63488 00:30:33.042 }, 00:30:33.042 { 00:30:33.042 "name": "BaseBdev4", 00:30:33.043 "uuid": "0098811e-1bd4-53ad-94a8-96a699efa22b", 00:30:33.043 "is_configured": true, 00:30:33.043 "data_offset": 2048, 00:30:33.043 "data_size": 63488 00:30:33.043 } 00:30:33.043 ] 00:30:33.043 }' 00:30:33.043 12:32:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:33.043 12:32:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:33.979 12:32:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:33.979 [2024-06-07 12:32:57.588852] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:33.979 [2024-06-07 12:32:57.589123] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:33.979 [2024-06-07 12:32:57.590607] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:33.979 [2024-06-07 12:32:57.590767] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:33.979 [2024-06-07 12:32:57.590834] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:33.979 [2024-06-07 12:32:57.590914] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009080 name raid_bdev1, state offline 00:30:33.979 0 00:30:33.979 12:32:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 214483 00:30:33.979 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 214483 ']' 00:30:33.979 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 214483 00:30:33.979 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:30:34.238 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:34.238 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 214483 00:30:34.238 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:34.238 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:34.238 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 214483' 00:30:34.238 killing process with pid 214483 00:30:34.238 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 214483 00:30:34.238 [2024-06-07 12:32:57.651821] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:34.238 12:32:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 214483 00:30:34.238 [2024-06-07 12:32:57.721042] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.NTqqbAo8iH 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.43 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.43 != \0\.\0\0 ]] 00:30:34.497 00:30:34.497 real 0m8.468s 00:30:34.497 user 0m13.547s 00:30:34.497 sys 0m1.390s 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:34.497 12:32:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:30:34.497 ************************************ 00:30:34.497 END TEST raid_write_error_test 00:30:34.497 ************************************ 00:30:34.814 12:32:58 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:30:34.815 12:32:58 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:30:34.815 12:32:58 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:30:34.815 12:32:58 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:34.815 12:32:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:34.815 ************************************ 00:30:34.815 START TEST raid_state_function_test 00:30:34.815 ************************************ 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 false 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev4 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=214689 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 214689' 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:30:34.815 Process raid pid: 214689 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 214689 /var/tmp/spdk-raid.sock 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 214689 ']' 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:34.815 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:34.815 12:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:34.815 [2024-06-07 12:32:58.218275] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:30:34.815 [2024-06-07 12:32:58.219339] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:34.815 [2024-06-07 12:32:58.370107] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:35.073 [2024-06-07 12:32:58.470885] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:35.073 [2024-06-07 12:32:58.556402] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:30:36.008 [2024-06-07 12:32:59.505423] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:36.008 [2024-06-07 12:32:59.506520] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:36.008 [2024-06-07 12:32:59.506673] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:36.008 [2024-06-07 12:32:59.506808] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:36.008 [2024-06-07 12:32:59.507008] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:30:36.008 [2024-06-07 12:32:59.507173] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:30:36.008 [2024-06-07 12:32:59.507393] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:30:36.008 [2024-06-07 12:32:59.507565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:36.008 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:30:36.266 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:36.266 "name": "Existed_Raid", 00:30:36.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:36.266 "strip_size_kb": 64, 00:30:36.266 "state": "configuring", 00:30:36.266 "raid_level": "concat", 00:30:36.266 "superblock": false, 00:30:36.266 "num_base_bdevs": 4, 00:30:36.266 "num_base_bdevs_discovered": 0, 00:30:36.266 "num_base_bdevs_operational": 4, 00:30:36.266 "base_bdevs_list": [ 00:30:36.266 { 00:30:36.266 "name": "BaseBdev1", 00:30:36.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:36.266 "is_configured": false, 00:30:36.266 "data_offset": 0, 00:30:36.266 "data_size": 0 00:30:36.266 }, 00:30:36.266 { 00:30:36.266 "name": "BaseBdev2", 00:30:36.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:36.266 "is_configured": false, 00:30:36.266 "data_offset": 0, 00:30:36.266 "data_size": 0 00:30:36.266 }, 00:30:36.266 { 00:30:36.266 "name": "BaseBdev3", 00:30:36.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:36.266 "is_configured": false, 00:30:36.266 "data_offset": 0, 00:30:36.266 "data_size": 0 00:30:36.266 }, 00:30:36.266 { 00:30:36.266 "name": "BaseBdev4", 00:30:36.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:36.266 "is_configured": false, 00:30:36.266 "data_offset": 0, 00:30:36.266 "data_size": 0 00:30:36.266 } 00:30:36.266 ] 00:30:36.266 }' 00:30:36.266 12:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:36.266 12:32:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:37.202 12:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:37.202 [2024-06-07 12:33:00.801390] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:37.202 [2024-06-07 12:33:00.801697] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:30:37.202 12:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:30:37.461 [2024-06-07 12:33:01.021490] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:37.461 [2024-06-07 12:33:01.022158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:37.461 [2024-06-07 12:33:01.022337] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:37.461 [2024-06-07 12:33:01.022505] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:37.461 [2024-06-07 12:33:01.022620] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:30:37.461 [2024-06-07 12:33:01.022755] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:30:37.461 [2024-06-07 12:33:01.022900] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:30:37.461 [2024-06-07 12:33:01.023041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:30:37.461 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:30:37.719 [2024-06-07 12:33:01.337478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:37.719 BaseBdev1 00:30:37.719 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:30:37.719 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:30:37.719 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:37.719 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:37.719 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:37.719 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:37.719 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:37.977 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:38.236 [ 00:30:38.236 { 00:30:38.236 "name": "BaseBdev1", 00:30:38.236 "aliases": [ 00:30:38.236 "a1a33dd5-a8d9-493d-868a-f7578d01eba5" 00:30:38.236 ], 00:30:38.236 "product_name": "Malloc disk", 00:30:38.236 "block_size": 512, 00:30:38.236 "num_blocks": 65536, 00:30:38.236 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:38.236 "assigned_rate_limits": { 00:30:38.236 "rw_ios_per_sec": 0, 00:30:38.236 "rw_mbytes_per_sec": 0, 00:30:38.236 "r_mbytes_per_sec": 0, 00:30:38.236 "w_mbytes_per_sec": 0 00:30:38.236 }, 00:30:38.236 "claimed": true, 00:30:38.236 "claim_type": "exclusive_write", 00:30:38.236 "zoned": false, 00:30:38.236 "supported_io_types": { 00:30:38.236 "read": true, 00:30:38.236 "write": true, 00:30:38.236 "unmap": true, 00:30:38.236 "write_zeroes": true, 00:30:38.236 "flush": true, 00:30:38.236 "reset": true, 00:30:38.236 "compare": false, 00:30:38.236 "compare_and_write": false, 00:30:38.236 "abort": true, 00:30:38.236 "nvme_admin": false, 00:30:38.236 "nvme_io": false 00:30:38.236 }, 00:30:38.236 "memory_domains": [ 00:30:38.236 { 00:30:38.236 "dma_device_id": "system", 00:30:38.236 "dma_device_type": 1 00:30:38.236 }, 00:30:38.236 { 00:30:38.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:38.236 "dma_device_type": 2 00:30:38.236 } 00:30:38.236 ], 00:30:38.236 "driver_specific": {} 00:30:38.236 } 00:30:38.236 ] 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:38.236 12:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:38.496 12:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:38.496 "name": "Existed_Raid", 00:30:38.496 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:38.496 "strip_size_kb": 64, 00:30:38.496 "state": "configuring", 00:30:38.496 "raid_level": "concat", 00:30:38.496 "superblock": false, 00:30:38.496 "num_base_bdevs": 4, 00:30:38.496 "num_base_bdevs_discovered": 1, 00:30:38.496 "num_base_bdevs_operational": 4, 00:30:38.496 "base_bdevs_list": [ 00:30:38.496 { 00:30:38.496 "name": "BaseBdev1", 00:30:38.496 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:38.496 "is_configured": true, 00:30:38.496 "data_offset": 0, 00:30:38.496 "data_size": 65536 00:30:38.496 }, 00:30:38.496 { 00:30:38.496 "name": "BaseBdev2", 00:30:38.496 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:38.496 "is_configured": false, 00:30:38.496 "data_offset": 0, 00:30:38.496 "data_size": 0 00:30:38.496 }, 00:30:38.496 { 00:30:38.496 "name": "BaseBdev3", 00:30:38.496 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:38.496 "is_configured": false, 00:30:38.496 "data_offset": 0, 00:30:38.496 "data_size": 0 00:30:38.496 }, 00:30:38.496 { 00:30:38.496 "name": "BaseBdev4", 00:30:38.496 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:38.496 "is_configured": false, 00:30:38.496 "data_offset": 0, 00:30:38.496 "data_size": 0 00:30:38.496 } 00:30:38.496 ] 00:30:38.496 }' 00:30:38.496 12:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:38.496 12:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:39.432 12:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:39.432 [2024-06-07 12:33:03.041785] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:39.432 [2024-06-07 12:33:03.042069] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:30:39.432 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:30:39.688 [2024-06-07 12:33:03.265894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:39.688 [2024-06-07 12:33:03.268114] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:39.688 [2024-06-07 12:33:03.268757] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:39.688 [2024-06-07 12:33:03.268894] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:30:39.688 [2024-06-07 
12:33:03.269046] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:30:39.688 [2024-06-07 12:33:03.269241] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:30:39.688 [2024-06-07 12:33:03.269362] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.688 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:40.254 12:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:40.254 "name": "Existed_Raid", 00:30:40.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.254 "strip_size_kb": 64, 00:30:40.254 "state": "configuring", 00:30:40.254 "raid_level": "concat", 00:30:40.254 "superblock": false, 00:30:40.254 "num_base_bdevs": 4, 00:30:40.254 "num_base_bdevs_discovered": 1, 00:30:40.254 "num_base_bdevs_operational": 4, 00:30:40.254 "base_bdevs_list": [ 00:30:40.254 { 00:30:40.254 "name": "BaseBdev1", 00:30:40.254 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:40.254 "is_configured": true, 00:30:40.254 "data_offset": 0, 00:30:40.254 "data_size": 65536 00:30:40.254 }, 00:30:40.254 { 00:30:40.254 "name": "BaseBdev2", 00:30:40.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.254 "is_configured": false, 00:30:40.254 "data_offset": 0, 00:30:40.254 "data_size": 0 00:30:40.254 }, 00:30:40.254 { 00:30:40.254 "name": "BaseBdev3", 00:30:40.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.254 "is_configured": false, 00:30:40.254 "data_offset": 0, 00:30:40.254 "data_size": 0 00:30:40.254 }, 00:30:40.254 { 00:30:40.254 "name": "BaseBdev4", 00:30:40.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.254 "is_configured": false, 00:30:40.254 "data_offset": 0, 00:30:40.254 "data_size": 0 00:30:40.254 } 00:30:40.254 ] 00:30:40.254 }' 00:30:40.254 12:33:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:40.254 12:33:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:30:40.831 [2024-06-07 12:33:04.419838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:40.831 BaseBdev2 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:40.831 12:33:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:41.089 12:33:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:41.654 [ 00:30:41.654 { 00:30:41.654 "name": "BaseBdev2", 00:30:41.654 "aliases": [ 00:30:41.654 "36f0386f-55ab-47d8-a910-0a82f8fe493c" 00:30:41.654 ], 00:30:41.654 "product_name": "Malloc disk", 00:30:41.654 "block_size": 512, 00:30:41.654 "num_blocks": 65536, 00:30:41.654 "uuid": "36f0386f-55ab-47d8-a910-0a82f8fe493c", 00:30:41.654 "assigned_rate_limits": { 00:30:41.654 "rw_ios_per_sec": 0, 00:30:41.654 "rw_mbytes_per_sec": 0, 00:30:41.654 "r_mbytes_per_sec": 0, 00:30:41.654 "w_mbytes_per_sec": 0 00:30:41.654 }, 00:30:41.654 "claimed": true, 00:30:41.654 "claim_type": "exclusive_write", 00:30:41.654 "zoned": false, 00:30:41.654 "supported_io_types": { 00:30:41.654 "read": true, 00:30:41.654 "write": true, 00:30:41.654 "unmap": true, 00:30:41.654 "write_zeroes": true, 00:30:41.654 "flush": true, 00:30:41.654 "reset": true, 00:30:41.654 "compare": false, 00:30:41.654 "compare_and_write": false, 00:30:41.654 "abort": true, 00:30:41.654 "nvme_admin": false, 00:30:41.654 "nvme_io": false 00:30:41.654 }, 00:30:41.654 "memory_domains": [ 00:30:41.654 { 00:30:41.654 "dma_device_id": "system", 00:30:41.654 "dma_device_type": 1 00:30:41.654 }, 00:30:41.654 { 00:30:41.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:41.654 "dma_device_type": 2 00:30:41.654 } 00:30:41.654 ], 00:30:41.654 "driver_specific": {} 00:30:41.654 } 00:30:41.654 ] 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:41.654 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:41.912 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:41.912 "name": "Existed_Raid", 00:30:41.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:41.912 "strip_size_kb": 64, 00:30:41.912 "state": "configuring", 00:30:41.912 "raid_level": "concat", 00:30:41.912 "superblock": false, 00:30:41.912 "num_base_bdevs": 4, 00:30:41.912 "num_base_bdevs_discovered": 2, 00:30:41.912 "num_base_bdevs_operational": 4, 00:30:41.912 "base_bdevs_list": [ 00:30:41.912 { 00:30:41.912 "name": "BaseBdev1", 00:30:41.912 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:41.912 "is_configured": true, 00:30:41.912 "data_offset": 0, 00:30:41.912 "data_size": 65536 00:30:41.912 }, 00:30:41.912 { 00:30:41.912 "name": "BaseBdev2", 00:30:41.912 "uuid": "36f0386f-55ab-47d8-a910-0a82f8fe493c", 00:30:41.912 "is_configured": true, 00:30:41.912 "data_offset": 0, 00:30:41.912 "data_size": 65536 00:30:41.912 }, 00:30:41.912 { 00:30:41.912 "name": "BaseBdev3", 00:30:41.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:41.912 "is_configured": false, 00:30:41.912 "data_offset": 0, 00:30:41.912 "data_size": 0 00:30:41.912 }, 00:30:41.912 { 00:30:41.912 "name": "BaseBdev4", 00:30:41.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:41.912 "is_configured": false, 00:30:41.912 "data_offset": 0, 00:30:41.912 "data_size": 0 00:30:41.912 } 00:30:41.912 ] 00:30:41.912 }' 00:30:41.912 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:41.912 12:33:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:42.480 12:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:30:42.480 [2024-06-07 12:33:06.125771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:42.739 BaseBdev3 00:30:42.739 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:30:42.739 12:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:30:42.739 12:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:42.739 12:33:06 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # local i 00:30:42.739 12:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:42.739 12:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:42.739 12:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:42.998 12:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:30:43.256 [ 00:30:43.256 { 00:30:43.256 "name": "BaseBdev3", 00:30:43.256 "aliases": [ 00:30:43.256 "233647ee-7897-4f9c-a42f-18af7829b9a3" 00:30:43.256 ], 00:30:43.256 "product_name": "Malloc disk", 00:30:43.256 "block_size": 512, 00:30:43.256 "num_blocks": 65536, 00:30:43.256 "uuid": "233647ee-7897-4f9c-a42f-18af7829b9a3", 00:30:43.256 "assigned_rate_limits": { 00:30:43.256 "rw_ios_per_sec": 0, 00:30:43.256 "rw_mbytes_per_sec": 0, 00:30:43.256 "r_mbytes_per_sec": 0, 00:30:43.256 "w_mbytes_per_sec": 0 00:30:43.256 }, 00:30:43.256 "claimed": true, 00:30:43.256 "claim_type": "exclusive_write", 00:30:43.256 "zoned": false, 00:30:43.256 "supported_io_types": { 00:30:43.256 "read": true, 00:30:43.256 "write": true, 00:30:43.256 "unmap": true, 00:30:43.256 "write_zeroes": true, 00:30:43.256 "flush": true, 00:30:43.256 "reset": true, 00:30:43.256 "compare": false, 00:30:43.256 "compare_and_write": false, 00:30:43.256 "abort": true, 00:30:43.256 "nvme_admin": false, 00:30:43.256 "nvme_io": false 00:30:43.256 }, 00:30:43.257 "memory_domains": [ 00:30:43.257 { 00:30:43.257 "dma_device_id": "system", 00:30:43.257 "dma_device_type": 1 00:30:43.257 }, 00:30:43.257 { 00:30:43.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:43.257 "dma_device_type": 2 00:30:43.257 } 00:30:43.257 ], 00:30:43.257 "driver_specific": {} 00:30:43.257 } 00:30:43.257 ] 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:43.257 12:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:43.515 12:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:43.515 "name": "Existed_Raid", 00:30:43.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:43.515 "strip_size_kb": 64, 00:30:43.515 "state": "configuring", 00:30:43.515 "raid_level": "concat", 00:30:43.515 "superblock": false, 00:30:43.515 "num_base_bdevs": 4, 00:30:43.515 "num_base_bdevs_discovered": 3, 00:30:43.515 "num_base_bdevs_operational": 4, 00:30:43.515 "base_bdevs_list": [ 00:30:43.515 { 00:30:43.515 "name": "BaseBdev1", 00:30:43.515 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:43.515 "is_configured": true, 00:30:43.515 "data_offset": 0, 00:30:43.515 "data_size": 65536 00:30:43.515 }, 00:30:43.515 { 00:30:43.515 "name": "BaseBdev2", 00:30:43.515 "uuid": "36f0386f-55ab-47d8-a910-0a82f8fe493c", 00:30:43.515 "is_configured": true, 00:30:43.515 "data_offset": 0, 00:30:43.515 "data_size": 65536 00:30:43.515 }, 00:30:43.515 { 00:30:43.515 "name": "BaseBdev3", 00:30:43.515 "uuid": "233647ee-7897-4f9c-a42f-18af7829b9a3", 00:30:43.515 "is_configured": true, 00:30:43.515 "data_offset": 0, 00:30:43.515 "data_size": 65536 00:30:43.515 }, 00:30:43.515 { 00:30:43.515 "name": "BaseBdev4", 00:30:43.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:43.515 "is_configured": false, 00:30:43.515 "data_offset": 0, 00:30:43.515 "data_size": 0 00:30:43.515 } 00:30:43.515 ] 00:30:43.515 }' 00:30:43.515 12:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:43.515 12:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:44.080 12:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:30:44.338 [2024-06-07 12:33:07.839828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:30:44.338 [2024-06-07 12:33:07.840119] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:30:44.338 [2024-06-07 12:33:07.840179] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:30:44.338 [2024-06-07 12:33:07.840452] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:30:44.338 [2024-06-07 12:33:07.840951] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:30:44.338 [2024-06-07 12:33:07.841068] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:30:44.338 [2024-06-07 12:33:07.841398] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:44.338 BaseBdev4 00:30:44.338 12:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:30:44.338 12:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:30:44.338 12:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:44.338 12:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:44.338 12:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- 
# [[ -z '' ]] 00:30:44.338 12:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:44.338 12:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:44.595 12:33:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:30:44.853 [ 00:30:44.853 { 00:30:44.853 "name": "BaseBdev4", 00:30:44.853 "aliases": [ 00:30:44.853 "0190b6ee-645f-4420-a035-0c540449dcdf" 00:30:44.853 ], 00:30:44.853 "product_name": "Malloc disk", 00:30:44.853 "block_size": 512, 00:30:44.853 "num_blocks": 65536, 00:30:44.853 "uuid": "0190b6ee-645f-4420-a035-0c540449dcdf", 00:30:44.853 "assigned_rate_limits": { 00:30:44.853 "rw_ios_per_sec": 0, 00:30:44.853 "rw_mbytes_per_sec": 0, 00:30:44.854 "r_mbytes_per_sec": 0, 00:30:44.854 "w_mbytes_per_sec": 0 00:30:44.854 }, 00:30:44.854 "claimed": true, 00:30:44.854 "claim_type": "exclusive_write", 00:30:44.854 "zoned": false, 00:30:44.854 "supported_io_types": { 00:30:44.854 "read": true, 00:30:44.854 "write": true, 00:30:44.854 "unmap": true, 00:30:44.854 "write_zeroes": true, 00:30:44.854 "flush": true, 00:30:44.854 "reset": true, 00:30:44.854 "compare": false, 00:30:44.854 "compare_and_write": false, 00:30:44.854 "abort": true, 00:30:44.854 "nvme_admin": false, 00:30:44.854 "nvme_io": false 00:30:44.854 }, 00:30:44.854 "memory_domains": [ 00:30:44.854 { 00:30:44.854 "dma_device_id": "system", 00:30:44.854 "dma_device_type": 1 00:30:44.854 }, 00:30:44.854 { 00:30:44.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:44.854 "dma_device_type": 2 00:30:44.854 } 00:30:44.854 ], 00:30:44.854 "driver_specific": {} 00:30:44.854 } 00:30:44.854 ] 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:44.854 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:44.854 12:33:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:45.112 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:45.112 "name": "Existed_Raid", 00:30:45.112 "uuid": "664d057d-ee8e-4838-a60a-de44eb6215ab", 00:30:45.112 "strip_size_kb": 64, 00:30:45.112 "state": "online", 00:30:45.112 "raid_level": "concat", 00:30:45.112 "superblock": false, 00:30:45.112 "num_base_bdevs": 4, 00:30:45.112 "num_base_bdevs_discovered": 4, 00:30:45.113 "num_base_bdevs_operational": 4, 00:30:45.113 "base_bdevs_list": [ 00:30:45.113 { 00:30:45.113 "name": "BaseBdev1", 00:30:45.113 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:45.113 "is_configured": true, 00:30:45.113 "data_offset": 0, 00:30:45.113 "data_size": 65536 00:30:45.113 }, 00:30:45.113 { 00:30:45.113 "name": "BaseBdev2", 00:30:45.113 "uuid": "36f0386f-55ab-47d8-a910-0a82f8fe493c", 00:30:45.113 "is_configured": true, 00:30:45.113 "data_offset": 0, 00:30:45.113 "data_size": 65536 00:30:45.113 }, 00:30:45.113 { 00:30:45.113 "name": "BaseBdev3", 00:30:45.113 "uuid": "233647ee-7897-4f9c-a42f-18af7829b9a3", 00:30:45.113 "is_configured": true, 00:30:45.113 "data_offset": 0, 00:30:45.113 "data_size": 65536 00:30:45.113 }, 00:30:45.113 { 00:30:45.113 "name": "BaseBdev4", 00:30:45.113 "uuid": "0190b6ee-645f-4420-a035-0c540449dcdf", 00:30:45.113 "is_configured": true, 00:30:45.113 "data_offset": 0, 00:30:45.113 "data_size": 65536 00:30:45.113 } 00:30:45.113 ] 00:30:45.113 }' 00:30:45.113 12:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:45.113 12:33:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:46.048 [2024-06-07 12:33:09.572346] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:46.048 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:46.048 "name": "Existed_Raid", 00:30:46.048 "aliases": [ 00:30:46.048 "664d057d-ee8e-4838-a60a-de44eb6215ab" 00:30:46.048 ], 00:30:46.048 "product_name": "Raid Volume", 00:30:46.048 "block_size": 512, 00:30:46.048 "num_blocks": 262144, 00:30:46.048 "uuid": "664d057d-ee8e-4838-a60a-de44eb6215ab", 00:30:46.048 "assigned_rate_limits": { 00:30:46.048 "rw_ios_per_sec": 0, 00:30:46.048 "rw_mbytes_per_sec": 0, 00:30:46.048 "r_mbytes_per_sec": 0, 00:30:46.048 "w_mbytes_per_sec": 0 00:30:46.048 }, 00:30:46.048 "claimed": false, 00:30:46.048 "zoned": false, 00:30:46.048 "supported_io_types": { 00:30:46.048 "read": true, 
00:30:46.048 "write": true, 00:30:46.048 "unmap": true, 00:30:46.048 "write_zeroes": true, 00:30:46.048 "flush": true, 00:30:46.048 "reset": true, 00:30:46.048 "compare": false, 00:30:46.048 "compare_and_write": false, 00:30:46.048 "abort": false, 00:30:46.048 "nvme_admin": false, 00:30:46.048 "nvme_io": false 00:30:46.048 }, 00:30:46.048 "memory_domains": [ 00:30:46.048 { 00:30:46.048 "dma_device_id": "system", 00:30:46.048 "dma_device_type": 1 00:30:46.048 }, 00:30:46.048 { 00:30:46.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.048 "dma_device_type": 2 00:30:46.048 }, 00:30:46.048 { 00:30:46.048 "dma_device_id": "system", 00:30:46.048 "dma_device_type": 1 00:30:46.048 }, 00:30:46.048 { 00:30:46.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.048 "dma_device_type": 2 00:30:46.048 }, 00:30:46.048 { 00:30:46.048 "dma_device_id": "system", 00:30:46.048 "dma_device_type": 1 00:30:46.048 }, 00:30:46.048 { 00:30:46.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.048 "dma_device_type": 2 00:30:46.048 }, 00:30:46.048 { 00:30:46.048 "dma_device_id": "system", 00:30:46.048 "dma_device_type": 1 00:30:46.048 }, 00:30:46.048 { 00:30:46.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.048 "dma_device_type": 2 00:30:46.048 } 00:30:46.048 ], 00:30:46.048 "driver_specific": { 00:30:46.048 "raid": { 00:30:46.048 "uuid": "664d057d-ee8e-4838-a60a-de44eb6215ab", 00:30:46.048 "strip_size_kb": 64, 00:30:46.048 "state": "online", 00:30:46.048 "raid_level": "concat", 00:30:46.048 "superblock": false, 00:30:46.048 "num_base_bdevs": 4, 00:30:46.048 "num_base_bdevs_discovered": 4, 00:30:46.048 "num_base_bdevs_operational": 4, 00:30:46.048 "base_bdevs_list": [ 00:30:46.048 { 00:30:46.048 "name": "BaseBdev1", 00:30:46.048 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:46.049 "is_configured": true, 00:30:46.049 "data_offset": 0, 00:30:46.049 "data_size": 65536 00:30:46.049 }, 00:30:46.049 { 00:30:46.049 "name": "BaseBdev2", 00:30:46.049 "uuid": "36f0386f-55ab-47d8-a910-0a82f8fe493c", 00:30:46.049 "is_configured": true, 00:30:46.049 "data_offset": 0, 00:30:46.049 "data_size": 65536 00:30:46.049 }, 00:30:46.049 { 00:30:46.049 "name": "BaseBdev3", 00:30:46.049 "uuid": "233647ee-7897-4f9c-a42f-18af7829b9a3", 00:30:46.049 "is_configured": true, 00:30:46.049 "data_offset": 0, 00:30:46.049 "data_size": 65536 00:30:46.049 }, 00:30:46.049 { 00:30:46.049 "name": "BaseBdev4", 00:30:46.049 "uuid": "0190b6ee-645f-4420-a035-0c540449dcdf", 00:30:46.049 "is_configured": true, 00:30:46.049 "data_offset": 0, 00:30:46.049 "data_size": 65536 00:30:46.049 } 00:30:46.049 ] 00:30:46.049 } 00:30:46.049 } 00:30:46.049 }' 00:30:46.049 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:46.049 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:30:46.049 BaseBdev2 00:30:46.049 BaseBdev3 00:30:46.049 BaseBdev4' 00:30:46.049 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:46.049 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:46.049 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:30:46.306 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:46.307 "name": "BaseBdev1", 
00:30:46.307 "aliases": [ 00:30:46.307 "a1a33dd5-a8d9-493d-868a-f7578d01eba5" 00:30:46.307 ], 00:30:46.307 "product_name": "Malloc disk", 00:30:46.307 "block_size": 512, 00:30:46.307 "num_blocks": 65536, 00:30:46.307 "uuid": "a1a33dd5-a8d9-493d-868a-f7578d01eba5", 00:30:46.307 "assigned_rate_limits": { 00:30:46.307 "rw_ios_per_sec": 0, 00:30:46.307 "rw_mbytes_per_sec": 0, 00:30:46.307 "r_mbytes_per_sec": 0, 00:30:46.307 "w_mbytes_per_sec": 0 00:30:46.307 }, 00:30:46.307 "claimed": true, 00:30:46.307 "claim_type": "exclusive_write", 00:30:46.307 "zoned": false, 00:30:46.307 "supported_io_types": { 00:30:46.307 "read": true, 00:30:46.307 "write": true, 00:30:46.307 "unmap": true, 00:30:46.307 "write_zeroes": true, 00:30:46.307 "flush": true, 00:30:46.307 "reset": true, 00:30:46.307 "compare": false, 00:30:46.307 "compare_and_write": false, 00:30:46.307 "abort": true, 00:30:46.307 "nvme_admin": false, 00:30:46.307 "nvme_io": false 00:30:46.307 }, 00:30:46.307 "memory_domains": [ 00:30:46.307 { 00:30:46.307 "dma_device_id": "system", 00:30:46.307 "dma_device_type": 1 00:30:46.307 }, 00:30:46.307 { 00:30:46.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:46.307 "dma_device_type": 2 00:30:46.307 } 00:30:46.307 ], 00:30:46.307 "driver_specific": {} 00:30:46.307 }' 00:30:46.307 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:46.564 12:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:46.564 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:46.564 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:46.564 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:46.564 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:46.564 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:46.564 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:46.832 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:46.832 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:46.832 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:46.832 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:46.832 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:46.832 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:46.832 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:30:47.109 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:47.109 "name": "BaseBdev2", 00:30:47.109 "aliases": [ 00:30:47.109 "36f0386f-55ab-47d8-a910-0a82f8fe493c" 00:30:47.109 ], 00:30:47.109 "product_name": "Malloc disk", 00:30:47.109 "block_size": 512, 00:30:47.109 "num_blocks": 65536, 00:30:47.109 "uuid": "36f0386f-55ab-47d8-a910-0a82f8fe493c", 00:30:47.109 "assigned_rate_limits": { 00:30:47.109 "rw_ios_per_sec": 0, 00:30:47.109 "rw_mbytes_per_sec": 0, 00:30:47.109 "r_mbytes_per_sec": 0, 00:30:47.109 "w_mbytes_per_sec": 0 00:30:47.109 }, 00:30:47.109 "claimed": true, 
00:30:47.109 "claim_type": "exclusive_write", 00:30:47.109 "zoned": false, 00:30:47.109 "supported_io_types": { 00:30:47.109 "read": true, 00:30:47.109 "write": true, 00:30:47.109 "unmap": true, 00:30:47.109 "write_zeroes": true, 00:30:47.109 "flush": true, 00:30:47.109 "reset": true, 00:30:47.109 "compare": false, 00:30:47.109 "compare_and_write": false, 00:30:47.109 "abort": true, 00:30:47.109 "nvme_admin": false, 00:30:47.109 "nvme_io": false 00:30:47.109 }, 00:30:47.109 "memory_domains": [ 00:30:47.109 { 00:30:47.109 "dma_device_id": "system", 00:30:47.109 "dma_device_type": 1 00:30:47.109 }, 00:30:47.109 { 00:30:47.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:47.109 "dma_device_type": 2 00:30:47.109 } 00:30:47.109 ], 00:30:47.109 "driver_specific": {} 00:30:47.109 }' 00:30:47.109 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.109 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.109 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:47.109 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:47.367 12:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:47.367 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:30:47.367 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:47.934 "name": "BaseBdev3", 00:30:47.934 "aliases": [ 00:30:47.934 "233647ee-7897-4f9c-a42f-18af7829b9a3" 00:30:47.934 ], 00:30:47.934 "product_name": "Malloc disk", 00:30:47.934 "block_size": 512, 00:30:47.934 "num_blocks": 65536, 00:30:47.934 "uuid": "233647ee-7897-4f9c-a42f-18af7829b9a3", 00:30:47.934 "assigned_rate_limits": { 00:30:47.934 "rw_ios_per_sec": 0, 00:30:47.934 "rw_mbytes_per_sec": 0, 00:30:47.934 "r_mbytes_per_sec": 0, 00:30:47.934 "w_mbytes_per_sec": 0 00:30:47.934 }, 00:30:47.934 "claimed": true, 00:30:47.934 "claim_type": "exclusive_write", 00:30:47.934 "zoned": false, 00:30:47.934 "supported_io_types": { 00:30:47.934 "read": true, 00:30:47.934 "write": true, 00:30:47.934 "unmap": true, 00:30:47.934 "write_zeroes": true, 00:30:47.934 "flush": true, 00:30:47.934 "reset": true, 00:30:47.934 "compare": false, 00:30:47.934 "compare_and_write": false, 00:30:47.934 "abort": true, 00:30:47.934 "nvme_admin": false, 00:30:47.934 "nvme_io": false 00:30:47.934 }, 00:30:47.934 "memory_domains": [ 
00:30:47.934 { 00:30:47.934 "dma_device_id": "system", 00:30:47.934 "dma_device_type": 1 00:30:47.934 }, 00:30:47.934 { 00:30:47.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:47.934 "dma_device_type": 2 00:30:47.934 } 00:30:47.934 ], 00:30:47.934 "driver_specific": {} 00:30:47.934 }' 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:47.934 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:48.192 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:48.192 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:48.192 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:48.192 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:30:48.450 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:48.450 "name": "BaseBdev4", 00:30:48.450 "aliases": [ 00:30:48.450 "0190b6ee-645f-4420-a035-0c540449dcdf" 00:30:48.450 ], 00:30:48.450 "product_name": "Malloc disk", 00:30:48.450 "block_size": 512, 00:30:48.450 "num_blocks": 65536, 00:30:48.450 "uuid": "0190b6ee-645f-4420-a035-0c540449dcdf", 00:30:48.450 "assigned_rate_limits": { 00:30:48.450 "rw_ios_per_sec": 0, 00:30:48.450 "rw_mbytes_per_sec": 0, 00:30:48.450 "r_mbytes_per_sec": 0, 00:30:48.450 "w_mbytes_per_sec": 0 00:30:48.450 }, 00:30:48.450 "claimed": true, 00:30:48.450 "claim_type": "exclusive_write", 00:30:48.450 "zoned": false, 00:30:48.450 "supported_io_types": { 00:30:48.450 "read": true, 00:30:48.450 "write": true, 00:30:48.450 "unmap": true, 00:30:48.450 "write_zeroes": true, 00:30:48.450 "flush": true, 00:30:48.450 "reset": true, 00:30:48.450 "compare": false, 00:30:48.450 "compare_and_write": false, 00:30:48.450 "abort": true, 00:30:48.450 "nvme_admin": false, 00:30:48.450 "nvme_io": false 00:30:48.450 }, 00:30:48.450 "memory_domains": [ 00:30:48.450 { 00:30:48.450 "dma_device_id": "system", 00:30:48.450 "dma_device_type": 1 00:30:48.450 }, 00:30:48.450 { 00:30:48.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:48.450 "dma_device_type": 2 00:30:48.450 } 00:30:48.450 ], 00:30:48.450 "driver_specific": {} 00:30:48.450 }' 00:30:48.450 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:48.450 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
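The repeated checks in this stretch of the transcript all follow one pattern: dump a bdev's JSON descriptor over the test's RPC socket, pull out a single field with jq, and compare it against the expected literal (the transcript shows two jq invocations feeding each `[[ ... == ... ]]` comparison). A minimal standalone sketch of that pattern, assuming the same RPC socket path used by this run and a hypothetical bdev name ExampleBdev, looks like this:

    #!/usr/bin/env bash
    # Sketch only: reproduces the property-check pattern seen in this log,
    # not the test suite's own helper functions.
    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Create a 32 MiB malloc bdev with 512-byte blocks, matching the
    # "bdev_malloc_create 32 512 -b ..." calls in the transcript.
    $RPC bdev_malloc_create 32 512 -b ExampleBdev

    # Dump the bdev's JSON descriptor (the same call that produced the dumps above).
    info=$($RPC bdev_get_bdevs -b ExampleBdev -t 2000)

    # Extract one property and compare it with the expected value,
    # mirroring the "[[ 512 == 512 ]]" checks in the transcript.
    block_size=$(jq '.[].block_size' <<< "$info")
    [[ "$block_size" == 512 ]] || echo "unexpected block_size: $block_size"

    # Clean up the scratch bdev.
    $RPC bdev_malloc_delete ExampleBdev

The md_size and dif_type checks that follow work the same way; only the jq filter and the expected literal change.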
00:30:48.450 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:30:48.450 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:48.450 12:33:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:48.450 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:30:48.450 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:48.450 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:48.709 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:30:48.709 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:48.709 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:48.709 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:30:48.709 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:48.967 [2024-06-07 12:33:12.496712] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:48.967 [2024-06-07 12:33:12.496969] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:48.967 [2024-06-07 12:33:12.497150] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:48.967 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:30:49.226 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:49.226 "name": "Existed_Raid", 00:30:49.226 "uuid": "664d057d-ee8e-4838-a60a-de44eb6215ab", 00:30:49.226 "strip_size_kb": 64, 00:30:49.226 "state": "offline", 00:30:49.226 "raid_level": "concat", 00:30:49.226 "superblock": false, 00:30:49.226 "num_base_bdevs": 4, 00:30:49.226 "num_base_bdevs_discovered": 3, 00:30:49.226 "num_base_bdevs_operational": 3, 00:30:49.226 "base_bdevs_list": [ 00:30:49.226 { 00:30:49.226 "name": null, 00:30:49.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:49.226 "is_configured": false, 00:30:49.226 "data_offset": 0, 00:30:49.226 "data_size": 65536 00:30:49.226 }, 00:30:49.226 { 00:30:49.226 "name": "BaseBdev2", 00:30:49.226 "uuid": "36f0386f-55ab-47d8-a910-0a82f8fe493c", 00:30:49.226 "is_configured": true, 00:30:49.226 "data_offset": 0, 00:30:49.226 "data_size": 65536 00:30:49.226 }, 00:30:49.226 { 00:30:49.226 "name": "BaseBdev3", 00:30:49.226 "uuid": "233647ee-7897-4f9c-a42f-18af7829b9a3", 00:30:49.226 "is_configured": true, 00:30:49.226 "data_offset": 0, 00:30:49.226 "data_size": 65536 00:30:49.226 }, 00:30:49.226 { 00:30:49.226 "name": "BaseBdev4", 00:30:49.226 "uuid": "0190b6ee-645f-4420-a035-0c540449dcdf", 00:30:49.226 "is_configured": true, 00:30:49.226 "data_offset": 0, 00:30:49.226 "data_size": 65536 00:30:49.226 } 00:30:49.226 ] 00:30:49.226 }' 00:30:49.226 12:33:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:49.226 12:33:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:49.793 12:33:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:30:49.793 12:33:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:49.793 12:33:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:49.793 12:33:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:50.360 12:33:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:50.360 12:33:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:50.360 12:33:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:30:50.618 [2024-06-07 12:33:14.092043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:50.618 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:50.618 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:50.618 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:50.618 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:50.877 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:50.877 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:50.877 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:30:51.135 [2024-06-07 12:33:14.586980] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:30:51.135 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:51.135 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:51.135 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.135 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:51.394 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:51.394 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:51.394 12:33:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:30:51.653 [2024-06-07 12:33:15.092983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:30:51.653 [2024-06-07 12:33:15.093305] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:30:51.653 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:51.653 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:51.653 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.653 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:30:51.912 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:30:51.912 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:30:51.912 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:30:51.912 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:30:51.912 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:51.912 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:30:52.171 BaseBdev2 00:30:52.171 12:33:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:30:52.171 12:33:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:30:52.171 12:33:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:52.171 12:33:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:52.171 12:33:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:52.171 12:33:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:52.171 12:33:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:52.792 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:52.792 [ 00:30:52.792 { 00:30:52.792 "name": "BaseBdev2", 00:30:52.792 "aliases": [ 00:30:52.792 "05621067-10c6-4b64-9ab2-bd43cd4ca223" 00:30:52.792 ], 00:30:52.792 "product_name": "Malloc disk", 00:30:52.792 "block_size": 512, 00:30:52.792 "num_blocks": 65536, 00:30:52.792 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:30:52.792 "assigned_rate_limits": { 00:30:52.792 "rw_ios_per_sec": 0, 00:30:52.792 "rw_mbytes_per_sec": 0, 00:30:52.792 "r_mbytes_per_sec": 0, 00:30:52.792 "w_mbytes_per_sec": 0 00:30:52.792 }, 00:30:52.792 "claimed": false, 00:30:52.792 "zoned": false, 00:30:52.792 "supported_io_types": { 00:30:52.792 "read": true, 00:30:52.792 "write": true, 00:30:52.792 "unmap": true, 00:30:52.792 "write_zeroes": true, 00:30:52.792 "flush": true, 00:30:52.792 "reset": true, 00:30:52.792 "compare": false, 00:30:52.792 "compare_and_write": false, 00:30:52.792 "abort": true, 00:30:52.792 "nvme_admin": false, 00:30:52.792 "nvme_io": false 00:30:52.792 }, 00:30:52.792 "memory_domains": [ 00:30:52.792 { 00:30:52.792 "dma_device_id": "system", 00:30:52.792 "dma_device_type": 1 00:30:52.792 }, 00:30:52.792 { 00:30:52.792 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:52.792 "dma_device_type": 2 00:30:52.792 } 00:30:52.792 ], 00:30:52.792 "driver_specific": {} 00:30:52.792 } 00:30:52.792 ] 00:30:52.792 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:52.792 12:33:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:30:52.792 12:33:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:52.792 12:33:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:30:53.051 BaseBdev3 00:30:53.051 12:33:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:30:53.051 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:30:53.051 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:53.051 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:53.051 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:53.051 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:53.051 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:53.310 12:33:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:30:53.568 [ 00:30:53.568 { 00:30:53.568 "name": "BaseBdev3", 00:30:53.568 "aliases": [ 00:30:53.568 "0efe41d6-94f9-48d0-bfcd-a378881d380f" 00:30:53.568 ], 00:30:53.568 "product_name": "Malloc disk", 00:30:53.568 "block_size": 512, 00:30:53.568 "num_blocks": 65536, 00:30:53.568 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:30:53.568 "assigned_rate_limits": { 00:30:53.568 "rw_ios_per_sec": 0, 00:30:53.568 "rw_mbytes_per_sec": 0, 00:30:53.568 "r_mbytes_per_sec": 0, 00:30:53.568 "w_mbytes_per_sec": 0 00:30:53.568 }, 00:30:53.568 
"claimed": false, 00:30:53.568 "zoned": false, 00:30:53.568 "supported_io_types": { 00:30:53.568 "read": true, 00:30:53.568 "write": true, 00:30:53.568 "unmap": true, 00:30:53.568 "write_zeroes": true, 00:30:53.568 "flush": true, 00:30:53.568 "reset": true, 00:30:53.568 "compare": false, 00:30:53.568 "compare_and_write": false, 00:30:53.568 "abort": true, 00:30:53.568 "nvme_admin": false, 00:30:53.568 "nvme_io": false 00:30:53.568 }, 00:30:53.568 "memory_domains": [ 00:30:53.568 { 00:30:53.568 "dma_device_id": "system", 00:30:53.568 "dma_device_type": 1 00:30:53.568 }, 00:30:53.568 { 00:30:53.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:53.568 "dma_device_type": 2 00:30:53.568 } 00:30:53.568 ], 00:30:53.568 "driver_specific": {} 00:30:53.568 } 00:30:53.568 ] 00:30:53.568 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:53.568 12:33:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:30:53.568 12:33:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:53.568 12:33:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:30:54.135 BaseBdev4 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:54.135 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:30:54.394 [ 00:30:54.394 { 00:30:54.394 "name": "BaseBdev4", 00:30:54.394 "aliases": [ 00:30:54.394 "21328411-181a-4371-a14c-3113dba33ca0" 00:30:54.394 ], 00:30:54.394 "product_name": "Malloc disk", 00:30:54.394 "block_size": 512, 00:30:54.394 "num_blocks": 65536, 00:30:54.394 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:30:54.394 "assigned_rate_limits": { 00:30:54.394 "rw_ios_per_sec": 0, 00:30:54.394 "rw_mbytes_per_sec": 0, 00:30:54.394 "r_mbytes_per_sec": 0, 00:30:54.394 "w_mbytes_per_sec": 0 00:30:54.394 }, 00:30:54.394 "claimed": false, 00:30:54.394 "zoned": false, 00:30:54.394 "supported_io_types": { 00:30:54.394 "read": true, 00:30:54.394 "write": true, 00:30:54.394 "unmap": true, 00:30:54.394 "write_zeroes": true, 00:30:54.394 "flush": true, 00:30:54.394 "reset": true, 00:30:54.394 "compare": false, 00:30:54.394 "compare_and_write": false, 00:30:54.394 "abort": true, 00:30:54.394 "nvme_admin": false, 00:30:54.394 "nvme_io": false 00:30:54.394 }, 00:30:54.394 "memory_domains": [ 00:30:54.394 { 00:30:54.394 "dma_device_id": "system", 00:30:54.394 "dma_device_type": 1 00:30:54.394 }, 00:30:54.394 { 00:30:54.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:30:54.394 "dma_device_type": 2 00:30:54.394 } 00:30:54.394 ], 00:30:54.394 "driver_specific": {} 00:30:54.394 } 00:30:54.394 ] 00:30:54.394 12:33:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:54.394 12:33:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:30:54.394 12:33:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:30:54.394 12:33:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:30:54.652 [2024-06-07 12:33:18.223900] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:54.652 [2024-06-07 12:33:18.224721] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:54.652 [2024-06-07 12:33:18.224895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:54.652 [2024-06-07 12:33:18.227191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:54.652 [2024-06-07 12:33:18.227393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:54.652 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:54.911 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:54.911 "name": "Existed_Raid", 00:30:54.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:54.911 "strip_size_kb": 64, 00:30:54.911 "state": "configuring", 00:30:54.911 "raid_level": "concat", 00:30:54.911 "superblock": false, 00:30:54.911 "num_base_bdevs": 4, 00:30:54.911 "num_base_bdevs_discovered": 3, 00:30:54.911 "num_base_bdevs_operational": 4, 00:30:54.911 "base_bdevs_list": [ 00:30:54.911 { 00:30:54.911 "name": "BaseBdev1", 00:30:54.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:54.911 "is_configured": false, 00:30:54.911 "data_offset": 0, 00:30:54.911 "data_size": 0 00:30:54.911 }, 00:30:54.911 { 
00:30:54.911 "name": "BaseBdev2", 00:30:54.911 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:30:54.911 "is_configured": true, 00:30:54.911 "data_offset": 0, 00:30:54.911 "data_size": 65536 00:30:54.911 }, 00:30:54.911 { 00:30:54.911 "name": "BaseBdev3", 00:30:54.911 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:30:54.911 "is_configured": true, 00:30:54.911 "data_offset": 0, 00:30:54.911 "data_size": 65536 00:30:54.911 }, 00:30:54.911 { 00:30:54.911 "name": "BaseBdev4", 00:30:54.911 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:30:54.911 "is_configured": true, 00:30:54.911 "data_offset": 0, 00:30:54.911 "data_size": 65536 00:30:54.911 } 00:30:54.911 ] 00:30:54.911 }' 00:30:54.911 12:33:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:54.911 12:33:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:55.478 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:30:56.045 [2024-06-07 12:33:19.384009] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:56.045 "name": "Existed_Raid", 00:30:56.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:56.045 "strip_size_kb": 64, 00:30:56.045 "state": "configuring", 00:30:56.045 "raid_level": "concat", 00:30:56.045 "superblock": false, 00:30:56.045 "num_base_bdevs": 4, 00:30:56.045 "num_base_bdevs_discovered": 2, 00:30:56.045 "num_base_bdevs_operational": 4, 00:30:56.045 "base_bdevs_list": [ 00:30:56.045 { 00:30:56.045 "name": "BaseBdev1", 00:30:56.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:56.045 "is_configured": false, 00:30:56.045 "data_offset": 0, 00:30:56.045 "data_size": 0 00:30:56.045 }, 00:30:56.045 { 00:30:56.045 "name": null, 00:30:56.045 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:30:56.045 "is_configured": false, 00:30:56.045 
"data_offset": 0, 00:30:56.045 "data_size": 65536 00:30:56.045 }, 00:30:56.045 { 00:30:56.045 "name": "BaseBdev3", 00:30:56.045 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:30:56.045 "is_configured": true, 00:30:56.045 "data_offset": 0, 00:30:56.045 "data_size": 65536 00:30:56.045 }, 00:30:56.045 { 00:30:56.045 "name": "BaseBdev4", 00:30:56.045 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:30:56.045 "is_configured": true, 00:30:56.045 "data_offset": 0, 00:30:56.045 "data_size": 65536 00:30:56.045 } 00:30:56.045 ] 00:30:56.045 }' 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:56.045 12:33:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:56.980 12:33:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.980 12:33:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:30:56.980 12:33:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:30:56.980 12:33:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:30:57.239 [2024-06-07 12:33:20.813831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:57.239 BaseBdev1 00:30:57.239 12:33:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:30:57.239 12:33:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:30:57.239 12:33:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:30:57.239 12:33:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:30:57.239 12:33:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:30:57.239 12:33:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:30:57.239 12:33:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:57.498 12:33:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:57.755 [ 00:30:57.755 { 00:30:57.755 "name": "BaseBdev1", 00:30:57.755 "aliases": [ 00:30:57.755 "e02ce86d-f030-43b6-9dbd-57f9fc952e8d" 00:30:57.755 ], 00:30:57.755 "product_name": "Malloc disk", 00:30:57.755 "block_size": 512, 00:30:57.755 "num_blocks": 65536, 00:30:57.755 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:30:57.755 "assigned_rate_limits": { 00:30:57.755 "rw_ios_per_sec": 0, 00:30:57.755 "rw_mbytes_per_sec": 0, 00:30:57.755 "r_mbytes_per_sec": 0, 00:30:57.755 "w_mbytes_per_sec": 0 00:30:57.755 }, 00:30:57.755 "claimed": true, 00:30:57.755 "claim_type": "exclusive_write", 00:30:57.755 "zoned": false, 00:30:57.755 "supported_io_types": { 00:30:57.755 "read": true, 00:30:57.755 "write": true, 00:30:57.755 "unmap": true, 00:30:57.755 "write_zeroes": true, 00:30:57.755 "flush": true, 00:30:57.755 "reset": true, 00:30:57.755 "compare": false, 00:30:57.755 "compare_and_write": false, 00:30:57.755 "abort": true, 00:30:57.755 "nvme_admin": false, 
00:30:57.755 "nvme_io": false 00:30:57.755 }, 00:30:57.755 "memory_domains": [ 00:30:57.755 { 00:30:57.755 "dma_device_id": "system", 00:30:57.755 "dma_device_type": 1 00:30:57.755 }, 00:30:57.755 { 00:30:57.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:57.755 "dma_device_type": 2 00:30:57.755 } 00:30:57.755 ], 00:30:57.755 "driver_specific": {} 00:30:57.755 } 00:30:57.755 ] 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:58.028 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:58.028 "name": "Existed_Raid", 00:30:58.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:58.028 "strip_size_kb": 64, 00:30:58.028 "state": "configuring", 00:30:58.028 "raid_level": "concat", 00:30:58.028 "superblock": false, 00:30:58.028 "num_base_bdevs": 4, 00:30:58.028 "num_base_bdevs_discovered": 3, 00:30:58.028 "num_base_bdevs_operational": 4, 00:30:58.028 "base_bdevs_list": [ 00:30:58.028 { 00:30:58.028 "name": "BaseBdev1", 00:30:58.028 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:30:58.028 "is_configured": true, 00:30:58.028 "data_offset": 0, 00:30:58.028 "data_size": 65536 00:30:58.028 }, 00:30:58.028 { 00:30:58.028 "name": null, 00:30:58.028 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:30:58.028 "is_configured": false, 00:30:58.028 "data_offset": 0, 00:30:58.028 "data_size": 65536 00:30:58.028 }, 00:30:58.028 { 00:30:58.028 "name": "BaseBdev3", 00:30:58.028 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:30:58.028 "is_configured": true, 00:30:58.028 "data_offset": 0, 00:30:58.028 "data_size": 65536 00:30:58.028 }, 00:30:58.028 { 00:30:58.028 "name": "BaseBdev4", 00:30:58.028 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:30:58.028 "is_configured": true, 00:30:58.028 "data_offset": 0, 00:30:58.028 "data_size": 65536 00:30:58.028 } 00:30:58.028 ] 00:30:58.029 }' 00:30:58.029 12:33:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:58.029 12:33:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:30:58.593 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:30:58.593 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.851 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:30:58.851 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:30:59.141 [2024-06-07 12:33:22.582183] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:59.141 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:59.400 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:59.400 "name": "Existed_Raid", 00:30:59.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:59.400 "strip_size_kb": 64, 00:30:59.400 "state": "configuring", 00:30:59.400 "raid_level": "concat", 00:30:59.400 "superblock": false, 00:30:59.400 "num_base_bdevs": 4, 00:30:59.400 "num_base_bdevs_discovered": 2, 00:30:59.400 "num_base_bdevs_operational": 4, 00:30:59.400 "base_bdevs_list": [ 00:30:59.400 { 00:30:59.400 "name": "BaseBdev1", 00:30:59.400 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:30:59.400 "is_configured": true, 00:30:59.400 "data_offset": 0, 00:30:59.400 "data_size": 65536 00:30:59.400 }, 00:30:59.400 { 00:30:59.400 "name": null, 00:30:59.400 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:30:59.400 "is_configured": false, 00:30:59.400 "data_offset": 0, 00:30:59.400 "data_size": 65536 00:30:59.400 }, 00:30:59.400 { 00:30:59.400 "name": null, 00:30:59.400 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:30:59.400 "is_configured": false, 00:30:59.400 "data_offset": 0, 00:30:59.401 "data_size": 65536 00:30:59.401 }, 00:30:59.401 { 00:30:59.401 "name": "BaseBdev4", 00:30:59.401 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:30:59.401 "is_configured": true, 
00:30:59.401 "data_offset": 0, 00:30:59.401 "data_size": 65536 00:30:59.401 } 00:30:59.401 ] 00:30:59.401 }' 00:30:59.401 12:33:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:59.401 12:33:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:30:59.970 12:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:30:59.970 12:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:00.228 12:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:31:00.228 12:33:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:31:00.487 [2024-06-07 12:33:23.980776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:00.487 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:00.745 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:00.745 "name": "Existed_Raid", 00:31:00.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:00.745 "strip_size_kb": 64, 00:31:00.745 "state": "configuring", 00:31:00.745 "raid_level": "concat", 00:31:00.745 "superblock": false, 00:31:00.745 "num_base_bdevs": 4, 00:31:00.745 "num_base_bdevs_discovered": 3, 00:31:00.745 "num_base_bdevs_operational": 4, 00:31:00.745 "base_bdevs_list": [ 00:31:00.745 { 00:31:00.745 "name": "BaseBdev1", 00:31:00.745 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:31:00.745 "is_configured": true, 00:31:00.745 "data_offset": 0, 00:31:00.745 "data_size": 65536 00:31:00.745 }, 00:31:00.745 { 00:31:00.745 "name": null, 00:31:00.745 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:31:00.745 "is_configured": false, 00:31:00.745 "data_offset": 0, 00:31:00.745 "data_size": 65536 00:31:00.745 }, 00:31:00.745 { 00:31:00.745 "name": "BaseBdev3", 00:31:00.745 
"uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:31:00.745 "is_configured": true, 00:31:00.745 "data_offset": 0, 00:31:00.745 "data_size": 65536 00:31:00.745 }, 00:31:00.745 { 00:31:00.745 "name": "BaseBdev4", 00:31:00.745 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:31:00.745 "is_configured": true, 00:31:00.745 "data_offset": 0, 00:31:00.745 "data_size": 65536 00:31:00.745 } 00:31:00.745 ] 00:31:00.745 }' 00:31:00.745 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:00.745 12:33:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:01.314 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:01.314 12:33:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:31:01.573 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:31:01.573 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:31:01.831 [2024-06-07 12:33:25.377006] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:01.831 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:02.090 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:02.090 "name": "Existed_Raid", 00:31:02.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:02.090 "strip_size_kb": 64, 00:31:02.090 "state": "configuring", 00:31:02.090 "raid_level": "concat", 00:31:02.090 "superblock": false, 00:31:02.090 "num_base_bdevs": 4, 00:31:02.090 "num_base_bdevs_discovered": 2, 00:31:02.090 "num_base_bdevs_operational": 4, 00:31:02.090 "base_bdevs_list": [ 00:31:02.090 { 00:31:02.090 "name": null, 00:31:02.090 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:31:02.090 "is_configured": false, 00:31:02.090 "data_offset": 0, 00:31:02.090 "data_size": 65536 00:31:02.090 }, 00:31:02.090 { 
00:31:02.090 "name": null, 00:31:02.090 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:31:02.090 "is_configured": false, 00:31:02.090 "data_offset": 0, 00:31:02.090 "data_size": 65536 00:31:02.090 }, 00:31:02.090 { 00:31:02.090 "name": "BaseBdev3", 00:31:02.090 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:31:02.090 "is_configured": true, 00:31:02.090 "data_offset": 0, 00:31:02.090 "data_size": 65536 00:31:02.090 }, 00:31:02.090 { 00:31:02.090 "name": "BaseBdev4", 00:31:02.090 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:31:02.090 "is_configured": true, 00:31:02.090 "data_offset": 0, 00:31:02.090 "data_size": 65536 00:31:02.090 } 00:31:02.090 ] 00:31:02.090 }' 00:31:02.090 12:33:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:02.091 12:33:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:02.661 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:02.661 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:31:02.919 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:31:02.919 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:31:03.178 [2024-06-07 12:33:26.800901] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:03.178 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:03.437 12:33:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:03.437 12:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:03.437 "name": "Existed_Raid", 00:31:03.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:03.437 "strip_size_kb": 64, 00:31:03.437 "state": "configuring", 00:31:03.437 "raid_level": "concat", 00:31:03.437 "superblock": false, 00:31:03.437 "num_base_bdevs": 4, 00:31:03.437 "num_base_bdevs_discovered": 3, 00:31:03.437 
"num_base_bdevs_operational": 4, 00:31:03.437 "base_bdevs_list": [ 00:31:03.437 { 00:31:03.437 "name": null, 00:31:03.437 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:31:03.437 "is_configured": false, 00:31:03.437 "data_offset": 0, 00:31:03.437 "data_size": 65536 00:31:03.437 }, 00:31:03.437 { 00:31:03.437 "name": "BaseBdev2", 00:31:03.437 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:31:03.437 "is_configured": true, 00:31:03.437 "data_offset": 0, 00:31:03.437 "data_size": 65536 00:31:03.437 }, 00:31:03.437 { 00:31:03.437 "name": "BaseBdev3", 00:31:03.437 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:31:03.437 "is_configured": true, 00:31:03.437 "data_offset": 0, 00:31:03.437 "data_size": 65536 00:31:03.437 }, 00:31:03.437 { 00:31:03.437 "name": "BaseBdev4", 00:31:03.437 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:31:03.437 "is_configured": true, 00:31:03.437 "data_offset": 0, 00:31:03.437 "data_size": 65536 00:31:03.437 } 00:31:03.437 ] 00:31:03.437 }' 00:31:03.437 12:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:03.437 12:33:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:04.394 12:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:04.394 12:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:31:04.394 12:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:31:04.394 12:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:04.394 12:33:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:31:04.652 12:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e02ce86d-f030-43b6-9dbd-57f9fc952e8d 00:31:04.910 [2024-06-07 12:33:28.534472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:31:04.910 [2024-06-07 12:33:28.534535] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:31:04.910 [2024-06-07 12:33:28.534544] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:31:04.910 [2024-06-07 12:33:28.534615] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:31:04.910 [2024-06-07 12:33:28.534834] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:31:04.910 [2024-06-07 12:33:28.534845] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000008180 00:31:04.910 [2024-06-07 12:33:28.534989] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:04.910 NewBaseBdev 00:31:05.168 12:33:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:31:05.168 12:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:31:05.168 12:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:05.168 12:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 
00:31:05.168 12:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:05.168 12:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:05.168 12:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:05.427 12:33:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:31:05.427 [ 00:31:05.427 { 00:31:05.427 "name": "NewBaseBdev", 00:31:05.427 "aliases": [ 00:31:05.427 "e02ce86d-f030-43b6-9dbd-57f9fc952e8d" 00:31:05.427 ], 00:31:05.427 "product_name": "Malloc disk", 00:31:05.427 "block_size": 512, 00:31:05.427 "num_blocks": 65536, 00:31:05.427 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:31:05.427 "assigned_rate_limits": { 00:31:05.427 "rw_ios_per_sec": 0, 00:31:05.427 "rw_mbytes_per_sec": 0, 00:31:05.427 "r_mbytes_per_sec": 0, 00:31:05.427 "w_mbytes_per_sec": 0 00:31:05.427 }, 00:31:05.427 "claimed": true, 00:31:05.427 "claim_type": "exclusive_write", 00:31:05.427 "zoned": false, 00:31:05.427 "supported_io_types": { 00:31:05.427 "read": true, 00:31:05.427 "write": true, 00:31:05.427 "unmap": true, 00:31:05.427 "write_zeroes": true, 00:31:05.427 "flush": true, 00:31:05.427 "reset": true, 00:31:05.427 "compare": false, 00:31:05.427 "compare_and_write": false, 00:31:05.427 "abort": true, 00:31:05.427 "nvme_admin": false, 00:31:05.427 "nvme_io": false 00:31:05.427 }, 00:31:05.427 "memory_domains": [ 00:31:05.427 { 00:31:05.427 "dma_device_id": "system", 00:31:05.427 "dma_device_type": 1 00:31:05.427 }, 00:31:05.427 { 00:31:05.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:05.427 "dma_device_type": 2 00:31:05.427 } 00:31:05.427 ], 00:31:05.427 "driver_specific": {} 00:31:05.427 } 00:31:05.427 ] 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.427 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
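Every verify_raid_bdev_state call in this test reduces to the same pattern: dump all raid bdevs over the RPC socket, pick the one under test by name with jq, and compare its fields to the expected values. Here Existed_Raid is expected to be online, level concat, strip size 64, with all 4 base bdevs discovered. A bash sketch of that check (the variable name is illustrative; the RPC and jq filter are taken verbatim from the trace):

  info=$(/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  [[ $(jq -r .state <<< "$info") == online ]]
  [[ $(jq -r .raid_level <<< "$info") == concat ]]
  [[ $(jq -r .strip_size_kb <<< "$info") == 64 ]]
  [[ $(jq -r .num_base_bdevs_discovered <<< "$info") == 4 ]]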
00:31:05.684 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:05.684 "name": "Existed_Raid", 00:31:05.684 "uuid": "5620e6e4-8366-4dcd-b0a2-9cd129210395", 00:31:05.684 "strip_size_kb": 64, 00:31:05.684 "state": "online", 00:31:05.684 "raid_level": "concat", 00:31:05.684 "superblock": false, 00:31:05.684 "num_base_bdevs": 4, 00:31:05.684 "num_base_bdevs_discovered": 4, 00:31:05.684 "num_base_bdevs_operational": 4, 00:31:05.684 "base_bdevs_list": [ 00:31:05.684 { 00:31:05.684 "name": "NewBaseBdev", 00:31:05.684 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:31:05.684 "is_configured": true, 00:31:05.684 "data_offset": 0, 00:31:05.684 "data_size": 65536 00:31:05.684 }, 00:31:05.684 { 00:31:05.684 "name": "BaseBdev2", 00:31:05.684 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:31:05.684 "is_configured": true, 00:31:05.684 "data_offset": 0, 00:31:05.684 "data_size": 65536 00:31:05.684 }, 00:31:05.684 { 00:31:05.684 "name": "BaseBdev3", 00:31:05.684 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:31:05.684 "is_configured": true, 00:31:05.684 "data_offset": 0, 00:31:05.684 "data_size": 65536 00:31:05.684 }, 00:31:05.684 { 00:31:05.684 "name": "BaseBdev4", 00:31:05.684 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:31:05.684 "is_configured": true, 00:31:05.684 "data_offset": 0, 00:31:05.684 "data_size": 65536 00:31:05.684 } 00:31:05.684 ] 00:31:05.684 }' 00:31:05.684 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:05.684 12:33:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:31:06.251 12:33:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:06.510 [2024-06-07 12:33:30.114926] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:06.510 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:06.510 "name": "Existed_Raid", 00:31:06.510 "aliases": [ 00:31:06.510 "5620e6e4-8366-4dcd-b0a2-9cd129210395" 00:31:06.510 ], 00:31:06.510 "product_name": "Raid Volume", 00:31:06.510 "block_size": 512, 00:31:06.510 "num_blocks": 262144, 00:31:06.510 "uuid": "5620e6e4-8366-4dcd-b0a2-9cd129210395", 00:31:06.510 "assigned_rate_limits": { 00:31:06.510 "rw_ios_per_sec": 0, 00:31:06.510 "rw_mbytes_per_sec": 0, 00:31:06.510 "r_mbytes_per_sec": 0, 00:31:06.510 "w_mbytes_per_sec": 0 00:31:06.510 }, 00:31:06.510 "claimed": false, 00:31:06.510 "zoned": false, 00:31:06.510 "supported_io_types": { 00:31:06.510 "read": true, 00:31:06.510 "write": true, 00:31:06.510 "unmap": true, 00:31:06.510 "write_zeroes": true, 00:31:06.510 "flush": 
true, 00:31:06.510 "reset": true, 00:31:06.510 "compare": false, 00:31:06.510 "compare_and_write": false, 00:31:06.510 "abort": false, 00:31:06.510 "nvme_admin": false, 00:31:06.510 "nvme_io": false 00:31:06.510 }, 00:31:06.510 "memory_domains": [ 00:31:06.510 { 00:31:06.510 "dma_device_id": "system", 00:31:06.510 "dma_device_type": 1 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:06.510 "dma_device_type": 2 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "dma_device_id": "system", 00:31:06.510 "dma_device_type": 1 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:06.510 "dma_device_type": 2 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "dma_device_id": "system", 00:31:06.510 "dma_device_type": 1 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:06.510 "dma_device_type": 2 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "dma_device_id": "system", 00:31:06.510 "dma_device_type": 1 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:06.510 "dma_device_type": 2 00:31:06.510 } 00:31:06.510 ], 00:31:06.510 "driver_specific": { 00:31:06.510 "raid": { 00:31:06.510 "uuid": "5620e6e4-8366-4dcd-b0a2-9cd129210395", 00:31:06.510 "strip_size_kb": 64, 00:31:06.510 "state": "online", 00:31:06.510 "raid_level": "concat", 00:31:06.510 "superblock": false, 00:31:06.510 "num_base_bdevs": 4, 00:31:06.510 "num_base_bdevs_discovered": 4, 00:31:06.510 "num_base_bdevs_operational": 4, 00:31:06.510 "base_bdevs_list": [ 00:31:06.510 { 00:31:06.510 "name": "NewBaseBdev", 00:31:06.510 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:31:06.510 "is_configured": true, 00:31:06.510 "data_offset": 0, 00:31:06.510 "data_size": 65536 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "name": "BaseBdev2", 00:31:06.510 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:31:06.510 "is_configured": true, 00:31:06.510 "data_offset": 0, 00:31:06.510 "data_size": 65536 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "name": "BaseBdev3", 00:31:06.510 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:31:06.510 "is_configured": true, 00:31:06.510 "data_offset": 0, 00:31:06.510 "data_size": 65536 00:31:06.510 }, 00:31:06.510 { 00:31:06.510 "name": "BaseBdev4", 00:31:06.510 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:31:06.510 "is_configured": true, 00:31:06.510 "data_offset": 0, 00:31:06.511 "data_size": 65536 00:31:06.511 } 00:31:06.511 ] 00:31:06.511 } 00:31:06.511 } 00:31:06.511 }' 00:31:06.511 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:06.769 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:31:06.769 BaseBdev2 00:31:06.769 BaseBdev3 00:31:06.769 BaseBdev4' 00:31:06.769 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:06.769 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:31:06.769 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:07.028 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:07.028 "name": "NewBaseBdev", 00:31:07.028 "aliases": [ 00:31:07.028 "e02ce86d-f030-43b6-9dbd-57f9fc952e8d" 00:31:07.028 ], 00:31:07.028 
"product_name": "Malloc disk", 00:31:07.028 "block_size": 512, 00:31:07.028 "num_blocks": 65536, 00:31:07.028 "uuid": "e02ce86d-f030-43b6-9dbd-57f9fc952e8d", 00:31:07.028 "assigned_rate_limits": { 00:31:07.028 "rw_ios_per_sec": 0, 00:31:07.028 "rw_mbytes_per_sec": 0, 00:31:07.028 "r_mbytes_per_sec": 0, 00:31:07.028 "w_mbytes_per_sec": 0 00:31:07.028 }, 00:31:07.028 "claimed": true, 00:31:07.028 "claim_type": "exclusive_write", 00:31:07.028 "zoned": false, 00:31:07.028 "supported_io_types": { 00:31:07.028 "read": true, 00:31:07.028 "write": true, 00:31:07.028 "unmap": true, 00:31:07.028 "write_zeroes": true, 00:31:07.028 "flush": true, 00:31:07.028 "reset": true, 00:31:07.028 "compare": false, 00:31:07.028 "compare_and_write": false, 00:31:07.028 "abort": true, 00:31:07.028 "nvme_admin": false, 00:31:07.028 "nvme_io": false 00:31:07.028 }, 00:31:07.028 "memory_domains": [ 00:31:07.028 { 00:31:07.028 "dma_device_id": "system", 00:31:07.028 "dma_device_type": 1 00:31:07.028 }, 00:31:07.028 { 00:31:07.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:07.028 "dma_device_type": 2 00:31:07.028 } 00:31:07.028 ], 00:31:07.028 "driver_specific": {} 00:31:07.028 }' 00:31:07.028 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:07.028 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:07.028 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:07.028 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:07.028 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:07.287 12:33:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:07.855 "name": "BaseBdev2", 00:31:07.855 "aliases": [ 00:31:07.855 "05621067-10c6-4b64-9ab2-bd43cd4ca223" 00:31:07.855 ], 00:31:07.855 "product_name": "Malloc disk", 00:31:07.855 "block_size": 512, 00:31:07.855 "num_blocks": 65536, 00:31:07.855 "uuid": "05621067-10c6-4b64-9ab2-bd43cd4ca223", 00:31:07.855 "assigned_rate_limits": { 00:31:07.855 "rw_ios_per_sec": 0, 00:31:07.855 "rw_mbytes_per_sec": 0, 00:31:07.855 "r_mbytes_per_sec": 0, 00:31:07.855 "w_mbytes_per_sec": 0 00:31:07.855 }, 00:31:07.855 "claimed": true, 00:31:07.855 "claim_type": "exclusive_write", 00:31:07.855 "zoned": false, 00:31:07.855 "supported_io_types": { 
00:31:07.855 "read": true, 00:31:07.855 "write": true, 00:31:07.855 "unmap": true, 00:31:07.855 "write_zeroes": true, 00:31:07.855 "flush": true, 00:31:07.855 "reset": true, 00:31:07.855 "compare": false, 00:31:07.855 "compare_and_write": false, 00:31:07.855 "abort": true, 00:31:07.855 "nvme_admin": false, 00:31:07.855 "nvme_io": false 00:31:07.855 }, 00:31:07.855 "memory_domains": [ 00:31:07.855 { 00:31:07.855 "dma_device_id": "system", 00:31:07.855 "dma_device_type": 1 00:31:07.855 }, 00:31:07.855 { 00:31:07.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:07.855 "dma_device_type": 2 00:31:07.855 } 00:31:07.855 ], 00:31:07.855 "driver_specific": {} 00:31:07.855 }' 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:07.855 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:08.114 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:08.114 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:08.114 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:08.114 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:31:08.114 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:08.372 "name": "BaseBdev3", 00:31:08.372 "aliases": [ 00:31:08.372 "0efe41d6-94f9-48d0-bfcd-a378881d380f" 00:31:08.372 ], 00:31:08.372 "product_name": "Malloc disk", 00:31:08.372 "block_size": 512, 00:31:08.372 "num_blocks": 65536, 00:31:08.372 "uuid": "0efe41d6-94f9-48d0-bfcd-a378881d380f", 00:31:08.372 "assigned_rate_limits": { 00:31:08.372 "rw_ios_per_sec": 0, 00:31:08.372 "rw_mbytes_per_sec": 0, 00:31:08.372 "r_mbytes_per_sec": 0, 00:31:08.372 "w_mbytes_per_sec": 0 00:31:08.372 }, 00:31:08.372 "claimed": true, 00:31:08.372 "claim_type": "exclusive_write", 00:31:08.372 "zoned": false, 00:31:08.372 "supported_io_types": { 00:31:08.372 "read": true, 00:31:08.372 "write": true, 00:31:08.372 "unmap": true, 00:31:08.372 "write_zeroes": true, 00:31:08.372 "flush": true, 00:31:08.372 "reset": true, 00:31:08.372 "compare": false, 00:31:08.372 "compare_and_write": false, 00:31:08.372 "abort": true, 00:31:08.372 "nvme_admin": false, 00:31:08.372 "nvme_io": false 00:31:08.372 }, 00:31:08.372 "memory_domains": [ 00:31:08.372 { 00:31:08.372 "dma_device_id": "system", 00:31:08.372 "dma_device_type": 1 00:31:08.372 }, 
00:31:08.372 { 00:31:08.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:08.372 "dma_device_type": 2 00:31:08.372 } 00:31:08.372 ], 00:31:08.372 "driver_specific": {} 00:31:08.372 }' 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:08.372 12:33:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:08.372 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:08.372 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:08.630 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:08.630 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:08.630 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:08.630 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:31:08.630 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:08.888 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:08.888 "name": "BaseBdev4", 00:31:08.888 "aliases": [ 00:31:08.888 "21328411-181a-4371-a14c-3113dba33ca0" 00:31:08.888 ], 00:31:08.888 "product_name": "Malloc disk", 00:31:08.888 "block_size": 512, 00:31:08.888 "num_blocks": 65536, 00:31:08.888 "uuid": "21328411-181a-4371-a14c-3113dba33ca0", 00:31:08.888 "assigned_rate_limits": { 00:31:08.888 "rw_ios_per_sec": 0, 00:31:08.888 "rw_mbytes_per_sec": 0, 00:31:08.888 "r_mbytes_per_sec": 0, 00:31:08.888 "w_mbytes_per_sec": 0 00:31:08.888 }, 00:31:08.888 "claimed": true, 00:31:08.888 "claim_type": "exclusive_write", 00:31:08.888 "zoned": false, 00:31:08.888 "supported_io_types": { 00:31:08.888 "read": true, 00:31:08.888 "write": true, 00:31:08.888 "unmap": true, 00:31:08.888 "write_zeroes": true, 00:31:08.888 "flush": true, 00:31:08.888 "reset": true, 00:31:08.888 "compare": false, 00:31:08.888 "compare_and_write": false, 00:31:08.888 "abort": true, 00:31:08.888 "nvme_admin": false, 00:31:08.888 "nvme_io": false 00:31:08.888 }, 00:31:08.888 "memory_domains": [ 00:31:08.888 { 00:31:08.888 "dma_device_id": "system", 00:31:08.888 "dma_device_type": 1 00:31:08.888 }, 00:31:08.888 { 00:31:08.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:08.888 "dma_device_type": 2 00:31:08.888 } 00:31:08.888 ], 00:31:08.888 "driver_specific": {} 00:31:08.888 }' 00:31:08.888 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:08.888 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:08.888 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 
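The verify_raid_bdev_properties block above repeats one comparison per configured base bdev: the base bdev's .block_size must match the raid volume's (512 == 512 in every trace), and .md_size, .md_interleave and .dif_type must be null on both sides, i.e. no separate metadata or DIF formatting anywhere in the array. A bash sketch of one round of that loop, assuming the suite's set -e semantics so a failed test aborts the run:

  raid=$(/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_get_bdevs -b Existed_Raid | jq '.[]')
  base=$(/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
      bdev_get_bdevs -b BaseBdev4 | jq '.[]')
  for field in .block_size .md_size .md_interleave .dif_type; do
      # nonzero status on mismatch; under set -e this fails the test
      [[ $(jq "$field" <<< "$raid") == $(jq "$field" <<< "$base") ]]
  done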
00:31:08.888 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:08.888 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:08.888 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:09.146 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:09.146 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:09.146 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:09.146 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:09.146 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:09.146 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:09.146 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:09.405 [2024-06-07 12:33:32.887079] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:09.405 [2024-06-07 12:33:32.887134] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:09.405 [2024-06-07 12:33:32.887203] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:09.405 [2024-06-07 12:33:32.887278] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:09.405 [2024-06-07 12:33:32.887291] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name Existed_Raid, state offline 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 214689 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 214689 ']' 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 214689 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 214689 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 214689' 00:31:09.405 killing process with pid 214689 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 214689 00:31:09.405 [2024-06-07 12:33:32.936372] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:09.405 12:33:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 214689 00:31:09.405 [2024-06-07 12:33:33.014051] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:31:09.971 00:31:09.971 real 0m35.201s 00:31:09.971 user 1m4.550s 00:31:09.971 sys 0m5.781s 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:31:09.971 ************************************ 00:31:09.971 END TEST raid_state_function_test 00:31:09.971 ************************************ 00:31:09.971 12:33:33 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:31:09.971 12:33:33 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:31:09.971 12:33:33 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:09.971 12:33:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:09.971 ************************************ 00:31:09.971 START TEST raid_state_function_test_sb 00:31:09.971 ************************************ 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 true 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev4 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:31:09.971 12:33:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=215802 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 215802' 00:31:09.971 Process raid pid: 215802 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 215802 /var/tmp/spdk-raid.sock 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 215802 ']' 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:09.971 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:09.971 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:09.971 [2024-06-07 12:33:33.488172] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
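The run above boots a bare bdev_svc application on a private RPC socket and then asks it to assemble a four-member concat array with an on-disk superblock. A minimal sketch of that RPC flow, assuming an SPDK checkout at $SPDK_DIR and run outside the autotest harness (the variable names and the readiness loop are illustrative stand-ins for the harness's own helpers):

    # Start the bare bdev service the test drives; -r sets the RPC socket,
    # -L bdev_raid enables the DEBUG log flag seen throughout this output.
    SPDK_DIR=/home/vagrant/spdk_repo/spdk
    SOCK=/var/tmp/spdk-raid.sock
    "$SPDK_DIR"/test/app/bdev_svc/bdev_svc -r "$SOCK" -i 0 -L bdev_raid &

    # Poll until the socket answers instead of racing the startup.
    until "$SPDK_DIR"/scripts/rpc.py -s "$SOCK" rpc_get_methods > /dev/null 2>&1; do
        sleep 0.5
    done

    # Create a concat raid with a 64 KiB strip size (-z 64) and a superblock
    # (-s) over four base bdevs that do not exist yet; the array sits in the
    # "configuring" state until all of them are discovered.
    "$SPDK_DIR"/scripts/rpc.py -s "$SOCK" bdev_raid_create -z 64 -s -r concat \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid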
00:31:09.971 [2024-06-07 12:33:33.488869] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:10.230 [2024-06-07 12:33:33.635843] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:10.230 [2024-06-07 12:33:33.729432] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:10.230 [2024-06-07 12:33:33.818724] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:10.488 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:10.488 12:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:31:10.488 12:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:31:10.488 [2024-06-07 12:33:34.094609] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:10.488 [2024-06-07 12:33:34.094901] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:10.488 [2024-06-07 12:33:34.094917] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:10.488 [2024-06-07 12:33:34.094977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:10.488 [2024-06-07 12:33:34.094986] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:10.488 [2024-06-07 12:33:34.095070] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:10.488 [2024-06-07 12:33:34.095080] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:31:10.488 [2024-06-07 12:33:34.095157] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:10.488 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.054 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:11.054 "name": "Existed_Raid", 00:31:11.054 "uuid": "867ce009-81d9-4798-9142-36f25aa2ea20", 00:31:11.054 "strip_size_kb": 64, 00:31:11.054 "state": "configuring", 00:31:11.054 "raid_level": "concat", 00:31:11.054 "superblock": true, 00:31:11.054 "num_base_bdevs": 4, 00:31:11.054 "num_base_bdevs_discovered": 0, 00:31:11.054 "num_base_bdevs_operational": 4, 00:31:11.054 "base_bdevs_list": [ 00:31:11.054 { 00:31:11.054 "name": "BaseBdev1", 00:31:11.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.054 "is_configured": false, 00:31:11.054 "data_offset": 0, 00:31:11.054 "data_size": 0 00:31:11.054 }, 00:31:11.054 { 00:31:11.054 "name": "BaseBdev2", 00:31:11.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.054 "is_configured": false, 00:31:11.054 "data_offset": 0, 00:31:11.054 "data_size": 0 00:31:11.054 }, 00:31:11.054 { 00:31:11.054 "name": "BaseBdev3", 00:31:11.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.054 "is_configured": false, 00:31:11.054 "data_offset": 0, 00:31:11.054 "data_size": 0 00:31:11.054 }, 00:31:11.054 { 00:31:11.054 "name": "BaseBdev4", 00:31:11.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.054 "is_configured": false, 00:31:11.054 "data_offset": 0, 00:31:11.054 "data_size": 0 00:31:11.054 } 00:31:11.054 ] 00:31:11.054 }' 00:31:11.054 12:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:11.054 12:33:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:11.619 12:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:11.619 [2024-06-07 12:33:35.194613] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:11.619 [2024-06-07 12:33:35.194662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:31:11.619 12:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:31:11.876 [2024-06-07 12:33:35.390682] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:11.876 [2024-06-07 12:33:35.391152] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:11.876 [2024-06-07 12:33:35.391183] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:11.876 [2024-06-07 12:33:35.391284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:11.876 [2024-06-07 12:33:35.391297] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:11.876 [2024-06-07 12:33:35.391361] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:11.876 [2024-06-07 12:33:35.391370] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:31:11.876 [2024-06-07 12:33:35.391434] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:31:11.876 12:33:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:31:12.134 [2024-06-07 12:33:35.682687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:12.134 BaseBdev1 00:31:12.134 12:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:31:12.134 12:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:31:12.134 12:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:12.134 12:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:12.134 12:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:12.134 12:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:12.134 12:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:12.391 12:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:31:12.649 [ 00:31:12.649 { 00:31:12.649 "name": "BaseBdev1", 00:31:12.649 "aliases": [ 00:31:12.649 "79d38e1d-0dcc-45c9-99c6-2fef3335119f" 00:31:12.649 ], 00:31:12.649 "product_name": "Malloc disk", 00:31:12.649 "block_size": 512, 00:31:12.649 "num_blocks": 65536, 00:31:12.649 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:12.649 "assigned_rate_limits": { 00:31:12.649 "rw_ios_per_sec": 0, 00:31:12.649 "rw_mbytes_per_sec": 0, 00:31:12.649 "r_mbytes_per_sec": 0, 00:31:12.649 "w_mbytes_per_sec": 0 00:31:12.649 }, 00:31:12.649 "claimed": true, 00:31:12.649 "claim_type": "exclusive_write", 00:31:12.649 "zoned": false, 00:31:12.650 "supported_io_types": { 00:31:12.650 "read": true, 00:31:12.650 "write": true, 00:31:12.650 "unmap": true, 00:31:12.650 "write_zeroes": true, 00:31:12.650 "flush": true, 00:31:12.650 "reset": true, 00:31:12.650 "compare": false, 00:31:12.650 "compare_and_write": false, 00:31:12.650 "abort": true, 00:31:12.650 "nvme_admin": false, 00:31:12.650 "nvme_io": false 00:31:12.650 }, 00:31:12.650 "memory_domains": [ 00:31:12.650 { 00:31:12.650 "dma_device_id": "system", 00:31:12.650 "dma_device_type": 1 00:31:12.650 }, 00:31:12.650 { 00:31:12.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:12.650 "dma_device_type": 2 00:31:12.650 } 00:31:12.650 ], 00:31:12.650 "driver_specific": {} 00:31:12.650 } 00:31:12.650 ] 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:12.650 12:33:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:12.650 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:12.908 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:12.908 "name": "Existed_Raid", 00:31:12.908 "uuid": "e20f42cc-8b89-41c4-8a7f-630c45233446", 00:31:12.908 "strip_size_kb": 64, 00:31:12.908 "state": "configuring", 00:31:12.908 "raid_level": "concat", 00:31:12.908 "superblock": true, 00:31:12.908 "num_base_bdevs": 4, 00:31:12.908 "num_base_bdevs_discovered": 1, 00:31:12.908 "num_base_bdevs_operational": 4, 00:31:12.908 "base_bdevs_list": [ 00:31:12.908 { 00:31:12.908 "name": "BaseBdev1", 00:31:12.908 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:12.909 "is_configured": true, 00:31:12.909 "data_offset": 2048, 00:31:12.909 "data_size": 63488 00:31:12.909 }, 00:31:12.909 { 00:31:12.909 "name": "BaseBdev2", 00:31:12.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:12.909 "is_configured": false, 00:31:12.909 "data_offset": 0, 00:31:12.909 "data_size": 0 00:31:12.909 }, 00:31:12.909 { 00:31:12.909 "name": "BaseBdev3", 00:31:12.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:12.909 "is_configured": false, 00:31:12.909 "data_offset": 0, 00:31:12.909 "data_size": 0 00:31:12.909 }, 00:31:12.909 { 00:31:12.909 "name": "BaseBdev4", 00:31:12.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:12.909 "is_configured": false, 00:31:12.909 "data_offset": 0, 00:31:12.909 "data_size": 0 00:31:12.909 } 00:31:12.909 ] 00:31:12.909 }' 00:31:12.909 12:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:12.909 12:33:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:13.476 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:13.735 [2024-06-07 12:33:37.302949] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:13.735 [2024-06-07 12:33:37.303023] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:31:13.735 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:31:13.994 [2024-06-07 12:33:37.503049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:13.994 [2024-06-07 12:33:37.505077] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:13.994 [2024-06-07 12:33:37.505614] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:13.994 [2024-06-07 12:33:37.505643] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:31:13.994 [2024-06-07 12:33:37.505728] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:31:13.994 [2024-06-07 12:33:37.505740] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:31:13.994 [2024-06-07 12:33:37.505798] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:13.994 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:14.253 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:14.253 "name": "Existed_Raid", 00:31:14.253 "uuid": "7f231699-8846-4f0d-949d-b2c39255daec", 00:31:14.253 "strip_size_kb": 64, 00:31:14.253 "state": "configuring", 00:31:14.253 "raid_level": "concat", 00:31:14.253 "superblock": true, 00:31:14.253 "num_base_bdevs": 4, 00:31:14.253 "num_base_bdevs_discovered": 1, 00:31:14.253 "num_base_bdevs_operational": 4, 00:31:14.253 "base_bdevs_list": [ 00:31:14.253 { 00:31:14.253 "name": "BaseBdev1", 00:31:14.253 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:14.253 "is_configured": true, 00:31:14.253 "data_offset": 2048, 00:31:14.253 "data_size": 63488 00:31:14.253 }, 00:31:14.253 { 00:31:14.253 "name": "BaseBdev2", 00:31:14.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:14.253 "is_configured": false, 00:31:14.253 "data_offset": 0, 00:31:14.253 "data_size": 0 00:31:14.253 }, 00:31:14.253 { 00:31:14.253 "name": "BaseBdev3", 00:31:14.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:14.253 "is_configured": false, 00:31:14.253 "data_offset": 0, 00:31:14.253 "data_size": 0 00:31:14.253 }, 00:31:14.253 { 00:31:14.253 "name": "BaseBdev4", 
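The JSON dump above is how the harness tracks progress: after each bdev_malloc_create it re-reads the array and checks that num_base_bdevs_discovered has advanced while the state stays "configuring". A hedged sketch of that check, reusing $SPDK_DIR and $SOCK from the sketch earlier (the function name and assertions paraphrase bdev_raid.sh's verify_raid_bdev_state rather than copy it):

    # Re-read the array and compare state/discovery count against expectations.
    check_raid_state() {
        local expected_state=$1 expected_discovered=$2 info
        info=$("$SPDK_DIR"/scripts/rpc.py -s "$SOCK" bdev_raid_get_bdevs all |
               jq -r '.[] | select(.name == "Existed_Raid")')
        [ "$(jq -r .state <<< "$info")" = "$expected_state" ] &&
            [ "$(jq -r .num_base_bdevs_discovered <<< "$info")" = "$expected_discovered" ]
    }

    # Each malloc base bdev is 32 MiB of 512-byte blocks, i.e. the 65536
    # num_blocks reported by the bdev_get_bdevs dumps in this log.
    "$SPDK_DIR"/scripts/rpc.py -s "$SOCK" bdev_malloc_create 32 512 -b BaseBdev2
    check_raid_state configuring 2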
00:31:14.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:14.253 "is_configured": false, 00:31:14.253 "data_offset": 0, 00:31:14.253 "data_size": 0 00:31:14.253 } 00:31:14.253 ] 00:31:14.253 }' 00:31:14.253 12:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:14.253 12:33:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:14.819 12:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:31:15.078 [2024-06-07 12:33:38.513576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:15.078 BaseBdev2 00:31:15.078 12:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:31:15.078 12:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:31:15.078 12:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:15.078 12:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:15.078 12:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:15.078 12:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:15.078 12:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:15.336 12:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:31:15.595 [ 00:31:15.595 { 00:31:15.595 "name": "BaseBdev2", 00:31:15.595 "aliases": [ 00:31:15.595 "b155760e-f17a-468c-9112-281a8be44873" 00:31:15.595 ], 00:31:15.595 "product_name": "Malloc disk", 00:31:15.595 "block_size": 512, 00:31:15.595 "num_blocks": 65536, 00:31:15.595 "uuid": "b155760e-f17a-468c-9112-281a8be44873", 00:31:15.595 "assigned_rate_limits": { 00:31:15.595 "rw_ios_per_sec": 0, 00:31:15.595 "rw_mbytes_per_sec": 0, 00:31:15.595 "r_mbytes_per_sec": 0, 00:31:15.595 "w_mbytes_per_sec": 0 00:31:15.595 }, 00:31:15.595 "claimed": true, 00:31:15.595 "claim_type": "exclusive_write", 00:31:15.595 "zoned": false, 00:31:15.595 "supported_io_types": { 00:31:15.595 "read": true, 00:31:15.595 "write": true, 00:31:15.595 "unmap": true, 00:31:15.595 "write_zeroes": true, 00:31:15.595 "flush": true, 00:31:15.595 "reset": true, 00:31:15.595 "compare": false, 00:31:15.595 "compare_and_write": false, 00:31:15.595 "abort": true, 00:31:15.595 "nvme_admin": false, 00:31:15.595 "nvme_io": false 00:31:15.595 }, 00:31:15.595 "memory_domains": [ 00:31:15.595 { 00:31:15.595 "dma_device_id": "system", 00:31:15.595 "dma_device_type": 1 00:31:15.595 }, 00:31:15.595 { 00:31:15.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:15.595 "dma_device_type": 2 00:31:15.595 } 00:31:15.595 ], 00:31:15.595 "driver_specific": {} 00:31:15.595 } 00:31:15.595 ] 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:15.595 12:33:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:15.595 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:15.854 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:15.854 "name": "Existed_Raid", 00:31:15.854 "uuid": "7f231699-8846-4f0d-949d-b2c39255daec", 00:31:15.854 "strip_size_kb": 64, 00:31:15.854 "state": "configuring", 00:31:15.854 "raid_level": "concat", 00:31:15.854 "superblock": true, 00:31:15.854 "num_base_bdevs": 4, 00:31:15.854 "num_base_bdevs_discovered": 2, 00:31:15.854 "num_base_bdevs_operational": 4, 00:31:15.854 "base_bdevs_list": [ 00:31:15.854 { 00:31:15.854 "name": "BaseBdev1", 00:31:15.854 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:15.854 "is_configured": true, 00:31:15.854 "data_offset": 2048, 00:31:15.854 "data_size": 63488 00:31:15.854 }, 00:31:15.854 { 00:31:15.854 "name": "BaseBdev2", 00:31:15.854 "uuid": "b155760e-f17a-468c-9112-281a8be44873", 00:31:15.854 "is_configured": true, 00:31:15.854 "data_offset": 2048, 00:31:15.854 "data_size": 63488 00:31:15.854 }, 00:31:15.854 { 00:31:15.854 "name": "BaseBdev3", 00:31:15.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:15.854 "is_configured": false, 00:31:15.854 "data_offset": 0, 00:31:15.854 "data_size": 0 00:31:15.854 }, 00:31:15.854 { 00:31:15.854 "name": "BaseBdev4", 00:31:15.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:15.854 "is_configured": false, 00:31:15.854 "data_offset": 0, 00:31:15.854 "data_size": 0 00:31:15.854 } 00:31:15.854 ] 00:31:15.854 }' 00:31:15.854 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:15.854 12:33:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:16.420 12:33:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:31:16.678 [2024-06-07 12:33:40.131421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:16.678 BaseBdev3 00:31:16.678 12:33:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:31:16.678 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:31:16.678 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:16.678 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:16.678 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:16.678 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:16.678 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:16.937 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:31:16.937 [ 00:31:16.937 { 00:31:16.937 "name": "BaseBdev3", 00:31:16.937 "aliases": [ 00:31:16.937 "0e7a2d47-b931-433c-b3e0-e141d783eb0f" 00:31:16.937 ], 00:31:16.937 "product_name": "Malloc disk", 00:31:16.937 "block_size": 512, 00:31:16.937 "num_blocks": 65536, 00:31:16.937 "uuid": "0e7a2d47-b931-433c-b3e0-e141d783eb0f", 00:31:16.937 "assigned_rate_limits": { 00:31:16.937 "rw_ios_per_sec": 0, 00:31:16.937 "rw_mbytes_per_sec": 0, 00:31:16.937 "r_mbytes_per_sec": 0, 00:31:16.937 "w_mbytes_per_sec": 0 00:31:16.937 }, 00:31:16.937 "claimed": true, 00:31:16.937 "claim_type": "exclusive_write", 00:31:16.937 "zoned": false, 00:31:16.937 "supported_io_types": { 00:31:16.937 "read": true, 00:31:16.937 "write": true, 00:31:16.937 "unmap": true, 00:31:16.937 "write_zeroes": true, 00:31:16.937 "flush": true, 00:31:16.937 "reset": true, 00:31:16.937 "compare": false, 00:31:16.937 "compare_and_write": false, 00:31:16.937 "abort": true, 00:31:16.937 "nvme_admin": false, 00:31:16.937 "nvme_io": false 00:31:16.937 }, 00:31:16.937 "memory_domains": [ 00:31:16.937 { 00:31:16.937 "dma_device_id": "system", 00:31:16.938 "dma_device_type": 1 00:31:16.938 }, 00:31:16.938 { 00:31:16.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:16.938 "dma_device_type": 2 00:31:16.938 } 00:31:16.938 ], 00:31:16.938 "driver_specific": {} 00:31:16.938 } 00:31:16.938 ] 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:17.196 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:17.455 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:17.455 "name": "Existed_Raid", 00:31:17.455 "uuid": "7f231699-8846-4f0d-949d-b2c39255daec", 00:31:17.455 "strip_size_kb": 64, 00:31:17.455 "state": "configuring", 00:31:17.455 "raid_level": "concat", 00:31:17.455 "superblock": true, 00:31:17.455 "num_base_bdevs": 4, 00:31:17.455 "num_base_bdevs_discovered": 3, 00:31:17.455 "num_base_bdevs_operational": 4, 00:31:17.455 "base_bdevs_list": [ 00:31:17.455 { 00:31:17.455 "name": "BaseBdev1", 00:31:17.455 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:17.455 "is_configured": true, 00:31:17.455 "data_offset": 2048, 00:31:17.455 "data_size": 63488 00:31:17.455 }, 00:31:17.455 { 00:31:17.455 "name": "BaseBdev2", 00:31:17.455 "uuid": "b155760e-f17a-468c-9112-281a8be44873", 00:31:17.455 "is_configured": true, 00:31:17.455 "data_offset": 2048, 00:31:17.455 "data_size": 63488 00:31:17.455 }, 00:31:17.455 { 00:31:17.455 "name": "BaseBdev3", 00:31:17.455 "uuid": "0e7a2d47-b931-433c-b3e0-e141d783eb0f", 00:31:17.455 "is_configured": true, 00:31:17.455 "data_offset": 2048, 00:31:17.455 "data_size": 63488 00:31:17.455 }, 00:31:17.455 { 00:31:17.455 "name": "BaseBdev4", 00:31:17.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:17.455 "is_configured": false, 00:31:17.455 "data_offset": 0, 00:31:17.455 "data_size": 0 00:31:17.455 } 00:31:17.455 ] 00:31:17.455 }' 00:31:17.455 12:33:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:17.455 12:33:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:18.022 12:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:31:18.281 [2024-06-07 12:33:41.802283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:31:18.281 [2024-06-07 12:33:41.802744] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:31:18.281 BaseBdev4 00:31:18.281 [2024-06-07 12:33:41.804776] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:31:18.281 [2024-06-07 12:33:41.805093] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:31:18.281 [2024-06-07 12:33:41.805654] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:31:18.281 [2024-06-07 12:33:41.805796] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:31:18.281 [2024-06-07 12:33:41.806052] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:18.281 12:33:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev 
BaseBdev4 00:31:18.281 12:33:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:31:18.281 12:33:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:18.281 12:33:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:18.281 12:33:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:18.281 12:33:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:18.281 12:33:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:18.540 12:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:31:18.798 [ 00:31:18.798 { 00:31:18.798 "name": "BaseBdev4", 00:31:18.798 "aliases": [ 00:31:18.798 "d2fbf501-83ea-4392-8dca-8696478a0d67" 00:31:18.798 ], 00:31:18.798 "product_name": "Malloc disk", 00:31:18.798 "block_size": 512, 00:31:18.798 "num_blocks": 65536, 00:31:18.798 "uuid": "d2fbf501-83ea-4392-8dca-8696478a0d67", 00:31:18.798 "assigned_rate_limits": { 00:31:18.798 "rw_ios_per_sec": 0, 00:31:18.798 "rw_mbytes_per_sec": 0, 00:31:18.798 "r_mbytes_per_sec": 0, 00:31:18.798 "w_mbytes_per_sec": 0 00:31:18.798 }, 00:31:18.798 "claimed": true, 00:31:18.798 "claim_type": "exclusive_write", 00:31:18.798 "zoned": false, 00:31:18.798 "supported_io_types": { 00:31:18.798 "read": true, 00:31:18.798 "write": true, 00:31:18.798 "unmap": true, 00:31:18.798 "write_zeroes": true, 00:31:18.798 "flush": true, 00:31:18.798 "reset": true, 00:31:18.798 "compare": false, 00:31:18.798 "compare_and_write": false, 00:31:18.798 "abort": true, 00:31:18.798 "nvme_admin": false, 00:31:18.798 "nvme_io": false 00:31:18.798 }, 00:31:18.798 "memory_domains": [ 00:31:18.798 { 00:31:18.798 "dma_device_id": "system", 00:31:18.798 "dma_device_type": 1 00:31:18.798 }, 00:31:18.798 { 00:31:18.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:18.798 "dma_device_type": 2 00:31:18.798 } 00:31:18.798 ], 00:31:18.798 "driver_specific": {} 00:31:18.798 } 00:31:18.798 ] 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:18.798 12:33:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:18.798 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:19.055 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:19.055 "name": "Existed_Raid", 00:31:19.055 "uuid": "7f231699-8846-4f0d-949d-b2c39255daec", 00:31:19.055 "strip_size_kb": 64, 00:31:19.055 "state": "online", 00:31:19.055 "raid_level": "concat", 00:31:19.055 "superblock": true, 00:31:19.055 "num_base_bdevs": 4, 00:31:19.055 "num_base_bdevs_discovered": 4, 00:31:19.055 "num_base_bdevs_operational": 4, 00:31:19.055 "base_bdevs_list": [ 00:31:19.055 { 00:31:19.055 "name": "BaseBdev1", 00:31:19.055 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:19.055 "is_configured": true, 00:31:19.055 "data_offset": 2048, 00:31:19.055 "data_size": 63488 00:31:19.055 }, 00:31:19.055 { 00:31:19.055 "name": "BaseBdev2", 00:31:19.056 "uuid": "b155760e-f17a-468c-9112-281a8be44873", 00:31:19.056 "is_configured": true, 00:31:19.056 "data_offset": 2048, 00:31:19.056 "data_size": 63488 00:31:19.056 }, 00:31:19.056 { 00:31:19.056 "name": "BaseBdev3", 00:31:19.056 "uuid": "0e7a2d47-b931-433c-b3e0-e141d783eb0f", 00:31:19.056 "is_configured": true, 00:31:19.056 "data_offset": 2048, 00:31:19.056 "data_size": 63488 00:31:19.056 }, 00:31:19.056 { 00:31:19.056 "name": "BaseBdev4", 00:31:19.056 "uuid": "d2fbf501-83ea-4392-8dca-8696478a0d67", 00:31:19.056 "is_configured": true, 00:31:19.056 "data_offset": 2048, 00:31:19.056 "data_size": 63488 00:31:19.056 } 00:31:19.056 ] 00:31:19.056 }' 00:31:19.056 12:33:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:19.056 12:33:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:31:19.620 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:19.879 [2024-06-07 12:33:43.406853] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:19.879 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:19.879 "name": "Existed_Raid", 00:31:19.879 "aliases": 
[ 00:31:19.879 "7f231699-8846-4f0d-949d-b2c39255daec" 00:31:19.879 ], 00:31:19.879 "product_name": "Raid Volume", 00:31:19.879 "block_size": 512, 00:31:19.879 "num_blocks": 253952, 00:31:19.879 "uuid": "7f231699-8846-4f0d-949d-b2c39255daec", 00:31:19.879 "assigned_rate_limits": { 00:31:19.879 "rw_ios_per_sec": 0, 00:31:19.879 "rw_mbytes_per_sec": 0, 00:31:19.879 "r_mbytes_per_sec": 0, 00:31:19.879 "w_mbytes_per_sec": 0 00:31:19.879 }, 00:31:19.879 "claimed": false, 00:31:19.879 "zoned": false, 00:31:19.879 "supported_io_types": { 00:31:19.879 "read": true, 00:31:19.879 "write": true, 00:31:19.879 "unmap": true, 00:31:19.879 "write_zeroes": true, 00:31:19.879 "flush": true, 00:31:19.879 "reset": true, 00:31:19.879 "compare": false, 00:31:19.879 "compare_and_write": false, 00:31:19.879 "abort": false, 00:31:19.879 "nvme_admin": false, 00:31:19.879 "nvme_io": false 00:31:19.879 }, 00:31:19.879 "memory_domains": [ 00:31:19.879 { 00:31:19.879 "dma_device_id": "system", 00:31:19.879 "dma_device_type": 1 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:19.879 "dma_device_type": 2 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "dma_device_id": "system", 00:31:19.879 "dma_device_type": 1 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:19.879 "dma_device_type": 2 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "dma_device_id": "system", 00:31:19.879 "dma_device_type": 1 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:19.879 "dma_device_type": 2 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "dma_device_id": "system", 00:31:19.879 "dma_device_type": 1 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:19.879 "dma_device_type": 2 00:31:19.879 } 00:31:19.879 ], 00:31:19.879 "driver_specific": { 00:31:19.879 "raid": { 00:31:19.879 "uuid": "7f231699-8846-4f0d-949d-b2c39255daec", 00:31:19.879 "strip_size_kb": 64, 00:31:19.879 "state": "online", 00:31:19.879 "raid_level": "concat", 00:31:19.879 "superblock": true, 00:31:19.879 "num_base_bdevs": 4, 00:31:19.879 "num_base_bdevs_discovered": 4, 00:31:19.879 "num_base_bdevs_operational": 4, 00:31:19.879 "base_bdevs_list": [ 00:31:19.879 { 00:31:19.879 "name": "BaseBdev1", 00:31:19.879 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:19.879 "is_configured": true, 00:31:19.879 "data_offset": 2048, 00:31:19.879 "data_size": 63488 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "name": "BaseBdev2", 00:31:19.879 "uuid": "b155760e-f17a-468c-9112-281a8be44873", 00:31:19.879 "is_configured": true, 00:31:19.879 "data_offset": 2048, 00:31:19.879 "data_size": 63488 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "name": "BaseBdev3", 00:31:19.879 "uuid": "0e7a2d47-b931-433c-b3e0-e141d783eb0f", 00:31:19.879 "is_configured": true, 00:31:19.879 "data_offset": 2048, 00:31:19.879 "data_size": 63488 00:31:19.879 }, 00:31:19.879 { 00:31:19.879 "name": "BaseBdev4", 00:31:19.879 "uuid": "d2fbf501-83ea-4392-8dca-8696478a0d67", 00:31:19.879 "is_configured": true, 00:31:19.879 "data_offset": 2048, 00:31:19.879 "data_size": 63488 00:31:19.879 } 00:31:19.879 ] 00:31:19.879 } 00:31:19.879 } 00:31:19.879 }' 00:31:19.879 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:19.879 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:31:19.879 BaseBdev2 00:31:19.879 
BaseBdev3 00:31:19.879 BaseBdev4' 00:31:19.879 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:19.879 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:19.879 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:31:20.138 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:20.138 "name": "BaseBdev1", 00:31:20.138 "aliases": [ 00:31:20.138 "79d38e1d-0dcc-45c9-99c6-2fef3335119f" 00:31:20.138 ], 00:31:20.138 "product_name": "Malloc disk", 00:31:20.138 "block_size": 512, 00:31:20.138 "num_blocks": 65536, 00:31:20.138 "uuid": "79d38e1d-0dcc-45c9-99c6-2fef3335119f", 00:31:20.138 "assigned_rate_limits": { 00:31:20.138 "rw_ios_per_sec": 0, 00:31:20.138 "rw_mbytes_per_sec": 0, 00:31:20.138 "r_mbytes_per_sec": 0, 00:31:20.138 "w_mbytes_per_sec": 0 00:31:20.138 }, 00:31:20.138 "claimed": true, 00:31:20.138 "claim_type": "exclusive_write", 00:31:20.138 "zoned": false, 00:31:20.138 "supported_io_types": { 00:31:20.138 "read": true, 00:31:20.138 "write": true, 00:31:20.138 "unmap": true, 00:31:20.138 "write_zeroes": true, 00:31:20.138 "flush": true, 00:31:20.138 "reset": true, 00:31:20.138 "compare": false, 00:31:20.138 "compare_and_write": false, 00:31:20.138 "abort": true, 00:31:20.138 "nvme_admin": false, 00:31:20.138 "nvme_io": false 00:31:20.138 }, 00:31:20.138 "memory_domains": [ 00:31:20.138 { 00:31:20.138 "dma_device_id": "system", 00:31:20.138 "dma_device_type": 1 00:31:20.138 }, 00:31:20.138 { 00:31:20.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:20.138 "dma_device_type": 2 00:31:20.138 } 00:31:20.138 ], 00:31:20.138 "driver_specific": {} 00:31:20.138 }' 00:31:20.138 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:20.396 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:20.396 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:20.396 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:20.396 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:20.396 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:20.396 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:20.396 12:33:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:20.396 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:20.396 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:20.654 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:20.654 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:20.654 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:20.654 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:20.654 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:20.913 
12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:20.913 "name": "BaseBdev2", 00:31:20.913 "aliases": [ 00:31:20.913 "b155760e-f17a-468c-9112-281a8be44873" 00:31:20.913 ], 00:31:20.913 "product_name": "Malloc disk", 00:31:20.913 "block_size": 512, 00:31:20.913 "num_blocks": 65536, 00:31:20.913 "uuid": "b155760e-f17a-468c-9112-281a8be44873", 00:31:20.913 "assigned_rate_limits": { 00:31:20.913 "rw_ios_per_sec": 0, 00:31:20.913 "rw_mbytes_per_sec": 0, 00:31:20.913 "r_mbytes_per_sec": 0, 00:31:20.913 "w_mbytes_per_sec": 0 00:31:20.913 }, 00:31:20.913 "claimed": true, 00:31:20.913 "claim_type": "exclusive_write", 00:31:20.913 "zoned": false, 00:31:20.913 "supported_io_types": { 00:31:20.913 "read": true, 00:31:20.913 "write": true, 00:31:20.913 "unmap": true, 00:31:20.913 "write_zeroes": true, 00:31:20.913 "flush": true, 00:31:20.913 "reset": true, 00:31:20.913 "compare": false, 00:31:20.913 "compare_and_write": false, 00:31:20.913 "abort": true, 00:31:20.913 "nvme_admin": false, 00:31:20.913 "nvme_io": false 00:31:20.913 }, 00:31:20.913 "memory_domains": [ 00:31:20.913 { 00:31:20.913 "dma_device_id": "system", 00:31:20.913 "dma_device_type": 1 00:31:20.913 }, 00:31:20.913 { 00:31:20.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:20.913 "dma_device_type": 2 00:31:20.913 } 00:31:20.913 ], 00:31:20.913 "driver_specific": {} 00:31:20.913 }' 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:20.913 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:21.171 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:21.171 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:21.171 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:21.171 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:21.171 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:21.171 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:31:21.171 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:21.430 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:21.430 "name": "BaseBdev3", 00:31:21.430 "aliases": [ 00:31:21.430 "0e7a2d47-b931-433c-b3e0-e141d783eb0f" 00:31:21.430 ], 00:31:21.430 "product_name": "Malloc disk", 00:31:21.430 "block_size": 512, 00:31:21.430 "num_blocks": 65536, 00:31:21.430 "uuid": "0e7a2d47-b931-433c-b3e0-e141d783eb0f", 00:31:21.430 "assigned_rate_limits": { 00:31:21.430 
"rw_ios_per_sec": 0, 00:31:21.430 "rw_mbytes_per_sec": 0, 00:31:21.430 "r_mbytes_per_sec": 0, 00:31:21.430 "w_mbytes_per_sec": 0 00:31:21.430 }, 00:31:21.430 "claimed": true, 00:31:21.430 "claim_type": "exclusive_write", 00:31:21.430 "zoned": false, 00:31:21.430 "supported_io_types": { 00:31:21.430 "read": true, 00:31:21.430 "write": true, 00:31:21.430 "unmap": true, 00:31:21.430 "write_zeroes": true, 00:31:21.430 "flush": true, 00:31:21.430 "reset": true, 00:31:21.430 "compare": false, 00:31:21.430 "compare_and_write": false, 00:31:21.430 "abort": true, 00:31:21.430 "nvme_admin": false, 00:31:21.430 "nvme_io": false 00:31:21.430 }, 00:31:21.430 "memory_domains": [ 00:31:21.430 { 00:31:21.430 "dma_device_id": "system", 00:31:21.430 "dma_device_type": 1 00:31:21.430 }, 00:31:21.430 { 00:31:21.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:21.430 "dma_device_type": 2 00:31:21.430 } 00:31:21.430 ], 00:31:21.430 "driver_specific": {} 00:31:21.430 }' 00:31:21.430 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:21.430 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:21.430 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:21.430 12:33:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:21.430 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:21.430 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:21.430 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:21.688 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:31:21.945 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:21.945 "name": "BaseBdev4", 00:31:21.945 "aliases": [ 00:31:21.945 "d2fbf501-83ea-4392-8dca-8696478a0d67" 00:31:21.945 ], 00:31:21.945 "product_name": "Malloc disk", 00:31:21.945 "block_size": 512, 00:31:21.945 "num_blocks": 65536, 00:31:21.945 "uuid": "d2fbf501-83ea-4392-8dca-8696478a0d67", 00:31:21.945 "assigned_rate_limits": { 00:31:21.945 "rw_ios_per_sec": 0, 00:31:21.945 "rw_mbytes_per_sec": 0, 00:31:21.945 "r_mbytes_per_sec": 0, 00:31:21.945 "w_mbytes_per_sec": 0 00:31:21.945 }, 00:31:21.945 "claimed": true, 00:31:21.945 "claim_type": "exclusive_write", 00:31:21.945 "zoned": false, 00:31:21.945 "supported_io_types": { 00:31:21.945 "read": true, 00:31:21.945 "write": true, 00:31:21.945 "unmap": true, 00:31:21.945 "write_zeroes": true, 00:31:21.945 "flush": true, 00:31:21.945 
"reset": true, 00:31:21.945 "compare": false, 00:31:21.945 "compare_and_write": false, 00:31:21.945 "abort": true, 00:31:21.945 "nvme_admin": false, 00:31:21.945 "nvme_io": false 00:31:21.945 }, 00:31:21.945 "memory_domains": [ 00:31:21.945 { 00:31:21.945 "dma_device_id": "system", 00:31:21.945 "dma_device_type": 1 00:31:21.945 }, 00:31:21.945 { 00:31:21.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:21.945 "dma_device_type": 2 00:31:21.945 } 00:31:21.945 ], 00:31:21.945 "driver_specific": {} 00:31:21.945 }' 00:31:21.945 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:21.945 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:22.201 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:22.458 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:22.458 12:33:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:31:22.458 [2024-06-07 12:33:46.055043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:22.458 [2024-06-07 12:33:46.056400] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:22.458 [2024-06-07 12:33:46.056620] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:22.458 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:22.716 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:22.716 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:22.973 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:22.973 "name": "Existed_Raid", 00:31:22.973 "uuid": "7f231699-8846-4f0d-949d-b2c39255daec", 00:31:22.973 "strip_size_kb": 64, 00:31:22.973 "state": "offline", 00:31:22.973 "raid_level": "concat", 00:31:22.973 "superblock": true, 00:31:22.973 "num_base_bdevs": 4, 00:31:22.973 "num_base_bdevs_discovered": 3, 00:31:22.973 "num_base_bdevs_operational": 3, 00:31:22.973 "base_bdevs_list": [ 00:31:22.973 { 00:31:22.973 "name": null, 00:31:22.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:22.973 "is_configured": false, 00:31:22.973 "data_offset": 2048, 00:31:22.973 "data_size": 63488 00:31:22.973 }, 00:31:22.973 { 00:31:22.973 "name": "BaseBdev2", 00:31:22.973 "uuid": "b155760e-f17a-468c-9112-281a8be44873", 00:31:22.973 "is_configured": true, 00:31:22.973 "data_offset": 2048, 00:31:22.973 "data_size": 63488 00:31:22.973 }, 00:31:22.973 { 00:31:22.973 "name": "BaseBdev3", 00:31:22.973 "uuid": "0e7a2d47-b931-433c-b3e0-e141d783eb0f", 00:31:22.973 "is_configured": true, 00:31:22.973 "data_offset": 2048, 00:31:22.973 "data_size": 63488 00:31:22.973 }, 00:31:22.973 { 00:31:22.973 "name": "BaseBdev4", 00:31:22.973 "uuid": "d2fbf501-83ea-4392-8dca-8696478a0d67", 00:31:22.973 "is_configured": true, 00:31:22.973 "data_offset": 2048, 00:31:22.973 "data_size": 63488 00:31:22.973 } 00:31:22.973 ] 00:31:22.973 }' 00:31:22.973 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:22.973 12:33:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:23.539 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:31:23.539 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:23.539 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:31:23.539 12:33:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:23.539 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:23.539 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:23.539 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:31:23.798 [2024-06-07 12:33:47.329751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:23.798 12:33:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:23.798 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:23.798 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:23.798 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:31:24.055 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:24.055 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:24.055 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:31:24.313 [2024-06-07 12:33:47.818813] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:31:24.313 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:24.313 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:24.313 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:24.313 12:33:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:31:24.599 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:24.599 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:24.599 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:31:24.856 [2024-06-07 12:33:48.388664] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:31:24.856 [2024-06-07 12:33:48.388953] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:31:24.856 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:24.856 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:24.856 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:24.856 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:31:25.115 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:31:25.115 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:31:25.115 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:31:25.115 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:31:25.115 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:25.115 12:33:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:31:25.373 BaseBdev2 00:31:25.373 12:33:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:31:25.373 12:33:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:31:25.373 12:33:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:25.373 12:33:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:25.373 12:33:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:25.373 12:33:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:25.373 12:33:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:25.630 12:33:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:31:25.888 [ 00:31:25.888 { 00:31:25.888 "name": "BaseBdev2", 00:31:25.888 "aliases": [ 00:31:25.888 "b48ddbb2-9186-4c67-96e5-6870d5a74a57" 00:31:25.888 ], 00:31:25.888 "product_name": "Malloc disk", 00:31:25.888 "block_size": 512, 00:31:25.888 "num_blocks": 65536, 00:31:25.888 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:25.888 "assigned_rate_limits": { 00:31:25.888 "rw_ios_per_sec": 0, 00:31:25.888 "rw_mbytes_per_sec": 0, 00:31:25.888 "r_mbytes_per_sec": 0, 00:31:25.888 "w_mbytes_per_sec": 0 00:31:25.888 }, 00:31:25.888 "claimed": false, 00:31:25.888 "zoned": false, 00:31:25.888 "supported_io_types": { 00:31:25.888 "read": true, 00:31:25.888 "write": true, 00:31:25.888 "unmap": true, 00:31:25.888 "write_zeroes": true, 00:31:25.888 "flush": true, 00:31:25.888 "reset": true, 00:31:25.888 "compare": false, 00:31:25.888 "compare_and_write": false, 00:31:25.888 "abort": true, 00:31:25.888 "nvme_admin": false, 00:31:25.888 "nvme_io": false 00:31:25.888 }, 00:31:25.888 "memory_domains": [ 00:31:25.888 { 00:31:25.888 "dma_device_id": "system", 00:31:25.888 "dma_device_type": 1 00:31:25.888 }, 00:31:25.888 { 00:31:25.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:25.888 "dma_device_type": 2 00:31:25.888 } 00:31:25.888 ], 00:31:25.888 "driver_specific": {} 00:31:25.888 } 00:31:25.888 ] 00:31:25.888 12:33:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:25.888 12:33:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:31:25.888 12:33:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:25.888 12:33:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:31:26.454 BaseBdev3 00:31:26.454 12:33:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:31:26.454 12:33:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:31:26.454 12:33:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:26.454 12:33:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:26.454 12:33:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:26.454 12:33:49 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:26.454 12:33:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:26.454 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:31:26.711 [ 00:31:26.711 { 00:31:26.711 "name": "BaseBdev3", 00:31:26.711 "aliases": [ 00:31:26.711 "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1" 00:31:26.711 ], 00:31:26.711 "product_name": "Malloc disk", 00:31:26.711 "block_size": 512, 00:31:26.711 "num_blocks": 65536, 00:31:26.711 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:26.711 "assigned_rate_limits": { 00:31:26.711 "rw_ios_per_sec": 0, 00:31:26.711 "rw_mbytes_per_sec": 0, 00:31:26.711 "r_mbytes_per_sec": 0, 00:31:26.711 "w_mbytes_per_sec": 0 00:31:26.711 }, 00:31:26.711 "claimed": false, 00:31:26.711 "zoned": false, 00:31:26.711 "supported_io_types": { 00:31:26.711 "read": true, 00:31:26.711 "write": true, 00:31:26.711 "unmap": true, 00:31:26.711 "write_zeroes": true, 00:31:26.711 "flush": true, 00:31:26.711 "reset": true, 00:31:26.711 "compare": false, 00:31:26.711 "compare_and_write": false, 00:31:26.711 "abort": true, 00:31:26.711 "nvme_admin": false, 00:31:26.711 "nvme_io": false 00:31:26.711 }, 00:31:26.711 "memory_domains": [ 00:31:26.711 { 00:31:26.711 "dma_device_id": "system", 00:31:26.711 "dma_device_type": 1 00:31:26.711 }, 00:31:26.711 { 00:31:26.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:26.711 "dma_device_type": 2 00:31:26.711 } 00:31:26.711 ], 00:31:26.711 "driver_specific": {} 00:31:26.711 } 00:31:26.711 ] 00:31:26.711 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:26.711 12:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:31:26.711 12:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:26.712 12:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:31:26.979 BaseBdev4 00:31:26.979 12:33:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:31:26.979 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:31:26.979 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:26.979 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:26.979 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:26.979 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:26.979 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:27.251 12:33:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:31:27.509 [ 00:31:27.509 { 00:31:27.509 "name": "BaseBdev4", 00:31:27.509 "aliases": [ 00:31:27.509 "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f" 00:31:27.509 ], 
00:31:27.509 "product_name": "Malloc disk", 00:31:27.509 "block_size": 512, 00:31:27.509 "num_blocks": 65536, 00:31:27.509 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:27.509 "assigned_rate_limits": { 00:31:27.509 "rw_ios_per_sec": 0, 00:31:27.510 "rw_mbytes_per_sec": 0, 00:31:27.510 "r_mbytes_per_sec": 0, 00:31:27.510 "w_mbytes_per_sec": 0 00:31:27.510 }, 00:31:27.510 "claimed": false, 00:31:27.510 "zoned": false, 00:31:27.510 "supported_io_types": { 00:31:27.510 "read": true, 00:31:27.510 "write": true, 00:31:27.510 "unmap": true, 00:31:27.510 "write_zeroes": true, 00:31:27.510 "flush": true, 00:31:27.510 "reset": true, 00:31:27.510 "compare": false, 00:31:27.510 "compare_and_write": false, 00:31:27.510 "abort": true, 00:31:27.510 "nvme_admin": false, 00:31:27.510 "nvme_io": false 00:31:27.510 }, 00:31:27.510 "memory_domains": [ 00:31:27.510 { 00:31:27.510 "dma_device_id": "system", 00:31:27.510 "dma_device_type": 1 00:31:27.510 }, 00:31:27.510 { 00:31:27.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:27.510 "dma_device_type": 2 00:31:27.510 } 00:31:27.510 ], 00:31:27.510 "driver_specific": {} 00:31:27.510 } 00:31:27.510 ] 00:31:27.510 12:33:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:27.510 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:31:27.510 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:31:27.510 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:31:27.767 [2024-06-07 12:33:51.362045] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:27.767 [2024-06-07 12:33:51.362395] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:27.767 [2024-06-07 12:33:51.362522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:27.767 [2024-06-07 12:33:51.364645] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:27.767 [2024-06-07 12:33:51.364812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:27.768 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:28.025 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:28.025 "name": "Existed_Raid", 00:31:28.025 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:28.025 "strip_size_kb": 64, 00:31:28.025 "state": "configuring", 00:31:28.025 "raid_level": "concat", 00:31:28.025 "superblock": true, 00:31:28.025 "num_base_bdevs": 4, 00:31:28.025 "num_base_bdevs_discovered": 3, 00:31:28.025 "num_base_bdevs_operational": 4, 00:31:28.025 "base_bdevs_list": [ 00:31:28.025 { 00:31:28.025 "name": "BaseBdev1", 00:31:28.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:28.025 "is_configured": false, 00:31:28.025 "data_offset": 0, 00:31:28.025 "data_size": 0 00:31:28.025 }, 00:31:28.025 { 00:31:28.025 "name": "BaseBdev2", 00:31:28.025 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:28.025 "is_configured": true, 00:31:28.025 "data_offset": 2048, 00:31:28.025 "data_size": 63488 00:31:28.025 }, 00:31:28.025 { 00:31:28.025 "name": "BaseBdev3", 00:31:28.025 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:28.025 "is_configured": true, 00:31:28.025 "data_offset": 2048, 00:31:28.025 "data_size": 63488 00:31:28.025 }, 00:31:28.025 { 00:31:28.025 "name": "BaseBdev4", 00:31:28.025 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:28.025 "is_configured": true, 00:31:28.025 "data_offset": 2048, 00:31:28.025 "data_size": 63488 00:31:28.025 } 00:31:28.025 ] 00:31:28.025 }' 00:31:28.025 12:33:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:28.025 12:33:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:31:28.958 [2024-06-07 12:33:52.522132] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:28.958 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:29.525 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:29.525 "name": "Existed_Raid", 00:31:29.525 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:29.525 "strip_size_kb": 64, 00:31:29.525 "state": "configuring", 00:31:29.525 "raid_level": "concat", 00:31:29.525 "superblock": true, 00:31:29.525 "num_base_bdevs": 4, 00:31:29.525 "num_base_bdevs_discovered": 2, 00:31:29.525 "num_base_bdevs_operational": 4, 00:31:29.525 "base_bdevs_list": [ 00:31:29.525 { 00:31:29.525 "name": "BaseBdev1", 00:31:29.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:29.525 "is_configured": false, 00:31:29.525 "data_offset": 0, 00:31:29.525 "data_size": 0 00:31:29.525 }, 00:31:29.525 { 00:31:29.525 "name": null, 00:31:29.525 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:29.525 "is_configured": false, 00:31:29.525 "data_offset": 2048, 00:31:29.525 "data_size": 63488 00:31:29.525 }, 00:31:29.525 { 00:31:29.525 "name": "BaseBdev3", 00:31:29.525 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:29.525 "is_configured": true, 00:31:29.525 "data_offset": 2048, 00:31:29.525 "data_size": 63488 00:31:29.525 }, 00:31:29.525 { 00:31:29.525 "name": "BaseBdev4", 00:31:29.525 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:29.525 "is_configured": true, 00:31:29.525 "data_offset": 2048, 00:31:29.525 "data_size": 63488 00:31:29.525 } 00:31:29.525 ] 00:31:29.525 }' 00:31:29.525 12:33:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:29.525 12:33:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:30.091 12:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:30.091 12:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:31:30.349 12:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:31:30.349 12:33:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:31:30.607 [2024-06-07 12:33:54.143825] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:30.607 BaseBdev1 00:31:30.607 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:31:30.607 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:31:30.607 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:30.607 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:30.607 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:30.607 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:30.607 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:31:31.174 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:31:31.174 [ 00:31:31.174 { 00:31:31.174 "name": "BaseBdev1", 00:31:31.174 "aliases": [ 00:31:31.174 "f985f0bd-06ae-45de-93cc-4eefa27dd0b9" 00:31:31.174 ], 00:31:31.174 "product_name": "Malloc disk", 00:31:31.174 "block_size": 512, 00:31:31.174 "num_blocks": 65536, 00:31:31.174 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:31.174 "assigned_rate_limits": { 00:31:31.174 "rw_ios_per_sec": 0, 00:31:31.174 "rw_mbytes_per_sec": 0, 00:31:31.174 "r_mbytes_per_sec": 0, 00:31:31.174 "w_mbytes_per_sec": 0 00:31:31.174 }, 00:31:31.174 "claimed": true, 00:31:31.174 "claim_type": "exclusive_write", 00:31:31.174 "zoned": false, 00:31:31.174 "supported_io_types": { 00:31:31.174 "read": true, 00:31:31.174 "write": true, 00:31:31.174 "unmap": true, 00:31:31.174 "write_zeroes": true, 00:31:31.174 "flush": true, 00:31:31.174 "reset": true, 00:31:31.174 "compare": false, 00:31:31.174 "compare_and_write": false, 00:31:31.174 "abort": true, 00:31:31.174 "nvme_admin": false, 00:31:31.174 "nvme_io": false 00:31:31.174 }, 00:31:31.174 "memory_domains": [ 00:31:31.174 { 00:31:31.174 "dma_device_id": "system", 00:31:31.174 "dma_device_type": 1 00:31:31.174 }, 00:31:31.174 { 00:31:31.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:31.175 "dma_device_type": 2 00:31:31.175 } 00:31:31.175 ], 00:31:31.175 "driver_specific": {} 00:31:31.175 } 00:31:31.175 ] 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:31.175 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:31.433 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:31.433 "name": "Existed_Raid", 00:31:31.433 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:31.433 "strip_size_kb": 64, 00:31:31.433 "state": "configuring", 00:31:31.433 "raid_level": "concat", 00:31:31.433 "superblock": true, 
00:31:31.433 "num_base_bdevs": 4, 00:31:31.433 "num_base_bdevs_discovered": 3, 00:31:31.433 "num_base_bdevs_operational": 4, 00:31:31.433 "base_bdevs_list": [ 00:31:31.433 { 00:31:31.433 "name": "BaseBdev1", 00:31:31.433 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:31.433 "is_configured": true, 00:31:31.433 "data_offset": 2048, 00:31:31.433 "data_size": 63488 00:31:31.433 }, 00:31:31.433 { 00:31:31.433 "name": null, 00:31:31.433 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:31.433 "is_configured": false, 00:31:31.433 "data_offset": 2048, 00:31:31.433 "data_size": 63488 00:31:31.433 }, 00:31:31.433 { 00:31:31.433 "name": "BaseBdev3", 00:31:31.433 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:31.433 "is_configured": true, 00:31:31.433 "data_offset": 2048, 00:31:31.433 "data_size": 63488 00:31:31.433 }, 00:31:31.433 { 00:31:31.433 "name": "BaseBdev4", 00:31:31.433 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:31.434 "is_configured": true, 00:31:31.434 "data_offset": 2048, 00:31:31.434 "data_size": 63488 00:31:31.434 } 00:31:31.434 ] 00:31:31.434 }' 00:31:31.434 12:33:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:31.434 12:33:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:32.029 12:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:32.029 12:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:31:32.300 12:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:31:32.300 12:33:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:31:32.559 [2024-06-07 12:33:56.117607] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:32.559 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:31:32.826 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:32.826 "name": "Existed_Raid", 00:31:32.826 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:32.826 "strip_size_kb": 64, 00:31:32.826 "state": "configuring", 00:31:32.826 "raid_level": "concat", 00:31:32.826 "superblock": true, 00:31:32.826 "num_base_bdevs": 4, 00:31:32.826 "num_base_bdevs_discovered": 2, 00:31:32.826 "num_base_bdevs_operational": 4, 00:31:32.826 "base_bdevs_list": [ 00:31:32.826 { 00:31:32.826 "name": "BaseBdev1", 00:31:32.826 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:32.826 "is_configured": true, 00:31:32.826 "data_offset": 2048, 00:31:32.826 "data_size": 63488 00:31:32.826 }, 00:31:32.826 { 00:31:32.826 "name": null, 00:31:32.826 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:32.826 "is_configured": false, 00:31:32.826 "data_offset": 2048, 00:31:32.826 "data_size": 63488 00:31:32.826 }, 00:31:32.826 { 00:31:32.826 "name": null, 00:31:32.826 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:32.826 "is_configured": false, 00:31:32.826 "data_offset": 2048, 00:31:32.826 "data_size": 63488 00:31:32.826 }, 00:31:32.826 { 00:31:32.826 "name": "BaseBdev4", 00:31:32.826 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:32.826 "is_configured": true, 00:31:32.826 "data_offset": 2048, 00:31:32.826 "data_size": 63488 00:31:32.826 } 00:31:32.826 ] 00:31:32.826 }' 00:31:32.826 12:33:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:32.826 12:33:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:33.393 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:33.393 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:31:33.652 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:31:33.652 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:31:33.910 [2024-06-07 12:33:57.437819] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:33.910 12:33:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:33.910 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:34.168 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:34.168 "name": "Existed_Raid", 00:31:34.168 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:34.168 "strip_size_kb": 64, 00:31:34.168 "state": "configuring", 00:31:34.168 "raid_level": "concat", 00:31:34.168 "superblock": true, 00:31:34.168 "num_base_bdevs": 4, 00:31:34.168 "num_base_bdevs_discovered": 3, 00:31:34.168 "num_base_bdevs_operational": 4, 00:31:34.168 "base_bdevs_list": [ 00:31:34.168 { 00:31:34.168 "name": "BaseBdev1", 00:31:34.168 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:34.168 "is_configured": true, 00:31:34.168 "data_offset": 2048, 00:31:34.168 "data_size": 63488 00:31:34.168 }, 00:31:34.168 { 00:31:34.168 "name": null, 00:31:34.168 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:34.168 "is_configured": false, 00:31:34.168 "data_offset": 2048, 00:31:34.168 "data_size": 63488 00:31:34.168 }, 00:31:34.168 { 00:31:34.168 "name": "BaseBdev3", 00:31:34.168 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:34.168 "is_configured": true, 00:31:34.168 "data_offset": 2048, 00:31:34.168 "data_size": 63488 00:31:34.168 }, 00:31:34.168 { 00:31:34.168 "name": "BaseBdev4", 00:31:34.168 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:34.168 "is_configured": true, 00:31:34.168 "data_offset": 2048, 00:31:34.168 "data_size": 63488 00:31:34.168 } 00:31:34.168 ] 00:31:34.168 }' 00:31:34.168 12:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:34.168 12:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:35.103 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:35.103 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:31:35.103 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:31:35.103 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:31:35.363 [2024-06-07 12:33:58.901554] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:35.363 12:33:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:35.363 12:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:35.621 12:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:35.621 "name": "Existed_Raid", 00:31:35.621 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:35.621 "strip_size_kb": 64, 00:31:35.621 "state": "configuring", 00:31:35.621 "raid_level": "concat", 00:31:35.621 "superblock": true, 00:31:35.621 "num_base_bdevs": 4, 00:31:35.621 "num_base_bdevs_discovered": 2, 00:31:35.621 "num_base_bdevs_operational": 4, 00:31:35.621 "base_bdevs_list": [ 00:31:35.621 { 00:31:35.621 "name": null, 00:31:35.621 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:35.621 "is_configured": false, 00:31:35.621 "data_offset": 2048, 00:31:35.621 "data_size": 63488 00:31:35.621 }, 00:31:35.621 { 00:31:35.621 "name": null, 00:31:35.621 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:35.621 "is_configured": false, 00:31:35.621 "data_offset": 2048, 00:31:35.621 "data_size": 63488 00:31:35.621 }, 00:31:35.621 { 00:31:35.621 "name": "BaseBdev3", 00:31:35.621 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:35.621 "is_configured": true, 00:31:35.621 "data_offset": 2048, 00:31:35.621 "data_size": 63488 00:31:35.621 }, 00:31:35.621 { 00:31:35.621 "name": "BaseBdev4", 00:31:35.621 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:35.621 "is_configured": true, 00:31:35.621 "data_offset": 2048, 00:31:35.621 "data_size": 63488 00:31:35.621 } 00:31:35.621 ] 00:31:35.621 }' 00:31:35.621 12:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:35.621 12:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:36.190 12:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:36.190 12:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:31:36.448 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:31:36.448 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:31:36.706 [2024-06-07 12:34:00.259498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:36.706 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:37.069 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:37.069 "name": "Existed_Raid", 00:31:37.069 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:37.069 "strip_size_kb": 64, 00:31:37.069 "state": "configuring", 00:31:37.069 "raid_level": "concat", 00:31:37.069 "superblock": true, 00:31:37.069 "num_base_bdevs": 4, 00:31:37.069 "num_base_bdevs_discovered": 3, 00:31:37.069 "num_base_bdevs_operational": 4, 00:31:37.069 "base_bdevs_list": [ 00:31:37.069 { 00:31:37.069 "name": null, 00:31:37.069 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:37.069 "is_configured": false, 00:31:37.069 "data_offset": 2048, 00:31:37.069 "data_size": 63488 00:31:37.069 }, 00:31:37.069 { 00:31:37.069 "name": "BaseBdev2", 00:31:37.069 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:37.069 "is_configured": true, 00:31:37.069 "data_offset": 2048, 00:31:37.069 "data_size": 63488 00:31:37.069 }, 00:31:37.069 { 00:31:37.069 "name": "BaseBdev3", 00:31:37.069 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:37.069 "is_configured": true, 00:31:37.069 "data_offset": 2048, 00:31:37.069 "data_size": 63488 00:31:37.069 }, 00:31:37.069 { 00:31:37.069 "name": "BaseBdev4", 00:31:37.069 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:37.069 "is_configured": true, 00:31:37.069 "data_offset": 2048, 00:31:37.069 "data_size": 63488 00:31:37.069 } 00:31:37.069 ] 00:31:37.069 }' 00:31:37.069 12:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:37.069 12:34:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:37.634 12:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:37.634 12:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:31:37.634 12:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:31:37.634 12:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:37.634 12:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:31:37.891 12:34:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f985f0bd-06ae-45de-93cc-4eefa27dd0b9 00:31:38.150 [2024-06-07 12:34:01.729463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:31:38.150 [2024-06-07 12:34:01.729872] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:31:38.150 [2024-06-07 12:34:01.730011] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:31:38.150 [2024-06-07 12:34:01.730110] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:31:38.150 [2024-06-07 12:34:01.730457] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:31:38.150 [2024-06-07 12:34:01.730570] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000008180 00:31:38.150 [2024-06-07 12:34:01.730741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:38.150 NewBaseBdev 00:31:38.150 12:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:31:38.150 12:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:31:38.150 12:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:38.150 12:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:31:38.150 12:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:38.150 12:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:31:38.150 12:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:38.408 12:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:31:38.666 [ 00:31:38.666 { 00:31:38.666 "name": "NewBaseBdev", 00:31:38.666 "aliases": [ 00:31:38.666 "f985f0bd-06ae-45de-93cc-4eefa27dd0b9" 00:31:38.666 ], 00:31:38.666 "product_name": "Malloc disk", 00:31:38.666 "block_size": 512, 00:31:38.666 "num_blocks": 65536, 00:31:38.666 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:38.666 "assigned_rate_limits": { 00:31:38.666 "rw_ios_per_sec": 0, 00:31:38.666 "rw_mbytes_per_sec": 0, 00:31:38.666 "r_mbytes_per_sec": 0, 00:31:38.666 "w_mbytes_per_sec": 0 00:31:38.666 }, 00:31:38.666 "claimed": true, 00:31:38.666 "claim_type": "exclusive_write", 00:31:38.666 "zoned": false, 00:31:38.666 "supported_io_types": { 00:31:38.666 "read": true, 00:31:38.666 "write": true, 00:31:38.666 "unmap": true, 00:31:38.666 "write_zeroes": true, 00:31:38.666 "flush": true, 00:31:38.666 "reset": true, 00:31:38.666 "compare": false, 00:31:38.666 "compare_and_write": false, 00:31:38.666 "abort": true, 00:31:38.666 "nvme_admin": false, 00:31:38.666 "nvme_io": false 00:31:38.666 }, 00:31:38.666 "memory_domains": [ 00:31:38.666 { 00:31:38.666 "dma_device_id": "system", 00:31:38.666 "dma_device_type": 1 00:31:38.666 }, 00:31:38.666 { 00:31:38.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:38.666 "dma_device_type": 2 00:31:38.666 } 00:31:38.666 ], 00:31:38.666 
"driver_specific": {} 00:31:38.666 } 00:31:38.666 ] 00:31:38.666 12:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:31:38.666 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:31:38.666 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:38.666 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:38.666 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:38.667 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:38.924 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:38.924 "name": "Existed_Raid", 00:31:38.924 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:38.924 "strip_size_kb": 64, 00:31:38.924 "state": "online", 00:31:38.924 "raid_level": "concat", 00:31:38.924 "superblock": true, 00:31:38.924 "num_base_bdevs": 4, 00:31:38.924 "num_base_bdevs_discovered": 4, 00:31:38.924 "num_base_bdevs_operational": 4, 00:31:38.924 "base_bdevs_list": [ 00:31:38.924 { 00:31:38.924 "name": "NewBaseBdev", 00:31:38.924 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:38.924 "is_configured": true, 00:31:38.924 "data_offset": 2048, 00:31:38.924 "data_size": 63488 00:31:38.924 }, 00:31:38.924 { 00:31:38.924 "name": "BaseBdev2", 00:31:38.924 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:38.924 "is_configured": true, 00:31:38.924 "data_offset": 2048, 00:31:38.924 "data_size": 63488 00:31:38.924 }, 00:31:38.924 { 00:31:38.924 "name": "BaseBdev3", 00:31:38.924 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:38.924 "is_configured": true, 00:31:38.924 "data_offset": 2048, 00:31:38.924 "data_size": 63488 00:31:38.924 }, 00:31:38.924 { 00:31:38.924 "name": "BaseBdev4", 00:31:38.924 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:38.924 "is_configured": true, 00:31:38.924 "data_offset": 2048, 00:31:38.924 "data_size": 63488 00:31:38.924 } 00:31:38.924 ] 00:31:38.924 }' 00:31:38.924 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:38.924 12:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:31:39.489 12:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:39.748 [2024-06-07 12:34:03.225881] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:39.748 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:39.748 "name": "Existed_Raid", 00:31:39.748 "aliases": [ 00:31:39.748 "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa" 00:31:39.748 ], 00:31:39.748 "product_name": "Raid Volume", 00:31:39.748 "block_size": 512, 00:31:39.748 "num_blocks": 253952, 00:31:39.748 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:39.748 "assigned_rate_limits": { 00:31:39.748 "rw_ios_per_sec": 0, 00:31:39.748 "rw_mbytes_per_sec": 0, 00:31:39.748 "r_mbytes_per_sec": 0, 00:31:39.748 "w_mbytes_per_sec": 0 00:31:39.748 }, 00:31:39.748 "claimed": false, 00:31:39.748 "zoned": false, 00:31:39.748 "supported_io_types": { 00:31:39.748 "read": true, 00:31:39.748 "write": true, 00:31:39.748 "unmap": true, 00:31:39.748 "write_zeroes": true, 00:31:39.748 "flush": true, 00:31:39.748 "reset": true, 00:31:39.748 "compare": false, 00:31:39.748 "compare_and_write": false, 00:31:39.748 "abort": false, 00:31:39.748 "nvme_admin": false, 00:31:39.748 "nvme_io": false 00:31:39.748 }, 00:31:39.748 "memory_domains": [ 00:31:39.748 { 00:31:39.748 "dma_device_id": "system", 00:31:39.748 "dma_device_type": 1 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:39.748 "dma_device_type": 2 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "dma_device_id": "system", 00:31:39.748 "dma_device_type": 1 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:39.748 "dma_device_type": 2 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "dma_device_id": "system", 00:31:39.748 "dma_device_type": 1 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:39.748 "dma_device_type": 2 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "dma_device_id": "system", 00:31:39.748 "dma_device_type": 1 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:39.748 "dma_device_type": 2 00:31:39.748 } 00:31:39.748 ], 00:31:39.748 "driver_specific": { 00:31:39.748 "raid": { 00:31:39.748 "uuid": "edf58e4b-ae4a-47e2-8f63-bdc8eeea58aa", 00:31:39.748 "strip_size_kb": 64, 00:31:39.748 "state": "online", 00:31:39.748 "raid_level": "concat", 00:31:39.748 "superblock": true, 00:31:39.748 "num_base_bdevs": 4, 00:31:39.748 "num_base_bdevs_discovered": 4, 00:31:39.748 "num_base_bdevs_operational": 4, 00:31:39.748 "base_bdevs_list": [ 00:31:39.748 { 00:31:39.748 "name": "NewBaseBdev", 00:31:39.748 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:39.748 "is_configured": true, 00:31:39.748 "data_offset": 2048, 00:31:39.748 "data_size": 63488 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 
"name": "BaseBdev2", 00:31:39.748 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:39.748 "is_configured": true, 00:31:39.748 "data_offset": 2048, 00:31:39.748 "data_size": 63488 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "name": "BaseBdev3", 00:31:39.748 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:39.748 "is_configured": true, 00:31:39.748 "data_offset": 2048, 00:31:39.748 "data_size": 63488 00:31:39.748 }, 00:31:39.748 { 00:31:39.748 "name": "BaseBdev4", 00:31:39.748 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:39.748 "is_configured": true, 00:31:39.748 "data_offset": 2048, 00:31:39.748 "data_size": 63488 00:31:39.748 } 00:31:39.748 ] 00:31:39.748 } 00:31:39.748 } 00:31:39.748 }' 00:31:39.748 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:39.748 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:31:39.748 BaseBdev2 00:31:39.748 BaseBdev3 00:31:39.748 BaseBdev4' 00:31:39.748 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:39.748 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:31:39.748 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:40.005 "name": "NewBaseBdev", 00:31:40.005 "aliases": [ 00:31:40.005 "f985f0bd-06ae-45de-93cc-4eefa27dd0b9" 00:31:40.005 ], 00:31:40.005 "product_name": "Malloc disk", 00:31:40.005 "block_size": 512, 00:31:40.005 "num_blocks": 65536, 00:31:40.005 "uuid": "f985f0bd-06ae-45de-93cc-4eefa27dd0b9", 00:31:40.005 "assigned_rate_limits": { 00:31:40.005 "rw_ios_per_sec": 0, 00:31:40.005 "rw_mbytes_per_sec": 0, 00:31:40.005 "r_mbytes_per_sec": 0, 00:31:40.005 "w_mbytes_per_sec": 0 00:31:40.005 }, 00:31:40.005 "claimed": true, 00:31:40.005 "claim_type": "exclusive_write", 00:31:40.005 "zoned": false, 00:31:40.005 "supported_io_types": { 00:31:40.005 "read": true, 00:31:40.005 "write": true, 00:31:40.005 "unmap": true, 00:31:40.005 "write_zeroes": true, 00:31:40.005 "flush": true, 00:31:40.005 "reset": true, 00:31:40.005 "compare": false, 00:31:40.005 "compare_and_write": false, 00:31:40.005 "abort": true, 00:31:40.005 "nvme_admin": false, 00:31:40.005 "nvme_io": false 00:31:40.005 }, 00:31:40.005 "memory_domains": [ 00:31:40.005 { 00:31:40.005 "dma_device_id": "system", 00:31:40.005 "dma_device_type": 1 00:31:40.005 }, 00:31:40.005 { 00:31:40.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:40.005 "dma_device_type": 2 00:31:40.005 } 00:31:40.005 ], 00:31:40.005 "driver_specific": {} 00:31:40.005 }' 00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:31:40.005 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:40.264 12:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:40.522 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:40.522 "name": "BaseBdev2", 00:31:40.522 "aliases": [ 00:31:40.522 "b48ddbb2-9186-4c67-96e5-6870d5a74a57" 00:31:40.522 ], 00:31:40.522 "product_name": "Malloc disk", 00:31:40.522 "block_size": 512, 00:31:40.522 "num_blocks": 65536, 00:31:40.522 "uuid": "b48ddbb2-9186-4c67-96e5-6870d5a74a57", 00:31:40.522 "assigned_rate_limits": { 00:31:40.522 "rw_ios_per_sec": 0, 00:31:40.522 "rw_mbytes_per_sec": 0, 00:31:40.522 "r_mbytes_per_sec": 0, 00:31:40.522 "w_mbytes_per_sec": 0 00:31:40.522 }, 00:31:40.522 "claimed": true, 00:31:40.522 "claim_type": "exclusive_write", 00:31:40.522 "zoned": false, 00:31:40.522 "supported_io_types": { 00:31:40.522 "read": true, 00:31:40.522 "write": true, 00:31:40.522 "unmap": true, 00:31:40.522 "write_zeroes": true, 00:31:40.522 "flush": true, 00:31:40.522 "reset": true, 00:31:40.522 "compare": false, 00:31:40.522 "compare_and_write": false, 00:31:40.522 "abort": true, 00:31:40.522 "nvme_admin": false, 00:31:40.522 "nvme_io": false 00:31:40.522 }, 00:31:40.522 "memory_domains": [ 00:31:40.522 { 00:31:40.522 "dma_device_id": "system", 00:31:40.522 "dma_device_type": 1 00:31:40.522 }, 00:31:40.522 { 00:31:40.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:40.522 "dma_device_type": 2 00:31:40.522 } 00:31:40.522 ], 00:31:40.522 "driver_specific": {} 00:31:40.522 }' 00:31:40.522 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:40.522 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:40.522 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:40.522 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:40.780 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:40.780 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:40.780 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:40.780 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:40.781 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:40.781 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:40.781 12:34:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:40.781 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:40.781 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:40.781 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:41.038 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:31:41.038 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:41.038 "name": "BaseBdev3", 00:31:41.038 "aliases": [ 00:31:41.039 "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1" 00:31:41.039 ], 00:31:41.039 "product_name": "Malloc disk", 00:31:41.039 "block_size": 512, 00:31:41.039 "num_blocks": 65536, 00:31:41.039 "uuid": "6f11e119-0c54-4743-a4cc-d0ce1ccf02a1", 00:31:41.039 "assigned_rate_limits": { 00:31:41.039 "rw_ios_per_sec": 0, 00:31:41.039 "rw_mbytes_per_sec": 0, 00:31:41.039 "r_mbytes_per_sec": 0, 00:31:41.039 "w_mbytes_per_sec": 0 00:31:41.039 }, 00:31:41.039 "claimed": true, 00:31:41.039 "claim_type": "exclusive_write", 00:31:41.039 "zoned": false, 00:31:41.039 "supported_io_types": { 00:31:41.039 "read": true, 00:31:41.039 "write": true, 00:31:41.039 "unmap": true, 00:31:41.039 "write_zeroes": true, 00:31:41.039 "flush": true, 00:31:41.039 "reset": true, 00:31:41.039 "compare": false, 00:31:41.039 "compare_and_write": false, 00:31:41.039 "abort": true, 00:31:41.039 "nvme_admin": false, 00:31:41.039 "nvme_io": false 00:31:41.039 }, 00:31:41.039 "memory_domains": [ 00:31:41.039 { 00:31:41.039 "dma_device_id": "system", 00:31:41.039 "dma_device_type": 1 00:31:41.039 }, 00:31:41.039 { 00:31:41.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:41.039 "dma_device_type": 2 00:31:41.039 } 00:31:41.039 ], 00:31:41.039 "driver_specific": {} 00:31:41.039 }' 00:31:41.039 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:41.296 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:41.296 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:41.296 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:41.296 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:41.296 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:41.296 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:41.296 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:41.553 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:41.554 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:41.554 12:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:41.554 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:41.554 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:41.554 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:31:41.554 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:41.811 "name": "BaseBdev4", 00:31:41.811 "aliases": [ 00:31:41.811 "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f" 00:31:41.811 ], 00:31:41.811 "product_name": "Malloc disk", 00:31:41.811 "block_size": 512, 00:31:41.811 "num_blocks": 65536, 00:31:41.811 "uuid": "dd4c6f5b-5138-45ec-b875-d221d1ed0c5f", 00:31:41.811 "assigned_rate_limits": { 00:31:41.811 "rw_ios_per_sec": 0, 00:31:41.811 "rw_mbytes_per_sec": 0, 00:31:41.811 "r_mbytes_per_sec": 0, 00:31:41.811 "w_mbytes_per_sec": 0 00:31:41.811 }, 00:31:41.811 "claimed": true, 00:31:41.811 "claim_type": "exclusive_write", 00:31:41.811 "zoned": false, 00:31:41.811 "supported_io_types": { 00:31:41.811 "read": true, 00:31:41.811 "write": true, 00:31:41.811 "unmap": true, 00:31:41.811 "write_zeroes": true, 00:31:41.811 "flush": true, 00:31:41.811 "reset": true, 00:31:41.811 "compare": false, 00:31:41.811 "compare_and_write": false, 00:31:41.811 "abort": true, 00:31:41.811 "nvme_admin": false, 00:31:41.811 "nvme_io": false 00:31:41.811 }, 00:31:41.811 "memory_domains": [ 00:31:41.811 { 00:31:41.811 "dma_device_id": "system", 00:31:41.811 "dma_device_type": 1 00:31:41.811 }, 00:31:41.811 { 00:31:41.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:41.811 "dma_device_type": 2 00:31:41.811 } 00:31:41.811 ], 00:31:41.811 "driver_specific": {} 00:31:41.811 }' 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:41.811 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:42.069 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:42.069 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:42.069 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:42.069 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:42.069 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:42.327 [2024-06-07 12:34:05.833967] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:42.327 [2024-06-07 12:34:05.834277] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:42.327 [2024-06-07 12:34:05.834441] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:42.327 [2024-06-07 12:34:05.834609] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:31:42.327 [2024-06-07 12:34:05.834701] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name Existed_Raid, state offline 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 215802 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 215802 ']' 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 215802 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 215802 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 215802' 00:31:42.327 killing process with pid 215802 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 215802 00:31:42.327 [2024-06-07 12:34:05.889543] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:42.327 12:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 215802 00:31:42.327 [2024-06-07 12:34:05.970099] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:42.914 12:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:31:42.914 00:31:42.914 real 0m32.885s 00:31:42.914 user 1m0.383s 00:31:42.914 sys 0m5.589s 00:31:42.914 12:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:42.914 12:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:31:42.914 ************************************ 00:31:42.914 END TEST raid_state_function_test_sb 00:31:42.914 ************************************ 00:31:42.914 12:34:06 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:31:42.914 12:34:06 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:31:42.914 12:34:06 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:42.914 12:34:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:42.914 ************************************ 00:31:42.914 START TEST raid_superblock_test 00:31:42.914 ************************************ 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 4 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:31:42.914 
12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=216879 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 216879 /var/tmp/spdk-raid.sock 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 216879 ']' 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:42.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:42.914 12:34:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:42.914 [2024-06-07 12:34:06.454773] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
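Before any bdev setup, raid_superblock_test (bdev_raid.sh@410-412, traced above) starts a bare bdev_svc application on a private RPC socket with bdev_raid debug logging enabled, then blocks in waitforlisten until that socket answers. Roughly, with the polling loop simplified (the real waitforlisten in autotest_common.sh also verifies the pid is still alive and gives up after the max_retries=100 seen above):

sock=/var/tmp/spdk-raid.sock
/home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r "$sock" -L bdev_raid &
raid_pid=$!    # 216879 in this run
# poll until the UNIX-domain socket accepts a trivial RPC
until /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s "$sock" rpc_get_methods &> /dev/null; do
    sleep 0.1
done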
00:31:42.914 [2024-06-07 12:34:06.455857] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid216879 ] 00:31:43.173 [2024-06-07 12:34:06.603131] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:43.173 [2024-06-07 12:34:06.721022] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:43.173 [2024-06-07 12:34:06.816093] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:31:44.109 malloc1 00:31:44.109 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:31:44.368 [2024-06-07 12:34:07.884122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:31:44.368 [2024-06-07 12:34:07.884448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:44.368 [2024-06-07 12:34:07.884635] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:31:44.368 [2024-06-07 12:34:07.884782] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:44.368 [2024-06-07 12:34:07.887157] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:44.368 [2024-06-07 12:34:07.887356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:31:44.368 pt1 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:44.368 12:34:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:31:44.627 malloc2 00:31:44.627 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:44.885 [2024-06-07 12:34:08.427441] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:44.885 [2024-06-07 12:34:08.427810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:44.885 [2024-06-07 12:34:08.427895] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:31:44.885 [2024-06-07 12:34:08.428025] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:44.885 [2024-06-07 12:34:08.430258] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:44.885 [2024-06-07 12:34:08.430441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:44.885 pt2 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:44.885 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:31:45.143 malloc3 00:31:45.143 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:31:45.401 [2024-06-07 12:34:08.885608] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:31:45.401 [2024-06-07 12:34:08.885958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:45.401 [2024-06-07 12:34:08.886033] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:31:45.401 [2024-06-07 12:34:08.886166] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:45.401 [2024-06-07 12:34:08.888413] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:45.401 [2024-06-07 12:34:08.888602] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:31:45.401 pt3 00:31:45.401 12:34:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:45.401 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:45.401 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:31:45.401 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:31:45.401 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:31:45.402 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:45.402 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:45.402 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:45.402 12:34:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:31:45.660 malloc4 00:31:45.660 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:31:45.919 [2024-06-07 12:34:09.433292] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:31:45.919 [2024-06-07 12:34:09.433623] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:45.919 [2024-06-07 12:34:09.433730] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007e80 00:31:45.919 [2024-06-07 12:34:09.433875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:45.919 [2024-06-07 12:34:09.436111] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:45.919 [2024-06-07 12:34:09.436298] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:31:45.919 pt4 00:31:45.919 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:45.919 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:45.919 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:31:46.178 [2024-06-07 12:34:09.661438] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:31:46.178 [2024-06-07 12:34:09.663699] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:46.178 [2024-06-07 12:34:09.663907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:31:46.178 [2024-06-07 12:34:09.663987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:31:46.178 [2024-06-07 12:34:09.664289] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008480 00:31:46.178 [2024-06-07 12:34:09.664428] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:31:46.178 [2024-06-07 12:34:09.664621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:31:46.178 [2024-06-07 12:34:09.665045] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008480 00:31:46.178 [2024-06-07 12:34:09.665148] bdev_raid.c:1725:raid_bdev_configure_cont: 
*DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008480 00:31:46.178 [2024-06-07 12:34:09.665412] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:46.178 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:46.437 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:46.437 "name": "raid_bdev1", 00:31:46.437 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:46.437 "strip_size_kb": 64, 00:31:46.437 "state": "online", 00:31:46.437 "raid_level": "concat", 00:31:46.437 "superblock": true, 00:31:46.437 "num_base_bdevs": 4, 00:31:46.437 "num_base_bdevs_discovered": 4, 00:31:46.437 "num_base_bdevs_operational": 4, 00:31:46.437 "base_bdevs_list": [ 00:31:46.437 { 00:31:46.437 "name": "pt1", 00:31:46.437 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:46.437 "is_configured": true, 00:31:46.437 "data_offset": 2048, 00:31:46.437 "data_size": 63488 00:31:46.437 }, 00:31:46.437 { 00:31:46.437 "name": "pt2", 00:31:46.437 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:46.437 "is_configured": true, 00:31:46.437 "data_offset": 2048, 00:31:46.437 "data_size": 63488 00:31:46.437 }, 00:31:46.437 { 00:31:46.437 "name": "pt3", 00:31:46.437 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:46.437 "is_configured": true, 00:31:46.437 "data_offset": 2048, 00:31:46.437 "data_size": 63488 00:31:46.437 }, 00:31:46.437 { 00:31:46.437 "name": "pt4", 00:31:46.437 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:46.437 "is_configured": true, 00:31:46.437 "data_offset": 2048, 00:31:46.437 "data_size": 63488 00:31:46.437 } 00:31:46.437 ] 00:31:46.437 }' 00:31:46.437 12:34:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:46.437 12:34:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
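The construction that produced raid_bdev1 above, written out linearly: four 32 MiB malloc bdevs with 512-byte blocks, each wrapped in a passthru bdev carrying a fixed UUID, then a concat volume with a 64 KiB strip and an on-disk superblock (-s). A sketch assuming the same rpc.py path and socket as this run:

rpc='/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
for i in 1 2 3 4; do
    $rpc bdev_malloc_create 32 512 -b "malloc$i"    # 32 MiB backing, 512 B blocks
    $rpc bdev_passthru_create -b "malloc$i" -p "pt$i" \
        -u "00000000-0000-0000-0000-00000000000$i"  # fixed, predictable UUID
done
# -z 64: 64 KiB strip size; -s: persist a superblock on every base bdev
$rpc bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'
# expected: online, with all 4 base bdevs discovered and operational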
00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:47.004 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:47.004 [2024-06-07 12:34:10.645788] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:47.262 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:47.263 "name": "raid_bdev1", 00:31:47.263 "aliases": [ 00:31:47.263 "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0" 00:31:47.263 ], 00:31:47.263 "product_name": "Raid Volume", 00:31:47.263 "block_size": 512, 00:31:47.263 "num_blocks": 253952, 00:31:47.263 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:47.263 "assigned_rate_limits": { 00:31:47.263 "rw_ios_per_sec": 0, 00:31:47.263 "rw_mbytes_per_sec": 0, 00:31:47.263 "r_mbytes_per_sec": 0, 00:31:47.263 "w_mbytes_per_sec": 0 00:31:47.263 }, 00:31:47.263 "claimed": false, 00:31:47.263 "zoned": false, 00:31:47.263 "supported_io_types": { 00:31:47.263 "read": true, 00:31:47.263 "write": true, 00:31:47.263 "unmap": true, 00:31:47.263 "write_zeroes": true, 00:31:47.263 "flush": true, 00:31:47.263 "reset": true, 00:31:47.263 "compare": false, 00:31:47.263 "compare_and_write": false, 00:31:47.263 "abort": false, 00:31:47.263 "nvme_admin": false, 00:31:47.263 "nvme_io": false 00:31:47.263 }, 00:31:47.263 "memory_domains": [ 00:31:47.263 { 00:31:47.263 "dma_device_id": "system", 00:31:47.263 "dma_device_type": 1 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.263 "dma_device_type": 2 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "dma_device_id": "system", 00:31:47.263 "dma_device_type": 1 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.263 "dma_device_type": 2 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "dma_device_id": "system", 00:31:47.263 "dma_device_type": 1 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.263 "dma_device_type": 2 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "dma_device_id": "system", 00:31:47.263 "dma_device_type": 1 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.263 "dma_device_type": 2 00:31:47.263 } 00:31:47.263 ], 00:31:47.263 "driver_specific": { 00:31:47.263 "raid": { 00:31:47.263 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:47.263 "strip_size_kb": 64, 00:31:47.263 "state": "online", 00:31:47.263 "raid_level": "concat", 00:31:47.263 "superblock": true, 00:31:47.263 "num_base_bdevs": 4, 00:31:47.263 "num_base_bdevs_discovered": 4, 00:31:47.263 "num_base_bdevs_operational": 4, 00:31:47.263 "base_bdevs_list": [ 00:31:47.263 { 00:31:47.263 "name": "pt1", 00:31:47.263 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:47.263 "is_configured": true, 00:31:47.263 "data_offset": 2048, 00:31:47.263 "data_size": 63488 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "name": "pt2", 00:31:47.263 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:47.263 "is_configured": true, 00:31:47.263 "data_offset": 2048, 00:31:47.263 "data_size": 63488 00:31:47.263 }, 
00:31:47.263 { 00:31:47.263 "name": "pt3", 00:31:47.263 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:47.263 "is_configured": true, 00:31:47.263 "data_offset": 2048, 00:31:47.263 "data_size": 63488 00:31:47.263 }, 00:31:47.263 { 00:31:47.263 "name": "pt4", 00:31:47.263 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:47.263 "is_configured": true, 00:31:47.263 "data_offset": 2048, 00:31:47.263 "data_size": 63488 00:31:47.263 } 00:31:47.263 ] 00:31:47.263 } 00:31:47.263 } 00:31:47.263 }' 00:31:47.263 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:47.263 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:31:47.263 pt2 00:31:47.263 pt3 00:31:47.263 pt4' 00:31:47.263 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:47.263 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:31:47.263 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:47.521 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:47.521 "name": "pt1", 00:31:47.521 "aliases": [ 00:31:47.521 "00000000-0000-0000-0000-000000000001" 00:31:47.521 ], 00:31:47.521 "product_name": "passthru", 00:31:47.521 "block_size": 512, 00:31:47.521 "num_blocks": 65536, 00:31:47.521 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:47.521 "assigned_rate_limits": { 00:31:47.521 "rw_ios_per_sec": 0, 00:31:47.521 "rw_mbytes_per_sec": 0, 00:31:47.521 "r_mbytes_per_sec": 0, 00:31:47.521 "w_mbytes_per_sec": 0 00:31:47.521 }, 00:31:47.521 "claimed": true, 00:31:47.521 "claim_type": "exclusive_write", 00:31:47.521 "zoned": false, 00:31:47.521 "supported_io_types": { 00:31:47.521 "read": true, 00:31:47.521 "write": true, 00:31:47.521 "unmap": true, 00:31:47.521 "write_zeroes": true, 00:31:47.521 "flush": true, 00:31:47.521 "reset": true, 00:31:47.521 "compare": false, 00:31:47.521 "compare_and_write": false, 00:31:47.521 "abort": true, 00:31:47.521 "nvme_admin": false, 00:31:47.521 "nvme_io": false 00:31:47.521 }, 00:31:47.521 "memory_domains": [ 00:31:47.521 { 00:31:47.521 "dma_device_id": "system", 00:31:47.521 "dma_device_type": 1 00:31:47.521 }, 00:31:47.521 { 00:31:47.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.521 "dma_device_type": 2 00:31:47.521 } 00:31:47.521 ], 00:31:47.521 "driver_specific": { 00:31:47.521 "passthru": { 00:31:47.521 "name": "pt1", 00:31:47.521 "base_bdev_name": "malloc1" 00:31:47.521 } 00:31:47.521 } 00:31:47.521 }' 00:31:47.521 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:47.521 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:47.521 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:47.521 12:34:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:47.521 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:47.521 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:47.521 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:47.521 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:47.521 12:34:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:47.521 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:47.780 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:47.780 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:47.780 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:47.780 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:31:47.780 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:48.039 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:48.039 "name": "pt2", 00:31:48.039 "aliases": [ 00:31:48.039 "00000000-0000-0000-0000-000000000002" 00:31:48.039 ], 00:31:48.039 "product_name": "passthru", 00:31:48.039 "block_size": 512, 00:31:48.039 "num_blocks": 65536, 00:31:48.039 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:48.039 "assigned_rate_limits": { 00:31:48.039 "rw_ios_per_sec": 0, 00:31:48.039 "rw_mbytes_per_sec": 0, 00:31:48.039 "r_mbytes_per_sec": 0, 00:31:48.039 "w_mbytes_per_sec": 0 00:31:48.039 }, 00:31:48.039 "claimed": true, 00:31:48.039 "claim_type": "exclusive_write", 00:31:48.039 "zoned": false, 00:31:48.039 "supported_io_types": { 00:31:48.039 "read": true, 00:31:48.039 "write": true, 00:31:48.039 "unmap": true, 00:31:48.039 "write_zeroes": true, 00:31:48.039 "flush": true, 00:31:48.039 "reset": true, 00:31:48.039 "compare": false, 00:31:48.039 "compare_and_write": false, 00:31:48.039 "abort": true, 00:31:48.039 "nvme_admin": false, 00:31:48.039 "nvme_io": false 00:31:48.039 }, 00:31:48.039 "memory_domains": [ 00:31:48.039 { 00:31:48.040 "dma_device_id": "system", 00:31:48.040 "dma_device_type": 1 00:31:48.040 }, 00:31:48.040 { 00:31:48.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:48.040 "dma_device_type": 2 00:31:48.040 } 00:31:48.040 ], 00:31:48.040 "driver_specific": { 00:31:48.040 "passthru": { 00:31:48.040 "name": "pt2", 00:31:48.040 "base_bdev_name": "malloc2" 00:31:48.040 } 00:31:48.040 } 00:31:48.040 }' 00:31:48.040 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:48.040 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:48.040 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:48.040 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:48.298 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:48.298 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:48.298 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:48.298 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:48.298 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:48.298 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:48.298 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:48.556 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:48.556 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:31:48.556 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:48.556 12:34:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:48.815 "name": "pt3", 00:31:48.815 "aliases": [ 00:31:48.815 "00000000-0000-0000-0000-000000000003" 00:31:48.815 ], 00:31:48.815 "product_name": "passthru", 00:31:48.815 "block_size": 512, 00:31:48.815 "num_blocks": 65536, 00:31:48.815 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:48.815 "assigned_rate_limits": { 00:31:48.815 "rw_ios_per_sec": 0, 00:31:48.815 "rw_mbytes_per_sec": 0, 00:31:48.815 "r_mbytes_per_sec": 0, 00:31:48.815 "w_mbytes_per_sec": 0 00:31:48.815 }, 00:31:48.815 "claimed": true, 00:31:48.815 "claim_type": "exclusive_write", 00:31:48.815 "zoned": false, 00:31:48.815 "supported_io_types": { 00:31:48.815 "read": true, 00:31:48.815 "write": true, 00:31:48.815 "unmap": true, 00:31:48.815 "write_zeroes": true, 00:31:48.815 "flush": true, 00:31:48.815 "reset": true, 00:31:48.815 "compare": false, 00:31:48.815 "compare_and_write": false, 00:31:48.815 "abort": true, 00:31:48.815 "nvme_admin": false, 00:31:48.815 "nvme_io": false 00:31:48.815 }, 00:31:48.815 "memory_domains": [ 00:31:48.815 { 00:31:48.815 "dma_device_id": "system", 00:31:48.815 "dma_device_type": 1 00:31:48.815 }, 00:31:48.815 { 00:31:48.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:48.815 "dma_device_type": 2 00:31:48.815 } 00:31:48.815 ], 00:31:48.815 "driver_specific": { 00:31:48.815 "passthru": { 00:31:48.815 "name": "pt3", 00:31:48.815 "base_bdev_name": "malloc3" 00:31:48.815 } 00:31:48.815 } 00:31:48.815 }' 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:48.815 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:49.073 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:49.073 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:49.073 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:49.073 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:49.073 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:49.073 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:49.073 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:31:49.332 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:49.332 "name": "pt4", 00:31:49.332 "aliases": [ 
00:31:49.332 "00000000-0000-0000-0000-000000000004" 00:31:49.332 ], 00:31:49.332 "product_name": "passthru", 00:31:49.332 "block_size": 512, 00:31:49.332 "num_blocks": 65536, 00:31:49.332 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:49.332 "assigned_rate_limits": { 00:31:49.332 "rw_ios_per_sec": 0, 00:31:49.332 "rw_mbytes_per_sec": 0, 00:31:49.332 "r_mbytes_per_sec": 0, 00:31:49.332 "w_mbytes_per_sec": 0 00:31:49.332 }, 00:31:49.332 "claimed": true, 00:31:49.332 "claim_type": "exclusive_write", 00:31:49.332 "zoned": false, 00:31:49.332 "supported_io_types": { 00:31:49.332 "read": true, 00:31:49.332 "write": true, 00:31:49.332 "unmap": true, 00:31:49.332 "write_zeroes": true, 00:31:49.332 "flush": true, 00:31:49.332 "reset": true, 00:31:49.332 "compare": false, 00:31:49.332 "compare_and_write": false, 00:31:49.332 "abort": true, 00:31:49.332 "nvme_admin": false, 00:31:49.332 "nvme_io": false 00:31:49.332 }, 00:31:49.332 "memory_domains": [ 00:31:49.332 { 00:31:49.332 "dma_device_id": "system", 00:31:49.332 "dma_device_type": 1 00:31:49.332 }, 00:31:49.332 { 00:31:49.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:49.332 "dma_device_type": 2 00:31:49.332 } 00:31:49.332 ], 00:31:49.332 "driver_specific": { 00:31:49.332 "passthru": { 00:31:49.332 "name": "pt4", 00:31:49.333 "base_bdev_name": "malloc4" 00:31:49.333 } 00:31:49.333 } 00:31:49.333 }' 00:31:49.333 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:49.333 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:49.333 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:49.333 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:49.591 12:34:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:49.591 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:49.591 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:49.591 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:49.591 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:49.591 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:49.591 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:49.850 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:49.850 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:31:49.850 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:50.108 [2024-06-07 12:34:13.506125] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:50.108 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8ecc948d-bcd9-4aa6-98cc-2455cd34aff0 00:31:50.108 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8ecc948d-bcd9-4aa6-98cc-2455cd34aff0 ']' 00:31:50.108 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:50.366 [2024-06-07 12:34:13.789973] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:50.366 
[2024-06-07 12:34:13.790289] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:50.366 [2024-06-07 12:34:13.790497] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:50.366 [2024-06-07 12:34:13.790654] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:50.366 [2024-06-07 12:34:13.790739] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008480 name raid_bdev1, state offline 00:31:50.366 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:50.366 12:34:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:31:50.625 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:31:50.625 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:31:50.625 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:50.625 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:31:50.883 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:50.883 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:31:51.142 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:51.142 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:31:51.401 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:51.401 12:34:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:31:51.659 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:31:51.659 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- 
# type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:31:51.917 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:31:51.918 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:31:51.918 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:31:51.918 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:31:51.918 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:31:52.175 [2024-06-07 12:34:15.639674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:31:52.175 [2024-06-07 12:34:15.641979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:31:52.175 [2024-06-07 12:34:15.642146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:31:52.175 [2024-06-07 12:34:15.642205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:31:52.175 [2024-06-07 12:34:15.642373] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:31:52.175 [2024-06-07 12:34:15.642627] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:31:52.175 [2024-06-07 12:34:15.642784] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:31:52.175 [2024-06-07 12:34:15.642945] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:31:52.175 [2024-06-07 12:34:15.643004] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:52.175 [2024-06-07 12:34:15.643135] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state configuring 00:31:52.175 request: 00:31:52.175 { 00:31:52.175 "name": "raid_bdev1", 00:31:52.175 "raid_level": "concat", 00:31:52.175 "base_bdevs": [ 00:31:52.175 "malloc1", 00:31:52.175 "malloc2", 00:31:52.175 "malloc3", 00:31:52.175 "malloc4" 00:31:52.175 ], 00:31:52.175 "superblock": false, 00:31:52.175 "strip_size_kb": 64, 00:31:52.175 "method": "bdev_raid_create", 00:31:52.175 "req_id": 1 00:31:52.175 } 00:31:52.175 Got JSON-RPC error response 00:31:52.175 response: 00:31:52.175 { 00:31:52.175 "code": -17, 00:31:52.175 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:31:52.175 } 00:31:52.175 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:31:52.175 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:31:52.175 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:31:52.175 12:34:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:31:52.176 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:52.176 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:31:52.433 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:31:52.433 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:31:52.433 12:34:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:31:52.693 [2024-06-07 12:34:16.083692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:31:52.693 [2024-06-07 12:34:16.084097] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:52.693 [2024-06-07 12:34:16.084186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:31:52.693 [2024-06-07 12:34:16.084462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:52.693 [2024-06-07 12:34:16.086839] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:52.693 [2024-06-07 12:34:16.087042] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:31:52.693 [2024-06-07 12:34:16.087257] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:31:52.693 [2024-06-07 12:34:16.087393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:31:52.693 pt1 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:52.693 "name": "raid_bdev1", 00:31:52.693 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:52.693 "strip_size_kb": 64, 00:31:52.693 "state": "configuring", 00:31:52.693 "raid_level": "concat", 00:31:52.693 "superblock": true, 00:31:52.693 "num_base_bdevs": 4, 00:31:52.693 "num_base_bdevs_discovered": 1, 00:31:52.693 "num_base_bdevs_operational": 4, 00:31:52.693 "base_bdevs_list": [ 00:31:52.693 { 00:31:52.693 "name": "pt1", 00:31:52.693 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:31:52.693 "is_configured": true, 00:31:52.693 "data_offset": 2048, 00:31:52.693 "data_size": 63488 00:31:52.693 }, 00:31:52.693 { 00:31:52.693 "name": null, 00:31:52.693 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:52.693 "is_configured": false, 00:31:52.693 "data_offset": 2048, 00:31:52.693 "data_size": 63488 00:31:52.693 }, 00:31:52.693 { 00:31:52.693 "name": null, 00:31:52.693 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:52.693 "is_configured": false, 00:31:52.693 "data_offset": 2048, 00:31:52.693 "data_size": 63488 00:31:52.693 }, 00:31:52.693 { 00:31:52.693 "name": null, 00:31:52.693 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:52.693 "is_configured": false, 00:31:52.693 "data_offset": 2048, 00:31:52.693 "data_size": 63488 00:31:52.693 } 00:31:52.693 ] 00:31:52.693 }' 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:52.693 12:34:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:53.628 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:31:53.628 12:34:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:53.628 [2024-06-07 12:34:17.112003] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:53.628 [2024-06-07 12:34:17.112407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:53.628 [2024-06-07 12:34:17.112493] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009980 00:31:53.628 [2024-06-07 12:34:17.112735] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:53.628 [2024-06-07 12:34:17.113177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:53.628 [2024-06-07 12:34:17.113360] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:53.628 [2024-06-07 12:34:17.113555] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:31:53.628 [2024-06-07 12:34:17.113652] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:53.628 pt2 00:31:53.628 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:31:53.886 [2024-06-07 12:34:17.320018] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:53.886 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:54.144 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:54.144 "name": "raid_bdev1", 00:31:54.144 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:54.144 "strip_size_kb": 64, 00:31:54.144 "state": "configuring", 00:31:54.144 "raid_level": "concat", 00:31:54.144 "superblock": true, 00:31:54.144 "num_base_bdevs": 4, 00:31:54.144 "num_base_bdevs_discovered": 1, 00:31:54.144 "num_base_bdevs_operational": 4, 00:31:54.144 "base_bdevs_list": [ 00:31:54.144 { 00:31:54.144 "name": "pt1", 00:31:54.144 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:54.144 "is_configured": true, 00:31:54.144 "data_offset": 2048, 00:31:54.144 "data_size": 63488 00:31:54.144 }, 00:31:54.144 { 00:31:54.144 "name": null, 00:31:54.144 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:54.144 "is_configured": false, 00:31:54.144 "data_offset": 2048, 00:31:54.144 "data_size": 63488 00:31:54.144 }, 00:31:54.144 { 00:31:54.144 "name": null, 00:31:54.144 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:54.144 "is_configured": false, 00:31:54.144 "data_offset": 2048, 00:31:54.144 "data_size": 63488 00:31:54.144 }, 00:31:54.144 { 00:31:54.144 "name": null, 00:31:54.144 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:54.144 "is_configured": false, 00:31:54.144 "data_offset": 2048, 00:31:54.144 "data_size": 63488 00:31:54.144 } 00:31:54.144 ] 00:31:54.144 }' 00:31:54.144 12:34:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:54.144 12:34:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:55.108 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:31:55.108 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:55.108 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:55.108 [2024-06-07 12:34:18.668124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:55.108 [2024-06-07 12:34:18.668533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:55.108 [2024-06-07 12:34:18.668632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009c80 00:31:55.108 [2024-06-07 12:34:18.668861] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:55.108 [2024-06-07 12:34:18.669357] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:55.108 [2024-06-07 12:34:18.669556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:55.108 [2024-06-07 12:34:18.669776] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:31:55.108 [2024-06-07 12:34:18.669902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:55.108 pt2 
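Every member is rebuilt the same way: a passthru vbdev pinned to a fixed UUID on top of its malloc base, which is what lets the raid superblock re-claim the device by UUID during examine. A sketch of the next member exactly as it is created below:

# pt3: passthru over malloc3, with the UUID recorded in raid_bdev1's superblock
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003

Once pt3 and pt4 are registered, the examine path claims them and the raid flips from configuring to online, as the debug lines that follow show.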
00:31:55.108 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:31:55.108 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:55.108 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:31:55.366 [2024-06-07 12:34:18.940160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:31:55.366 [2024-06-07 12:34:18.940489] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:55.366 [2024-06-07 12:34:18.940636] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009f80 00:31:55.366 [2024-06-07 12:34:18.940756] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:55.366 [2024-06-07 12:34:18.941253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:55.366 [2024-06-07 12:34:18.941437] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:31:55.366 [2024-06-07 12:34:18.941601] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:31:55.366 [2024-06-07 12:34:18.941712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:31:55.366 pt3 00:31:55.366 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:31:55.366 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:55.366 12:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:31:55.623 [2024-06-07 12:34:19.152170] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:31:55.623 [2024-06-07 12:34:19.152488] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:55.623 [2024-06-07 12:34:19.152568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a280 00:31:55.623 [2024-06-07 12:34:19.152708] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:55.623 [2024-06-07 12:34:19.153131] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:55.623 [2024-06-07 12:34:19.153339] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:31:55.623 [2024-06-07 12:34:19.153559] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:31:55.623 [2024-06-07 12:34:19.153664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:31:55.623 [2024-06-07 12:34:19.153809] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:31:55.623 [2024-06-07 12:34:19.153949] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:31:55.623 [2024-06-07 12:34:19.154052] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002ae0 00:31:55.623 [2024-06-07 12:34:19.154479] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:31:55.623 [2024-06-07 12:34:19.154589] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:31:55.623 [2024-06-07 12:34:19.154736] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:55.623 pt4 00:31:55.623 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:31:55.623 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:55.624 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:55.881 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:55.881 "name": "raid_bdev1", 00:31:55.881 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:55.881 "strip_size_kb": 64, 00:31:55.881 "state": "online", 00:31:55.881 "raid_level": "concat", 00:31:55.881 "superblock": true, 00:31:55.881 "num_base_bdevs": 4, 00:31:55.881 "num_base_bdevs_discovered": 4, 00:31:55.881 "num_base_bdevs_operational": 4, 00:31:55.881 "base_bdevs_list": [ 00:31:55.881 { 00:31:55.881 "name": "pt1", 00:31:55.881 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:55.881 "is_configured": true, 00:31:55.881 "data_offset": 2048, 00:31:55.881 "data_size": 63488 00:31:55.881 }, 00:31:55.881 { 00:31:55.881 "name": "pt2", 00:31:55.881 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:55.881 "is_configured": true, 00:31:55.881 "data_offset": 2048, 00:31:55.881 "data_size": 63488 00:31:55.881 }, 00:31:55.881 { 00:31:55.881 "name": "pt3", 00:31:55.881 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:55.881 "is_configured": true, 00:31:55.881 "data_offset": 2048, 00:31:55.881 "data_size": 63488 00:31:55.881 }, 00:31:55.881 { 00:31:55.881 "name": "pt4", 00:31:55.881 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:55.881 "is_configured": true, 00:31:55.881 "data_offset": 2048, 00:31:55.881 "data_size": 63488 00:31:55.881 } 00:31:55.881 ] 00:31:55.881 }' 00:31:55.881 12:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:56.139 12:34:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:31:56.705 12:34:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:56.705 [2024-06-07 12:34:20.318679] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:56.705 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:56.705 "name": "raid_bdev1", 00:31:56.705 "aliases": [ 00:31:56.705 "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0" 00:31:56.705 ], 00:31:56.705 "product_name": "Raid Volume", 00:31:56.705 "block_size": 512, 00:31:56.705 "num_blocks": 253952, 00:31:56.705 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:56.705 "assigned_rate_limits": { 00:31:56.705 "rw_ios_per_sec": 0, 00:31:56.705 "rw_mbytes_per_sec": 0, 00:31:56.705 "r_mbytes_per_sec": 0, 00:31:56.705 "w_mbytes_per_sec": 0 00:31:56.705 }, 00:31:56.705 "claimed": false, 00:31:56.705 "zoned": false, 00:31:56.705 "supported_io_types": { 00:31:56.705 "read": true, 00:31:56.705 "write": true, 00:31:56.705 "unmap": true, 00:31:56.705 "write_zeroes": true, 00:31:56.705 "flush": true, 00:31:56.705 "reset": true, 00:31:56.705 "compare": false, 00:31:56.705 "compare_and_write": false, 00:31:56.705 "abort": false, 00:31:56.705 "nvme_admin": false, 00:31:56.705 "nvme_io": false 00:31:56.705 }, 00:31:56.705 "memory_domains": [ 00:31:56.705 { 00:31:56.705 "dma_device_id": "system", 00:31:56.705 "dma_device_type": 1 00:31:56.705 }, 00:31:56.705 { 00:31:56.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:56.705 "dma_device_type": 2 00:31:56.705 }, 00:31:56.705 { 00:31:56.705 "dma_device_id": "system", 00:31:56.705 "dma_device_type": 1 00:31:56.705 }, 00:31:56.705 { 00:31:56.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:56.705 "dma_device_type": 2 00:31:56.705 }, 00:31:56.705 { 00:31:56.705 "dma_device_id": "system", 00:31:56.705 "dma_device_type": 1 00:31:56.705 }, 00:31:56.705 { 00:31:56.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:56.705 "dma_device_type": 2 00:31:56.705 }, 00:31:56.705 { 00:31:56.705 "dma_device_id": "system", 00:31:56.705 "dma_device_type": 1 00:31:56.705 }, 00:31:56.705 { 00:31:56.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:56.705 "dma_device_type": 2 00:31:56.705 } 00:31:56.705 ], 00:31:56.705 "driver_specific": { 00:31:56.705 "raid": { 00:31:56.705 "uuid": "8ecc948d-bcd9-4aa6-98cc-2455cd34aff0", 00:31:56.706 "strip_size_kb": 64, 00:31:56.706 "state": "online", 00:31:56.706 "raid_level": "concat", 00:31:56.706 "superblock": true, 00:31:56.706 "num_base_bdevs": 4, 00:31:56.706 "num_base_bdevs_discovered": 4, 00:31:56.706 "num_base_bdevs_operational": 4, 00:31:56.706 "base_bdevs_list": [ 00:31:56.706 { 00:31:56.706 "name": "pt1", 00:31:56.706 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:56.706 "is_configured": true, 00:31:56.706 "data_offset": 2048, 00:31:56.706 "data_size": 63488 00:31:56.706 }, 00:31:56.706 { 00:31:56.706 "name": "pt2", 00:31:56.706 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:56.706 "is_configured": true, 
00:31:56.706 "data_offset": 2048, 00:31:56.706 "data_size": 63488 00:31:56.706 }, 00:31:56.706 { 00:31:56.706 "name": "pt3", 00:31:56.706 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:56.706 "is_configured": true, 00:31:56.706 "data_offset": 2048, 00:31:56.706 "data_size": 63488 00:31:56.706 }, 00:31:56.706 { 00:31:56.706 "name": "pt4", 00:31:56.706 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:56.706 "is_configured": true, 00:31:56.706 "data_offset": 2048, 00:31:56.706 "data_size": 63488 00:31:56.706 } 00:31:56.706 ] 00:31:56.706 } 00:31:56.706 } 00:31:56.706 }' 00:31:57.006 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:57.006 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:31:57.006 pt2 00:31:57.006 pt3 00:31:57.006 pt4' 00:31:57.006 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:57.006 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:57.006 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:57.285 "name": "pt1", 00:31:57.285 "aliases": [ 00:31:57.285 "00000000-0000-0000-0000-000000000001" 00:31:57.285 ], 00:31:57.285 "product_name": "passthru", 00:31:57.285 "block_size": 512, 00:31:57.285 "num_blocks": 65536, 00:31:57.285 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:57.285 "assigned_rate_limits": { 00:31:57.285 "rw_ios_per_sec": 0, 00:31:57.285 "rw_mbytes_per_sec": 0, 00:31:57.285 "r_mbytes_per_sec": 0, 00:31:57.285 "w_mbytes_per_sec": 0 00:31:57.285 }, 00:31:57.285 "claimed": true, 00:31:57.285 "claim_type": "exclusive_write", 00:31:57.285 "zoned": false, 00:31:57.285 "supported_io_types": { 00:31:57.285 "read": true, 00:31:57.285 "write": true, 00:31:57.285 "unmap": true, 00:31:57.285 "write_zeroes": true, 00:31:57.285 "flush": true, 00:31:57.285 "reset": true, 00:31:57.285 "compare": false, 00:31:57.285 "compare_and_write": false, 00:31:57.285 "abort": true, 00:31:57.285 "nvme_admin": false, 00:31:57.285 "nvme_io": false 00:31:57.285 }, 00:31:57.285 "memory_domains": [ 00:31:57.285 { 00:31:57.285 "dma_device_id": "system", 00:31:57.285 "dma_device_type": 1 00:31:57.285 }, 00:31:57.285 { 00:31:57.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:57.285 "dma_device_type": 2 00:31:57.285 } 00:31:57.285 ], 00:31:57.285 "driver_specific": { 00:31:57.285 "passthru": { 00:31:57.285 "name": "pt1", 00:31:57.285 "base_bdev_name": "malloc1" 00:31:57.285 } 00:31:57.285 } 00:31:57.285 }' 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:57.285 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:57.285 12:34:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:57.543 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:57.543 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:57.543 12:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:57.543 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:57.543 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:57.543 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:31:57.543 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:57.801 "name": "pt2", 00:31:57.801 "aliases": [ 00:31:57.801 "00000000-0000-0000-0000-000000000002" 00:31:57.801 ], 00:31:57.801 "product_name": "passthru", 00:31:57.801 "block_size": 512, 00:31:57.801 "num_blocks": 65536, 00:31:57.801 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:57.801 "assigned_rate_limits": { 00:31:57.801 "rw_ios_per_sec": 0, 00:31:57.801 "rw_mbytes_per_sec": 0, 00:31:57.801 "r_mbytes_per_sec": 0, 00:31:57.801 "w_mbytes_per_sec": 0 00:31:57.801 }, 00:31:57.801 "claimed": true, 00:31:57.801 "claim_type": "exclusive_write", 00:31:57.801 "zoned": false, 00:31:57.801 "supported_io_types": { 00:31:57.801 "read": true, 00:31:57.801 "write": true, 00:31:57.801 "unmap": true, 00:31:57.801 "write_zeroes": true, 00:31:57.801 "flush": true, 00:31:57.801 "reset": true, 00:31:57.801 "compare": false, 00:31:57.801 "compare_and_write": false, 00:31:57.801 "abort": true, 00:31:57.801 "nvme_admin": false, 00:31:57.801 "nvme_io": false 00:31:57.801 }, 00:31:57.801 "memory_domains": [ 00:31:57.801 { 00:31:57.801 "dma_device_id": "system", 00:31:57.801 "dma_device_type": 1 00:31:57.801 }, 00:31:57.801 { 00:31:57.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:57.801 "dma_device_type": 2 00:31:57.801 } 00:31:57.801 ], 00:31:57.801 "driver_specific": { 00:31:57.801 "passthru": { 00:31:57.801 "name": "pt2", 00:31:57.801 "base_bdev_name": "malloc2" 00:31:57.801 } 00:31:57.801 } 00:31:57.801 }' 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:57.801 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:31:58.058 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:58.317 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:58.317 "name": "pt3", 00:31:58.317 "aliases": [ 00:31:58.317 "00000000-0000-0000-0000-000000000003" 00:31:58.317 ], 00:31:58.317 "product_name": "passthru", 00:31:58.317 "block_size": 512, 00:31:58.317 "num_blocks": 65536, 00:31:58.317 "uuid": "00000000-0000-0000-0000-000000000003", 00:31:58.317 "assigned_rate_limits": { 00:31:58.317 "rw_ios_per_sec": 0, 00:31:58.317 "rw_mbytes_per_sec": 0, 00:31:58.317 "r_mbytes_per_sec": 0, 00:31:58.317 "w_mbytes_per_sec": 0 00:31:58.317 }, 00:31:58.317 "claimed": true, 00:31:58.317 "claim_type": "exclusive_write", 00:31:58.317 "zoned": false, 00:31:58.317 "supported_io_types": { 00:31:58.317 "read": true, 00:31:58.317 "write": true, 00:31:58.317 "unmap": true, 00:31:58.317 "write_zeroes": true, 00:31:58.317 "flush": true, 00:31:58.317 "reset": true, 00:31:58.317 "compare": false, 00:31:58.317 "compare_and_write": false, 00:31:58.317 "abort": true, 00:31:58.317 "nvme_admin": false, 00:31:58.317 "nvme_io": false 00:31:58.317 }, 00:31:58.317 "memory_domains": [ 00:31:58.317 { 00:31:58.317 "dma_device_id": "system", 00:31:58.317 "dma_device_type": 1 00:31:58.317 }, 00:31:58.317 { 00:31:58.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:58.317 "dma_device_type": 2 00:31:58.317 } 00:31:58.317 ], 00:31:58.317 "driver_specific": { 00:31:58.317 "passthru": { 00:31:58.317 "name": "pt3", 00:31:58.317 "base_bdev_name": "malloc3" 00:31:58.317 } 00:31:58.317 } 00:31:58.317 }' 00:31:58.317 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:58.317 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:58.575 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:58.575 12:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:58.575 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:58.575 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:58.575 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:58.575 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:58.575 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:58.575 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:58.575 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:58.834 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:58.834 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:58.834 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:31:58.834 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:59.092 12:34:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:59.092 "name": "pt4", 00:31:59.092 "aliases": [ 00:31:59.092 "00000000-0000-0000-0000-000000000004" 00:31:59.092 ], 00:31:59.092 "product_name": "passthru", 00:31:59.092 "block_size": 512, 00:31:59.092 "num_blocks": 65536, 00:31:59.092 "uuid": "00000000-0000-0000-0000-000000000004", 00:31:59.092 "assigned_rate_limits": { 00:31:59.092 "rw_ios_per_sec": 0, 00:31:59.092 "rw_mbytes_per_sec": 0, 00:31:59.092 "r_mbytes_per_sec": 0, 00:31:59.092 "w_mbytes_per_sec": 0 00:31:59.092 }, 00:31:59.092 "claimed": true, 00:31:59.092 "claim_type": "exclusive_write", 00:31:59.092 "zoned": false, 00:31:59.092 "supported_io_types": { 00:31:59.092 "read": true, 00:31:59.092 "write": true, 00:31:59.092 "unmap": true, 00:31:59.092 "write_zeroes": true, 00:31:59.092 "flush": true, 00:31:59.092 "reset": true, 00:31:59.092 "compare": false, 00:31:59.092 "compare_and_write": false, 00:31:59.092 "abort": true, 00:31:59.092 "nvme_admin": false, 00:31:59.092 "nvme_io": false 00:31:59.092 }, 00:31:59.092 "memory_domains": [ 00:31:59.092 { 00:31:59.092 "dma_device_id": "system", 00:31:59.092 "dma_device_type": 1 00:31:59.092 }, 00:31:59.092 { 00:31:59.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:59.092 "dma_device_type": 2 00:31:59.092 } 00:31:59.092 ], 00:31:59.092 "driver_specific": { 00:31:59.092 "passthru": { 00:31:59.092 "name": "pt4", 00:31:59.092 "base_bdev_name": "malloc4" 00:31:59.092 } 00:31:59.092 } 00:31:59.092 }' 00:31:59.092 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:59.092 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:59.092 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:31:59.092 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:59.092 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:59.350 12:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:31:59.621 [2024-06-07 12:34:23.170931] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8ecc948d-bcd9-4aa6-98cc-2455cd34aff0 '!=' 8ecc948d-bcd9-4aa6-98cc-2455cd34aff0 ']' 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:31:59.621 12:34:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 216879 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 216879 ']' 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 216879 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 216879 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 216879' 00:31:59.621 killing process with pid 216879 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 216879 00:31:59.621 [2024-06-07 12:34:23.234233] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:59.621 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 216879 00:31:59.621 [2024-06-07 12:34:23.234516] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:59.621 [2024-06-07 12:34:23.234582] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:59.621 [2024-06-07 12:34:23.234592] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:31:59.879 [2024-06-07 12:34:23.320691] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:00.138 12:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:32:00.138 00:32:00.138 real 0m17.252s 00:32:00.138 user 0m30.870s 00:32:00.138 sys 0m2.905s 00:32:00.138 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:00.138 12:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:32:00.138 ************************************ 00:32:00.138 END TEST raid_superblock_test 00:32:00.138 ************************************ 00:32:00.138 12:34:23 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:32:00.138 12:34:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:00.138 12:34:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:00.138 12:34:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:00.138 ************************************ 00:32:00.138 START TEST raid_read_error_test 00:32:00.138 ************************************ 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 read 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= 
num_base_bdevs )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev4 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZqFrrK4ret 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=217421 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 217421 /var/tmp/spdk-raid.sock 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 217421 ']' 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
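Each base bdev for the read-error test is a three-layer stack, built by the RPCs that follow, so a fault can be injected beneath the raid without touching the raid code: a malloc bdev, then an error bdev (exposed with an EE_ prefix), then a passthru that the raid claims. A sketch of the first member with the sizes used below (32 MiB malloc, 512-byte blocks):

RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc              # backing store
$RPC bdev_error_create BaseBdev1_malloc                         # injectable layer: EE_BaseBdev1_malloc
$RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1   # member the raid assembles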
00:32:00.138 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:00.138 12:34:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:00.397 [2024-06-07 12:34:23.797434] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:32:00.397 [2024-06-07 12:34:23.797914] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid217421 ] 00:32:00.397 [2024-06-07 12:34:23.947516] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:00.397 [2024-06-07 12:34:24.036036] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:00.655 [2024-06-07 12:34:24.127000] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:00.655 12:34:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:00.655 12:34:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:32:00.655 12:34:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:00.655 12:34:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:32:00.913 BaseBdev1_malloc 00:32:00.913 12:34:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:32:01.171 true 00:32:01.171 12:34:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:32:01.430 [2024-06-07 12:34:24.874056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:32:01.430 [2024-06-07 12:34:24.874442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:01.430 [2024-06-07 12:34:24.874528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:32:01.430 [2024-06-07 12:34:24.874671] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:01.430 [2024-06-07 12:34:24.877325] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:01.430 [2024-06-07 12:34:24.877535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:32:01.430 BaseBdev1 00:32:01.430 12:34:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:01.430 12:34:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:32:01.688 BaseBdev2_malloc 00:32:01.688 12:34:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:32:01.946 true 00:32:01.946 12:34:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:32:02.205 
[2024-06-07 12:34:25.593125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:32:02.205 [2024-06-07 12:34:25.593434] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:02.205 [2024-06-07 12:34:25.593580] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:32:02.205 [2024-06-07 12:34:25.593725] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:02.205 [2024-06-07 12:34:25.596001] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:02.205 [2024-06-07 12:34:25.596179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:32:02.205 BaseBdev2 00:32:02.205 12:34:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:02.205 12:34:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:32:02.205 BaseBdev3_malloc 00:32:02.463 12:34:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:32:02.463 true 00:32:02.721 12:34:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:32:02.721 [2024-06-07 12:34:26.303303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:32:02.721 [2024-06-07 12:34:26.303669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:02.721 [2024-06-07 12:34:26.303766] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:32:02.721 [2024-06-07 12:34:26.303917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:02.721 [2024-06-07 12:34:26.306545] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:02.721 [2024-06-07 12:34:26.306753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:32:02.721 BaseBdev3 00:32:02.721 12:34:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:02.721 12:34:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:32:02.978 BaseBdev4_malloc 00:32:03.236 12:34:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:32:03.236 true 00:32:03.236 12:34:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:32:03.494 [2024-06-07 12:34:27.035182] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:32:03.494 [2024-06-07 12:34:27.035593] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:03.494 [2024-06-07 12:34:27.035690] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:32:03.494 [2024-06-07 12:34:27.035851] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:03.494 
[2024-06-07 12:34:27.038279] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:03.494 [2024-06-07 12:34:27.038476] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:32:03.494 BaseBdev4 00:32:03.494 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:32:03.752 [2024-06-07 12:34:27.251355] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:03.752 [2024-06-07 12:34:27.253664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:03.752 [2024-06-07 12:34:27.253871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:03.752 [2024-06-07 12:34:27.253955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:32:03.752 [2024-06-07 12:34:27.254236] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009080 00:32:03.752 [2024-06-07 12:34:27.254392] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:32:03.753 [2024-06-07 12:34:27.254598] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:32:03.753 [2024-06-07 12:34:27.254970] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009080 00:32:03.753 [2024-06-07 12:34:27.255072] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009080 00:32:03.753 [2024-06-07 12:34:27.255365] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:03.753 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:04.010 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:04.010 "name": "raid_bdev1", 00:32:04.010 "uuid": "d29ccff9-b8cf-4151-b180-ad11245e5d0c", 00:32:04.010 "strip_size_kb": 64, 00:32:04.010 "state": "online", 00:32:04.010 "raid_level": "concat", 00:32:04.010 "superblock": true, 00:32:04.010 
"num_base_bdevs": 4, 00:32:04.010 "num_base_bdevs_discovered": 4, 00:32:04.010 "num_base_bdevs_operational": 4, 00:32:04.010 "base_bdevs_list": [ 00:32:04.010 { 00:32:04.010 "name": "BaseBdev1", 00:32:04.010 "uuid": "2692b985-a1d2-571b-95af-ac309e0ae9af", 00:32:04.010 "is_configured": true, 00:32:04.010 "data_offset": 2048, 00:32:04.010 "data_size": 63488 00:32:04.010 }, 00:32:04.010 { 00:32:04.010 "name": "BaseBdev2", 00:32:04.010 "uuid": "863b683f-36cd-56bf-ac83-ee08e76bf648", 00:32:04.010 "is_configured": true, 00:32:04.010 "data_offset": 2048, 00:32:04.010 "data_size": 63488 00:32:04.010 }, 00:32:04.010 { 00:32:04.010 "name": "BaseBdev3", 00:32:04.010 "uuid": "c2904d2b-12b7-5004-b70e-8809eee7ff21", 00:32:04.010 "is_configured": true, 00:32:04.010 "data_offset": 2048, 00:32:04.010 "data_size": 63488 00:32:04.010 }, 00:32:04.010 { 00:32:04.010 "name": "BaseBdev4", 00:32:04.010 "uuid": "291dc9b1-d311-51fa-83e4-59b02df127a1", 00:32:04.010 "is_configured": true, 00:32:04.010 "data_offset": 2048, 00:32:04.010 "data_size": 63488 00:32:04.010 } 00:32:04.010 ] 00:32:04.010 }' 00:32:04.010 12:34:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:04.010 12:34:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:04.577 12:34:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:32:04.577 12:34:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:32:04.577 [2024-06-07 12:34:28.147769] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:32:05.566 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:05.825 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:05.825 12:34:29 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:06.083 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:06.083 "name": "raid_bdev1", 00:32:06.083 "uuid": "d29ccff9-b8cf-4151-b180-ad11245e5d0c", 00:32:06.083 "strip_size_kb": 64, 00:32:06.083 "state": "online", 00:32:06.083 "raid_level": "concat", 00:32:06.083 "superblock": true, 00:32:06.083 "num_base_bdevs": 4, 00:32:06.083 "num_base_bdevs_discovered": 4, 00:32:06.083 "num_base_bdevs_operational": 4, 00:32:06.083 "base_bdevs_list": [ 00:32:06.083 { 00:32:06.083 "name": "BaseBdev1", 00:32:06.083 "uuid": "2692b985-a1d2-571b-95af-ac309e0ae9af", 00:32:06.083 "is_configured": true, 00:32:06.083 "data_offset": 2048, 00:32:06.083 "data_size": 63488 00:32:06.083 }, 00:32:06.083 { 00:32:06.083 "name": "BaseBdev2", 00:32:06.083 "uuid": "863b683f-36cd-56bf-ac83-ee08e76bf648", 00:32:06.083 "is_configured": true, 00:32:06.083 "data_offset": 2048, 00:32:06.083 "data_size": 63488 00:32:06.083 }, 00:32:06.083 { 00:32:06.083 "name": "BaseBdev3", 00:32:06.083 "uuid": "c2904d2b-12b7-5004-b70e-8809eee7ff21", 00:32:06.083 "is_configured": true, 00:32:06.083 "data_offset": 2048, 00:32:06.083 "data_size": 63488 00:32:06.083 }, 00:32:06.083 { 00:32:06.083 "name": "BaseBdev4", 00:32:06.083 "uuid": "291dc9b1-d311-51fa-83e4-59b02df127a1", 00:32:06.083 "is_configured": true, 00:32:06.083 "data_offset": 2048, 00:32:06.083 "data_size": 63488 00:32:06.083 } 00:32:06.083 ] 00:32:06.083 }' 00:32:06.083 12:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:06.083 12:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:06.651 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:06.909 [2024-06-07 12:34:30.408289] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:06.909 [2024-06-07 12:34:30.408345] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:06.909 [2024-06-07 12:34:30.409480] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:06.909 [2024-06-07 12:34:30.409539] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:06.909 [2024-06-07 12:34:30.409572] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:06.909 [2024-06-07 12:34:30.409583] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009080 name raid_bdev1, state offline 00:32:06.909 0 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 217421 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 217421 ']' 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 217421 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 217421 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 217421' 00:32:06.909 killing process with pid 217421 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 217421 00:32:06.909 [2024-06-07 12:34:30.459694] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:06.909 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 217421 00:32:06.909 [2024-06-07 12:34:30.527916] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZqFrrK4ret 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:32:07.476 00:32:07.476 real 0m7.178s 00:32:07.476 user 0m11.431s 00:32:07.476 sys 0m1.262s 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:07.476 12:34:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:07.476 ************************************ 00:32:07.476 END TEST raid_read_error_test 00:32:07.476 ************************************ 00:32:07.476 12:34:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:32:07.476 12:34:30 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:07.476 12:34:30 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:07.476 12:34:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:07.476 ************************************ 00:32:07.476 START TEST raid_write_error_test 00:32:07.476 ************************************ 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 write 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev4 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:32:07.476 12:34:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.gxxmVyOPwg 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=217615 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 217615 /var/tmp/spdk-raid.sock 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 217615 ']' 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:07.476 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:07.476 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:07.477 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:07.477 [2024-06-07 12:34:31.046783] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
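# --- Annotation (not captured output): a minimal sketch of the RPC flow this
# write-path error test drives, reconstructed only from the rpc.py/bdevperf.py
# calls visible in the surrounding trace. The repo path and -s control socket
# match the trace; the explicit loop is an assumption standing in for the
# script's per-base-bdev iteration.
#   RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
#   SOCK=/var/tmp/spdk-raid.sock
#   for i in 1 2 3 4; do
#     # malloc backing bdev -> error bdev -> passthru bdev (EE_* wraps the malloc)
#     $RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
#     $RPC -s $SOCK bdev_error_create BaseBdev${i}_malloc
#     $RPC -s $SOCK bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
#   done
#   # assemble the concat array with superblock (-s) and 64k strip size (-z 64)
#   $RPC -s $SOCK bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
#   # inject write failures on the first base bdev, then run I/O and tear down
#   $RPC -s $SOCK bdev_error_inject_error EE_BaseBdev1_malloc write failure
#   /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s $SOCK perform_tests
#   $RPC -s $SOCK bdev_raid_delete raid_bdev1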
00:32:07.477 [2024-06-07 12:34:31.047673] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid217615 ] 00:32:07.734 [2024-06-07 12:34:31.197884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:07.734 [2024-06-07 12:34:31.286344] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:07.734 [2024-06-07 12:34:31.367199] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:08.668 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:08.668 12:34:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:32:08.668 12:34:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:08.668 12:34:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:32:08.668 BaseBdev1_malloc 00:32:08.668 12:34:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:32:08.926 true 00:32:08.926 12:34:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:32:09.184 [2024-06-07 12:34:32.702446] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:32:09.184 [2024-06-07 12:34:32.702579] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:09.184 [2024-06-07 12:34:32.702630] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:32:09.185 [2024-06-07 12:34:32.702696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:09.185 [2024-06-07 12:34:32.705201] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:09.185 [2024-06-07 12:34:32.705278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:32:09.185 BaseBdev1 00:32:09.185 12:34:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:09.185 12:34:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:32:09.442 BaseBdev2_malloc 00:32:09.442 12:34:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:32:09.700 true 00:32:09.700 12:34:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:32:09.958 [2024-06-07 12:34:33.422103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:32:09.958 [2024-06-07 12:34:33.422239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:09.958 [2024-06-07 12:34:33.422290] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:32:09.958 [2024-06-07 12:34:33.422343] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:09.958 [2024-06-07 12:34:33.424660] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:09.958 [2024-06-07 12:34:33.424711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:32:09.958 BaseBdev2 00:32:09.958 12:34:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:09.958 12:34:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:32:10.216 BaseBdev3_malloc 00:32:10.216 12:34:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:32:10.216 true 00:32:10.216 12:34:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:32:10.473 [2024-06-07 12:34:34.036148] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:32:10.473 [2024-06-07 12:34:34.036272] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:10.473 [2024-06-07 12:34:34.036323] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:32:10.473 [2024-06-07 12:34:34.036390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:10.473 [2024-06-07 12:34:34.038591] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:10.473 [2024-06-07 12:34:34.038650] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:32:10.473 BaseBdev3 00:32:10.473 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:32:10.473 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:32:10.731 BaseBdev4_malloc 00:32:10.731 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:32:10.990 true 00:32:10.990 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:32:11.258 [2024-06-07 12:34:34.715677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:32:11.258 [2024-06-07 12:34:34.715997] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:11.258 [2024-06-07 12:34:34.716116] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:32:11.258 [2024-06-07 12:34:34.716218] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:11.258 [2024-06-07 12:34:34.718658] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:11.258 [2024-06-07 12:34:34.718791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:32:11.258 BaseBdev4 00:32:11.258 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:32:11.525 [2024-06-07 12:34:34.931859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:11.525 [2024-06-07 12:34:34.933989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:11.525 [2024-06-07 12:34:34.934051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:11.525 [2024-06-07 12:34:34.934094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:32:11.525 [2024-06-07 12:34:34.934277] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009080 00:32:11.525 [2024-06-07 12:34:34.934289] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:32:11.525 [2024-06-07 12:34:34.934421] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:32:11.525 [2024-06-07 12:34:34.934715] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009080 00:32:11.525 [2024-06-07 12:34:34.934730] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009080 00:32:11.525 [2024-06-07 12:34:34.934868] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:11.525 12:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:11.783 12:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:11.783 "name": "raid_bdev1", 00:32:11.783 "uuid": "a0dab688-0a9d-4239-8465-3bb6d71c0ae7", 00:32:11.783 "strip_size_kb": 64, 00:32:11.783 "state": "online", 00:32:11.783 "raid_level": "concat", 00:32:11.783 "superblock": true, 00:32:11.783 "num_base_bdevs": 4, 00:32:11.783 "num_base_bdevs_discovered": 4, 00:32:11.783 "num_base_bdevs_operational": 4, 00:32:11.783 "base_bdevs_list": [ 00:32:11.783 { 00:32:11.783 "name": "BaseBdev1", 00:32:11.783 "uuid": "282de95a-c4ba-5b4d-8b90-40715210d3b0", 00:32:11.783 "is_configured": true, 00:32:11.783 "data_offset": 2048, 00:32:11.783 "data_size": 63488 00:32:11.783 }, 00:32:11.783 { 
00:32:11.783 "name": "BaseBdev2", 00:32:11.783 "uuid": "2063423d-715c-5142-be64-624774dfb246", 00:32:11.783 "is_configured": true, 00:32:11.783 "data_offset": 2048, 00:32:11.783 "data_size": 63488 00:32:11.783 }, 00:32:11.783 { 00:32:11.783 "name": "BaseBdev3", 00:32:11.783 "uuid": "b3dc6ead-d168-5fe4-ad19-c9ed512f4e16", 00:32:11.783 "is_configured": true, 00:32:11.783 "data_offset": 2048, 00:32:11.783 "data_size": 63488 00:32:11.783 }, 00:32:11.783 { 00:32:11.783 "name": "BaseBdev4", 00:32:11.783 "uuid": "bf9cd54d-70ab-5c3d-99a2-8df2b5e1cb5d", 00:32:11.783 "is_configured": true, 00:32:11.783 "data_offset": 2048, 00:32:11.783 "data_size": 63488 00:32:11.783 } 00:32:11.783 ] 00:32:11.784 }' 00:32:11.784 12:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:11.784 12:34:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:12.351 12:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:32:12.351 12:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:32:12.351 [2024-06-07 12:34:35.888361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:32:13.287 12:34:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:32:13.545 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:32:13.545 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:13.546 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:13.804 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:13.804 "name": "raid_bdev1", 00:32:13.804 "uuid": "a0dab688-0a9d-4239-8465-3bb6d71c0ae7", 00:32:13.804 "strip_size_kb": 64, 00:32:13.804 "state": "online", 00:32:13.804 
"raid_level": "concat", 00:32:13.804 "superblock": true, 00:32:13.804 "num_base_bdevs": 4, 00:32:13.804 "num_base_bdevs_discovered": 4, 00:32:13.804 "num_base_bdevs_operational": 4, 00:32:13.804 "base_bdevs_list": [ 00:32:13.804 { 00:32:13.804 "name": "BaseBdev1", 00:32:13.804 "uuid": "282de95a-c4ba-5b4d-8b90-40715210d3b0", 00:32:13.804 "is_configured": true, 00:32:13.804 "data_offset": 2048, 00:32:13.804 "data_size": 63488 00:32:13.804 }, 00:32:13.804 { 00:32:13.804 "name": "BaseBdev2", 00:32:13.804 "uuid": "2063423d-715c-5142-be64-624774dfb246", 00:32:13.804 "is_configured": true, 00:32:13.804 "data_offset": 2048, 00:32:13.804 "data_size": 63488 00:32:13.804 }, 00:32:13.804 { 00:32:13.804 "name": "BaseBdev3", 00:32:13.804 "uuid": "b3dc6ead-d168-5fe4-ad19-c9ed512f4e16", 00:32:13.804 "is_configured": true, 00:32:13.804 "data_offset": 2048, 00:32:13.804 "data_size": 63488 00:32:13.804 }, 00:32:13.804 { 00:32:13.804 "name": "BaseBdev4", 00:32:13.804 "uuid": "bf9cd54d-70ab-5c3d-99a2-8df2b5e1cb5d", 00:32:13.804 "is_configured": true, 00:32:13.804 "data_offset": 2048, 00:32:13.804 "data_size": 63488 00:32:13.804 } 00:32:13.804 ] 00:32:13.804 }' 00:32:13.804 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:13.804 12:34:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:14.371 12:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:14.629 [2024-06-07 12:34:38.123589] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:14.629 [2024-06-07 12:34:38.123648] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:14.629 [2024-06-07 12:34:38.124853] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:14.629 [2024-06-07 12:34:38.124918] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:14.629 [2024-06-07 12:34:38.124953] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:14.629 [2024-06-07 12:34:38.124962] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009080 name raid_bdev1, state offline 00:32:14.629 0 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 217615 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 217615 ']' 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 217615 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 217615 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:14.629 killing process with pid 217615 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 217615' 00:32:14.629 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 217615 00:32:14.629 12:34:38 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 217615 00:32:14.629 [2024-06-07 12:34:38.170784] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:14.629 [2024-06-07 12:34:38.238762] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.gxxmVyOPwg 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:32:15.195 00:32:15.195 real 0m7.638s 00:32:15.195 user 0m11.938s 00:32:15.195 sys 0m1.315s 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:15.195 12:34:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:32:15.195 ************************************ 00:32:15.195 END TEST raid_write_error_test 00:32:15.195 ************************************ 00:32:15.195 12:34:38 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:32:15.195 12:34:38 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:32:15.195 12:34:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:15.195 12:34:38 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:15.195 12:34:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:15.195 ************************************ 00:32:15.195 START TEST raid_state_function_test 00:32:15.195 ************************************ 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 false 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # echo BaseBdev4 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=217809 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:32:15.195 Process raid pid: 217809 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 217809' 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 217809 /var/tmp/spdk-raid.sock 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 217809 ']' 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:15.195 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:15.195 12:34:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:15.195 [2024-06-07 12:34:38.752077] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
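# --- Annotation (not captured output): a minimal sketch of the state check the
# raid_state_function_test performs, reconstructed from the rpc.py calls traced
# below; the socket path, bdev names, and jq filter are quoted from the trace,
# and the pipe between the two commands is an assumption.
#   RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
#   SOCK=/var/tmp/spdk-raid.sock
#   # creating the raid1 array before its base bdevs exist leaves it "configuring"
#   # with num_base_bdevs_discovered = 0
#   $RPC -s $SOCK bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
#   $RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
#   # adding base bdevs one at a time bumps num_base_bdevs_discovered toward 4
#   $RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev1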
00:32:15.195 [2024-06-07 12:34:38.753022] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:15.453 [2024-06-07 12:34:38.896819] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:15.453 [2024-06-07 12:34:38.996608] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:15.453 [2024-06-07 12:34:39.075851] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:16.389 [2024-06-07 12:34:39.957828] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:16.389 [2024-06-07 12:34:39.957946] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:16.389 [2024-06-07 12:34:39.957958] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:16.389 [2024-06-07 12:34:39.957980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:16.389 [2024-06-07 12:34:39.957987] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:16.389 [2024-06-07 12:34:39.958033] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:16.389 [2024-06-07 12:34:39.958041] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:32:16.389 [2024-06-07 12:34:39.958066] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:16.389 12:34:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:32:16.647 12:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:16.647 "name": "Existed_Raid", 00:32:16.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:16.647 "strip_size_kb": 0, 00:32:16.647 "state": "configuring", 00:32:16.647 "raid_level": "raid1", 00:32:16.647 "superblock": false, 00:32:16.647 "num_base_bdevs": 4, 00:32:16.647 "num_base_bdevs_discovered": 0, 00:32:16.647 "num_base_bdevs_operational": 4, 00:32:16.647 "base_bdevs_list": [ 00:32:16.647 { 00:32:16.647 "name": "BaseBdev1", 00:32:16.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:16.647 "is_configured": false, 00:32:16.647 "data_offset": 0, 00:32:16.647 "data_size": 0 00:32:16.647 }, 00:32:16.647 { 00:32:16.647 "name": "BaseBdev2", 00:32:16.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:16.647 "is_configured": false, 00:32:16.647 "data_offset": 0, 00:32:16.647 "data_size": 0 00:32:16.647 }, 00:32:16.647 { 00:32:16.647 "name": "BaseBdev3", 00:32:16.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:16.647 "is_configured": false, 00:32:16.647 "data_offset": 0, 00:32:16.647 "data_size": 0 00:32:16.647 }, 00:32:16.647 { 00:32:16.647 "name": "BaseBdev4", 00:32:16.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:16.647 "is_configured": false, 00:32:16.647 "data_offset": 0, 00:32:16.647 "data_size": 0 00:32:16.647 } 00:32:16.647 ] 00:32:16.647 }' 00:32:16.647 12:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:16.647 12:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:17.213 12:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:17.470 [2024-06-07 12:34:40.961856] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:17.470 [2024-06-07 12:34:40.961924] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:32:17.470 12:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:17.727 [2024-06-07 12:34:41.177922] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:17.727 [2024-06-07 12:34:41.178053] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:17.727 [2024-06-07 12:34:41.178065] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:17.727 [2024-06-07 12:34:41.178098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:17.727 [2024-06-07 12:34:41.178107] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:17.727 [2024-06-07 12:34:41.178128] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:17.727 [2024-06-07 12:34:41.178136] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:32:17.727 [2024-06-07 12:34:41.178169] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:32:17.727 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev1 00:32:17.984 [2024-06-07 12:34:41.409969] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:17.984 BaseBdev1 00:32:17.984 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:32:17.984 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:32:17.984 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:17.984 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:17.984 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:17.984 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:17.984 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:18.241 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:32:18.498 [ 00:32:18.498 { 00:32:18.498 "name": "BaseBdev1", 00:32:18.498 "aliases": [ 00:32:18.498 "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c" 00:32:18.498 ], 00:32:18.498 "product_name": "Malloc disk", 00:32:18.498 "block_size": 512, 00:32:18.498 "num_blocks": 65536, 00:32:18.498 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:18.498 "assigned_rate_limits": { 00:32:18.498 "rw_ios_per_sec": 0, 00:32:18.498 "rw_mbytes_per_sec": 0, 00:32:18.498 "r_mbytes_per_sec": 0, 00:32:18.498 "w_mbytes_per_sec": 0 00:32:18.498 }, 00:32:18.498 "claimed": true, 00:32:18.498 "claim_type": "exclusive_write", 00:32:18.498 "zoned": false, 00:32:18.498 "supported_io_types": { 00:32:18.498 "read": true, 00:32:18.498 "write": true, 00:32:18.498 "unmap": true, 00:32:18.498 "write_zeroes": true, 00:32:18.498 "flush": true, 00:32:18.498 "reset": true, 00:32:18.498 "compare": false, 00:32:18.498 "compare_and_write": false, 00:32:18.498 "abort": true, 00:32:18.498 "nvme_admin": false, 00:32:18.498 "nvme_io": false 00:32:18.498 }, 00:32:18.498 "memory_domains": [ 00:32:18.498 { 00:32:18.498 "dma_device_id": "system", 00:32:18.498 "dma_device_type": 1 00:32:18.498 }, 00:32:18.498 { 00:32:18.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:18.498 "dma_device_type": 2 00:32:18.498 } 00:32:18.498 ], 00:32:18.498 "driver_specific": {} 00:32:18.498 } 00:32:18.498 ] 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:18.498 
12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:18.498 12:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:18.757 12:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:18.757 "name": "Existed_Raid", 00:32:18.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:18.757 "strip_size_kb": 0, 00:32:18.757 "state": "configuring", 00:32:18.757 "raid_level": "raid1", 00:32:18.757 "superblock": false, 00:32:18.757 "num_base_bdevs": 4, 00:32:18.757 "num_base_bdevs_discovered": 1, 00:32:18.757 "num_base_bdevs_operational": 4, 00:32:18.757 "base_bdevs_list": [ 00:32:18.757 { 00:32:18.757 "name": "BaseBdev1", 00:32:18.757 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:18.757 "is_configured": true, 00:32:18.757 "data_offset": 0, 00:32:18.757 "data_size": 65536 00:32:18.757 }, 00:32:18.757 { 00:32:18.757 "name": "BaseBdev2", 00:32:18.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:18.757 "is_configured": false, 00:32:18.757 "data_offset": 0, 00:32:18.757 "data_size": 0 00:32:18.757 }, 00:32:18.757 { 00:32:18.757 "name": "BaseBdev3", 00:32:18.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:18.757 "is_configured": false, 00:32:18.757 "data_offset": 0, 00:32:18.757 "data_size": 0 00:32:18.757 }, 00:32:18.757 { 00:32:18.757 "name": "BaseBdev4", 00:32:18.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:18.757 "is_configured": false, 00:32:18.757 "data_offset": 0, 00:32:18.757 "data_size": 0 00:32:18.757 } 00:32:18.757 ] 00:32:18.757 }' 00:32:18.757 12:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:18.757 12:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:19.324 12:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:19.588 [2024-06-07 12:34:43.038262] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:19.588 [2024-06-07 12:34:43.038357] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:32:19.588 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:19.845 [2024-06-07 12:34:43.310337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:19.845 [2024-06-07 12:34:43.312341] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:19.845 [2024-06-07 12:34:43.312440] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:19.845 [2024-06-07 12:34:43.312453] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:19.845 [2024-06-07 12:34:43.312499] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:19.845 [2024-06-07 12:34:43.312509] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:32:19.845 [2024-06-07 12:34:43.312529] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:19.845 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:20.103 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:20.103 "name": "Existed_Raid", 00:32:20.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:20.103 "strip_size_kb": 0, 00:32:20.103 "state": "configuring", 00:32:20.103 "raid_level": "raid1", 00:32:20.103 "superblock": false, 00:32:20.103 "num_base_bdevs": 4, 00:32:20.103 "num_base_bdevs_discovered": 1, 00:32:20.103 "num_base_bdevs_operational": 4, 00:32:20.103 "base_bdevs_list": [ 00:32:20.103 { 00:32:20.103 "name": "BaseBdev1", 00:32:20.103 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:20.103 "is_configured": true, 00:32:20.103 "data_offset": 0, 00:32:20.103 "data_size": 65536 00:32:20.103 }, 00:32:20.103 { 00:32:20.103 "name": "BaseBdev2", 00:32:20.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:20.103 "is_configured": false, 00:32:20.103 "data_offset": 0, 00:32:20.103 "data_size": 0 00:32:20.103 }, 00:32:20.103 { 00:32:20.103 "name": "BaseBdev3", 00:32:20.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:20.103 "is_configured": false, 00:32:20.103 "data_offset": 0, 00:32:20.103 "data_size": 0 00:32:20.103 }, 00:32:20.103 { 00:32:20.103 "name": "BaseBdev4", 00:32:20.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:20.103 "is_configured": false, 00:32:20.103 "data_offset": 0, 00:32:20.103 "data_size": 0 00:32:20.103 } 00:32:20.103 ] 00:32:20.103 }' 00:32:20.103 12:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:20.103 
12:34:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:20.668 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:32:20.925 [2024-06-07 12:34:44.410964] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:20.925 BaseBdev2 00:32:20.925 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:32:20.925 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:32:20.925 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:20.925 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:20.925 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:20.925 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:20.926 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:21.183 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:32:21.442 [ 00:32:21.442 { 00:32:21.442 "name": "BaseBdev2", 00:32:21.442 "aliases": [ 00:32:21.442 "f6a995de-d582-4f28-8d6b-6840fcf30b85" 00:32:21.442 ], 00:32:21.442 "product_name": "Malloc disk", 00:32:21.442 "block_size": 512, 00:32:21.442 "num_blocks": 65536, 00:32:21.443 "uuid": "f6a995de-d582-4f28-8d6b-6840fcf30b85", 00:32:21.443 "assigned_rate_limits": { 00:32:21.443 "rw_ios_per_sec": 0, 00:32:21.443 "rw_mbytes_per_sec": 0, 00:32:21.443 "r_mbytes_per_sec": 0, 00:32:21.443 "w_mbytes_per_sec": 0 00:32:21.443 }, 00:32:21.443 "claimed": true, 00:32:21.443 "claim_type": "exclusive_write", 00:32:21.443 "zoned": false, 00:32:21.443 "supported_io_types": { 00:32:21.443 "read": true, 00:32:21.443 "write": true, 00:32:21.443 "unmap": true, 00:32:21.443 "write_zeroes": true, 00:32:21.443 "flush": true, 00:32:21.443 "reset": true, 00:32:21.443 "compare": false, 00:32:21.443 "compare_and_write": false, 00:32:21.443 "abort": true, 00:32:21.443 "nvme_admin": false, 00:32:21.443 "nvme_io": false 00:32:21.443 }, 00:32:21.443 "memory_domains": [ 00:32:21.443 { 00:32:21.443 "dma_device_id": "system", 00:32:21.443 "dma_device_type": 1 00:32:21.443 }, 00:32:21.443 { 00:32:21.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:21.443 "dma_device_type": 2 00:32:21.443 } 00:32:21.443 ], 00:32:21.443 "driver_specific": {} 00:32:21.443 } 00:32:21.443 ] 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:21.443 12:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:21.701 12:34:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:21.701 "name": "Existed_Raid", 00:32:21.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:21.701 "strip_size_kb": 0, 00:32:21.701 "state": "configuring", 00:32:21.701 "raid_level": "raid1", 00:32:21.701 "superblock": false, 00:32:21.701 "num_base_bdevs": 4, 00:32:21.701 "num_base_bdevs_discovered": 2, 00:32:21.701 "num_base_bdevs_operational": 4, 00:32:21.701 "base_bdevs_list": [ 00:32:21.701 { 00:32:21.702 "name": "BaseBdev1", 00:32:21.702 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:21.702 "is_configured": true, 00:32:21.702 "data_offset": 0, 00:32:21.702 "data_size": 65536 00:32:21.702 }, 00:32:21.702 { 00:32:21.702 "name": "BaseBdev2", 00:32:21.702 "uuid": "f6a995de-d582-4f28-8d6b-6840fcf30b85", 00:32:21.702 "is_configured": true, 00:32:21.702 "data_offset": 0, 00:32:21.702 "data_size": 65536 00:32:21.702 }, 00:32:21.702 { 00:32:21.702 "name": "BaseBdev3", 00:32:21.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:21.702 "is_configured": false, 00:32:21.702 "data_offset": 0, 00:32:21.702 "data_size": 0 00:32:21.702 }, 00:32:21.702 { 00:32:21.702 "name": "BaseBdev4", 00:32:21.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:21.702 "is_configured": false, 00:32:21.702 "data_offset": 0, 00:32:21.702 "data_size": 0 00:32:21.702 } 00:32:21.702 ] 00:32:21.702 }' 00:32:21.702 12:34:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:21.702 12:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:22.269 12:34:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:32:22.527 [2024-06-07 12:34:46.044958] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:22.527 BaseBdev3 00:32:22.527 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:32:22.527 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:32:22.527 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:22.527 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:22.527 12:34:46 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:22.527 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:22.527 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:22.786 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:32:23.046 [ 00:32:23.046 { 00:32:23.046 "name": "BaseBdev3", 00:32:23.046 "aliases": [ 00:32:23.046 "1a6b7a17-de90-491c-89f5-16181e55b72b" 00:32:23.046 ], 00:32:23.046 "product_name": "Malloc disk", 00:32:23.046 "block_size": 512, 00:32:23.046 "num_blocks": 65536, 00:32:23.046 "uuid": "1a6b7a17-de90-491c-89f5-16181e55b72b", 00:32:23.046 "assigned_rate_limits": { 00:32:23.046 "rw_ios_per_sec": 0, 00:32:23.046 "rw_mbytes_per_sec": 0, 00:32:23.046 "r_mbytes_per_sec": 0, 00:32:23.046 "w_mbytes_per_sec": 0 00:32:23.046 }, 00:32:23.046 "claimed": true, 00:32:23.046 "claim_type": "exclusive_write", 00:32:23.046 "zoned": false, 00:32:23.046 "supported_io_types": { 00:32:23.046 "read": true, 00:32:23.046 "write": true, 00:32:23.046 "unmap": true, 00:32:23.046 "write_zeroes": true, 00:32:23.046 "flush": true, 00:32:23.046 "reset": true, 00:32:23.046 "compare": false, 00:32:23.046 "compare_and_write": false, 00:32:23.046 "abort": true, 00:32:23.046 "nvme_admin": false, 00:32:23.046 "nvme_io": false 00:32:23.046 }, 00:32:23.046 "memory_domains": [ 00:32:23.046 { 00:32:23.046 "dma_device_id": "system", 00:32:23.046 "dma_device_type": 1 00:32:23.046 }, 00:32:23.046 { 00:32:23.046 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:23.046 "dma_device_type": 2 00:32:23.046 } 00:32:23.046 ], 00:32:23.046 "driver_specific": {} 00:32:23.046 } 00:32:23.046 ] 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:23.046 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:23.304 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:23.304 "name": "Existed_Raid", 00:32:23.304 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:23.304 "strip_size_kb": 0, 00:32:23.304 "state": "configuring", 00:32:23.304 "raid_level": "raid1", 00:32:23.304 "superblock": false, 00:32:23.304 "num_base_bdevs": 4, 00:32:23.304 "num_base_bdevs_discovered": 3, 00:32:23.304 "num_base_bdevs_operational": 4, 00:32:23.304 "base_bdevs_list": [ 00:32:23.304 { 00:32:23.304 "name": "BaseBdev1", 00:32:23.304 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:23.304 "is_configured": true, 00:32:23.304 "data_offset": 0, 00:32:23.305 "data_size": 65536 00:32:23.305 }, 00:32:23.305 { 00:32:23.305 "name": "BaseBdev2", 00:32:23.305 "uuid": "f6a995de-d582-4f28-8d6b-6840fcf30b85", 00:32:23.305 "is_configured": true, 00:32:23.305 "data_offset": 0, 00:32:23.305 "data_size": 65536 00:32:23.305 }, 00:32:23.305 { 00:32:23.305 "name": "BaseBdev3", 00:32:23.305 "uuid": "1a6b7a17-de90-491c-89f5-16181e55b72b", 00:32:23.305 "is_configured": true, 00:32:23.305 "data_offset": 0, 00:32:23.305 "data_size": 65536 00:32:23.305 }, 00:32:23.305 { 00:32:23.305 "name": "BaseBdev4", 00:32:23.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:23.305 "is_configured": false, 00:32:23.305 "data_offset": 0, 00:32:23.305 "data_size": 0 00:32:23.305 } 00:32:23.305 ] 00:32:23.305 }' 00:32:23.305 12:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:23.305 12:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:23.872 12:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:32:24.130 [2024-06-07 12:34:47.550876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:32:24.130 [2024-06-07 12:34:47.550961] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:32:24.130 [2024-06-07 12:34:47.550972] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:32:24.130 [2024-06-07 12:34:47.551114] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:32:24.130 [2024-06-07 12:34:47.551436] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:32:24.130 [2024-06-07 12:34:47.551447] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:32:24.130 [2024-06-07 12:34:47.551671] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:24.130 BaseBdev4 00:32:24.130 12:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:32:24.130 12:34:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:32:24.130 12:34:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:24.130 12:34:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:24.130 12:34:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:24.130 12:34:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:24.130 12:34:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:24.389 12:34:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:32:24.649 [ 00:32:24.649 { 00:32:24.649 "name": "BaseBdev4", 00:32:24.649 "aliases": [ 00:32:24.649 "c26f4165-3c72-474d-94fc-65a2c93b3ea7" 00:32:24.649 ], 00:32:24.649 "product_name": "Malloc disk", 00:32:24.649 "block_size": 512, 00:32:24.649 "num_blocks": 65536, 00:32:24.649 "uuid": "c26f4165-3c72-474d-94fc-65a2c93b3ea7", 00:32:24.649 "assigned_rate_limits": { 00:32:24.649 "rw_ios_per_sec": 0, 00:32:24.649 "rw_mbytes_per_sec": 0, 00:32:24.649 "r_mbytes_per_sec": 0, 00:32:24.649 "w_mbytes_per_sec": 0 00:32:24.649 }, 00:32:24.649 "claimed": true, 00:32:24.649 "claim_type": "exclusive_write", 00:32:24.649 "zoned": false, 00:32:24.649 "supported_io_types": { 00:32:24.649 "read": true, 00:32:24.649 "write": true, 00:32:24.649 "unmap": true, 00:32:24.649 "write_zeroes": true, 00:32:24.649 "flush": true, 00:32:24.649 "reset": true, 00:32:24.649 "compare": false, 00:32:24.649 "compare_and_write": false, 00:32:24.649 "abort": true, 00:32:24.649 "nvme_admin": false, 00:32:24.649 "nvme_io": false 00:32:24.649 }, 00:32:24.649 "memory_domains": [ 00:32:24.649 { 00:32:24.649 "dma_device_id": "system", 00:32:24.649 "dma_device_type": 1 00:32:24.649 }, 00:32:24.649 { 00:32:24.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:24.649 "dma_device_type": 2 00:32:24.649 } 00:32:24.649 ], 00:32:24.649 "driver_specific": {} 00:32:24.649 } 00:32:24.649 ] 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:24.649 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:32:24.908 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:24.908 "name": "Existed_Raid", 00:32:24.908 "uuid": "fc99cff1-b7ae-4c83-bf50-a435449e3a53", 00:32:24.908 "strip_size_kb": 0, 00:32:24.908 "state": "online", 00:32:24.908 "raid_level": "raid1", 00:32:24.908 "superblock": false, 00:32:24.908 "num_base_bdevs": 4, 00:32:24.908 "num_base_bdevs_discovered": 4, 00:32:24.908 "num_base_bdevs_operational": 4, 00:32:24.908 "base_bdevs_list": [ 00:32:24.908 { 00:32:24.908 "name": "BaseBdev1", 00:32:24.908 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:24.908 "is_configured": true, 00:32:24.908 "data_offset": 0, 00:32:24.908 "data_size": 65536 00:32:24.908 }, 00:32:24.908 { 00:32:24.908 "name": "BaseBdev2", 00:32:24.908 "uuid": "f6a995de-d582-4f28-8d6b-6840fcf30b85", 00:32:24.908 "is_configured": true, 00:32:24.908 "data_offset": 0, 00:32:24.908 "data_size": 65536 00:32:24.908 }, 00:32:24.908 { 00:32:24.908 "name": "BaseBdev3", 00:32:24.908 "uuid": "1a6b7a17-de90-491c-89f5-16181e55b72b", 00:32:24.908 "is_configured": true, 00:32:24.908 "data_offset": 0, 00:32:24.908 "data_size": 65536 00:32:24.908 }, 00:32:24.908 { 00:32:24.908 "name": "BaseBdev4", 00:32:24.908 "uuid": "c26f4165-3c72-474d-94fc-65a2c93b3ea7", 00:32:24.908 "is_configured": true, 00:32:24.908 "data_offset": 0, 00:32:24.908 "data_size": 65536 00:32:24.908 } 00:32:24.908 ] 00:32:24.908 }' 00:32:24.908 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:24.908 12:34:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:32:25.493 12:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:25.751 [2024-06-07 12:34:49.191356] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:25.751 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:25.751 "name": "Existed_Raid", 00:32:25.751 "aliases": [ 00:32:25.751 "fc99cff1-b7ae-4c83-bf50-a435449e3a53" 00:32:25.751 ], 00:32:25.751 "product_name": "Raid Volume", 00:32:25.751 "block_size": 512, 00:32:25.751 "num_blocks": 65536, 00:32:25.751 "uuid": "fc99cff1-b7ae-4c83-bf50-a435449e3a53", 00:32:25.751 "assigned_rate_limits": { 00:32:25.751 "rw_ios_per_sec": 0, 00:32:25.751 "rw_mbytes_per_sec": 0, 00:32:25.751 "r_mbytes_per_sec": 0, 00:32:25.751 "w_mbytes_per_sec": 0 00:32:25.751 }, 00:32:25.751 "claimed": false, 00:32:25.751 "zoned": false, 00:32:25.751 "supported_io_types": { 00:32:25.751 "read": true, 00:32:25.751 "write": true, 00:32:25.751 "unmap": false, 00:32:25.751 "write_zeroes": 
true, 00:32:25.751 "flush": false, 00:32:25.751 "reset": true, 00:32:25.751 "compare": false, 00:32:25.751 "compare_and_write": false, 00:32:25.751 "abort": false, 00:32:25.751 "nvme_admin": false, 00:32:25.751 "nvme_io": false 00:32:25.751 }, 00:32:25.751 "memory_domains": [ 00:32:25.751 { 00:32:25.751 "dma_device_id": "system", 00:32:25.751 "dma_device_type": 1 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:25.751 "dma_device_type": 2 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "dma_device_id": "system", 00:32:25.751 "dma_device_type": 1 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:25.751 "dma_device_type": 2 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "dma_device_id": "system", 00:32:25.751 "dma_device_type": 1 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:25.751 "dma_device_type": 2 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "dma_device_id": "system", 00:32:25.751 "dma_device_type": 1 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:25.751 "dma_device_type": 2 00:32:25.751 } 00:32:25.751 ], 00:32:25.751 "driver_specific": { 00:32:25.751 "raid": { 00:32:25.751 "uuid": "fc99cff1-b7ae-4c83-bf50-a435449e3a53", 00:32:25.751 "strip_size_kb": 0, 00:32:25.751 "state": "online", 00:32:25.751 "raid_level": "raid1", 00:32:25.751 "superblock": false, 00:32:25.751 "num_base_bdevs": 4, 00:32:25.751 "num_base_bdevs_discovered": 4, 00:32:25.751 "num_base_bdevs_operational": 4, 00:32:25.751 "base_bdevs_list": [ 00:32:25.751 { 00:32:25.751 "name": "BaseBdev1", 00:32:25.751 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:25.751 "is_configured": true, 00:32:25.751 "data_offset": 0, 00:32:25.751 "data_size": 65536 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "name": "BaseBdev2", 00:32:25.751 "uuid": "f6a995de-d582-4f28-8d6b-6840fcf30b85", 00:32:25.751 "is_configured": true, 00:32:25.751 "data_offset": 0, 00:32:25.751 "data_size": 65536 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "name": "BaseBdev3", 00:32:25.751 "uuid": "1a6b7a17-de90-491c-89f5-16181e55b72b", 00:32:25.751 "is_configured": true, 00:32:25.751 "data_offset": 0, 00:32:25.751 "data_size": 65536 00:32:25.751 }, 00:32:25.751 { 00:32:25.751 "name": "BaseBdev4", 00:32:25.751 "uuid": "c26f4165-3c72-474d-94fc-65a2c93b3ea7", 00:32:25.751 "is_configured": true, 00:32:25.751 "data_offset": 0, 00:32:25.751 "data_size": 65536 00:32:25.751 } 00:32:25.751 ] 00:32:25.751 } 00:32:25.751 } 00:32:25.751 }' 00:32:25.751 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:25.751 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:32:25.751 BaseBdev2 00:32:25.751 BaseBdev3 00:32:25.751 BaseBdev4' 00:32:25.751 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:25.751 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:32:25.751 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:26.009 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:26.009 "name": "BaseBdev1", 00:32:26.009 "aliases": [ 00:32:26.009 "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c" 00:32:26.009 ], 
00:32:26.009 "product_name": "Malloc disk", 00:32:26.009 "block_size": 512, 00:32:26.009 "num_blocks": 65536, 00:32:26.009 "uuid": "ca277bc9-c1dd-4cc6-80e8-bdde2f35b65c", 00:32:26.009 "assigned_rate_limits": { 00:32:26.009 "rw_ios_per_sec": 0, 00:32:26.009 "rw_mbytes_per_sec": 0, 00:32:26.009 "r_mbytes_per_sec": 0, 00:32:26.009 "w_mbytes_per_sec": 0 00:32:26.009 }, 00:32:26.009 "claimed": true, 00:32:26.009 "claim_type": "exclusive_write", 00:32:26.009 "zoned": false, 00:32:26.009 "supported_io_types": { 00:32:26.009 "read": true, 00:32:26.009 "write": true, 00:32:26.009 "unmap": true, 00:32:26.009 "write_zeroes": true, 00:32:26.009 "flush": true, 00:32:26.009 "reset": true, 00:32:26.009 "compare": false, 00:32:26.009 "compare_and_write": false, 00:32:26.009 "abort": true, 00:32:26.009 "nvme_admin": false, 00:32:26.009 "nvme_io": false 00:32:26.009 }, 00:32:26.009 "memory_domains": [ 00:32:26.009 { 00:32:26.010 "dma_device_id": "system", 00:32:26.010 "dma_device_type": 1 00:32:26.010 }, 00:32:26.010 { 00:32:26.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:26.010 "dma_device_type": 2 00:32:26.010 } 00:32:26.010 ], 00:32:26.010 "driver_specific": {} 00:32:26.010 }' 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:26.010 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:26.268 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:26.268 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:26.268 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:26.268 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:26.268 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:26.268 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:26.268 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:32:26.526 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:26.526 "name": "BaseBdev2", 00:32:26.526 "aliases": [ 00:32:26.526 "f6a995de-d582-4f28-8d6b-6840fcf30b85" 00:32:26.526 ], 00:32:26.526 "product_name": "Malloc disk", 00:32:26.526 "block_size": 512, 00:32:26.526 "num_blocks": 65536, 00:32:26.526 "uuid": "f6a995de-d582-4f28-8d6b-6840fcf30b85", 00:32:26.526 "assigned_rate_limits": { 00:32:26.526 "rw_ios_per_sec": 0, 00:32:26.526 "rw_mbytes_per_sec": 0, 00:32:26.526 "r_mbytes_per_sec": 0, 00:32:26.526 "w_mbytes_per_sec": 0 00:32:26.526 }, 00:32:26.526 "claimed": true, 00:32:26.526 "claim_type": "exclusive_write", 00:32:26.526 "zoned": false, 00:32:26.526 
"supported_io_types": { 00:32:26.526 "read": true, 00:32:26.526 "write": true, 00:32:26.526 "unmap": true, 00:32:26.526 "write_zeroes": true, 00:32:26.526 "flush": true, 00:32:26.526 "reset": true, 00:32:26.526 "compare": false, 00:32:26.526 "compare_and_write": false, 00:32:26.526 "abort": true, 00:32:26.526 "nvme_admin": false, 00:32:26.526 "nvme_io": false 00:32:26.526 }, 00:32:26.526 "memory_domains": [ 00:32:26.526 { 00:32:26.526 "dma_device_id": "system", 00:32:26.526 "dma_device_type": 1 00:32:26.526 }, 00:32:26.527 { 00:32:26.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:26.527 "dma_device_type": 2 00:32:26.527 } 00:32:26.527 ], 00:32:26.527 "driver_specific": {} 00:32:26.527 }' 00:32:26.527 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:26.527 12:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:26.527 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:26.527 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:26.527 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:26.527 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:26.527 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:26.527 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:26.785 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:26.785 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:26.785 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:26.785 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:26.785 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:26.785 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:32:26.785 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:27.044 "name": "BaseBdev3", 00:32:27.044 "aliases": [ 00:32:27.044 "1a6b7a17-de90-491c-89f5-16181e55b72b" 00:32:27.044 ], 00:32:27.044 "product_name": "Malloc disk", 00:32:27.044 "block_size": 512, 00:32:27.044 "num_blocks": 65536, 00:32:27.044 "uuid": "1a6b7a17-de90-491c-89f5-16181e55b72b", 00:32:27.044 "assigned_rate_limits": { 00:32:27.044 "rw_ios_per_sec": 0, 00:32:27.044 "rw_mbytes_per_sec": 0, 00:32:27.044 "r_mbytes_per_sec": 0, 00:32:27.044 "w_mbytes_per_sec": 0 00:32:27.044 }, 00:32:27.044 "claimed": true, 00:32:27.044 "claim_type": "exclusive_write", 00:32:27.044 "zoned": false, 00:32:27.044 "supported_io_types": { 00:32:27.044 "read": true, 00:32:27.044 "write": true, 00:32:27.044 "unmap": true, 00:32:27.044 "write_zeroes": true, 00:32:27.044 "flush": true, 00:32:27.044 "reset": true, 00:32:27.044 "compare": false, 00:32:27.044 "compare_and_write": false, 00:32:27.044 "abort": true, 00:32:27.044 "nvme_admin": false, 00:32:27.044 "nvme_io": false 00:32:27.044 }, 00:32:27.044 "memory_domains": [ 00:32:27.044 { 00:32:27.044 "dma_device_id": "system", 00:32:27.044 "dma_device_type": 1 
00:32:27.044 }, 00:32:27.044 { 00:32:27.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:27.044 "dma_device_type": 2 00:32:27.044 } 00:32:27.044 ], 00:32:27.044 "driver_specific": {} 00:32:27.044 }' 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:27.044 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:27.302 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:27.303 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:27.303 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:27.303 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:32:27.303 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:27.562 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:27.562 "name": "BaseBdev4", 00:32:27.562 "aliases": [ 00:32:27.562 "c26f4165-3c72-474d-94fc-65a2c93b3ea7" 00:32:27.562 ], 00:32:27.562 "product_name": "Malloc disk", 00:32:27.562 "block_size": 512, 00:32:27.562 "num_blocks": 65536, 00:32:27.562 "uuid": "c26f4165-3c72-474d-94fc-65a2c93b3ea7", 00:32:27.562 "assigned_rate_limits": { 00:32:27.562 "rw_ios_per_sec": 0, 00:32:27.562 "rw_mbytes_per_sec": 0, 00:32:27.562 "r_mbytes_per_sec": 0, 00:32:27.562 "w_mbytes_per_sec": 0 00:32:27.562 }, 00:32:27.562 "claimed": true, 00:32:27.562 "claim_type": "exclusive_write", 00:32:27.562 "zoned": false, 00:32:27.562 "supported_io_types": { 00:32:27.562 "read": true, 00:32:27.562 "write": true, 00:32:27.562 "unmap": true, 00:32:27.562 "write_zeroes": true, 00:32:27.562 "flush": true, 00:32:27.562 "reset": true, 00:32:27.562 "compare": false, 00:32:27.562 "compare_and_write": false, 00:32:27.562 "abort": true, 00:32:27.562 "nvme_admin": false, 00:32:27.562 "nvme_io": false 00:32:27.562 }, 00:32:27.562 "memory_domains": [ 00:32:27.562 { 00:32:27.562 "dma_device_id": "system", 00:32:27.562 "dma_device_type": 1 00:32:27.562 }, 00:32:27.562 { 00:32:27.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:27.562 "dma_device_type": 2 00:32:27.562 } 00:32:27.562 ], 00:32:27.562 "driver_specific": {} 00:32:27.562 }' 00:32:27.562 12:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:27.562 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:27.821 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:27.821 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:27.821 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:32:28.136 [2024-06-07 12:34:51.535519] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:28.136 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:28.137 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:28.137 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:28.396 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:28.396 "name": "Existed_Raid", 00:32:28.396 "uuid": "fc99cff1-b7ae-4c83-bf50-a435449e3a53", 00:32:28.396 "strip_size_kb": 0, 00:32:28.396 "state": "online", 00:32:28.396 "raid_level": "raid1", 00:32:28.396 "superblock": false, 
00:32:28.396 "num_base_bdevs": 4, 00:32:28.396 "num_base_bdevs_discovered": 3, 00:32:28.396 "num_base_bdevs_operational": 3, 00:32:28.396 "base_bdevs_list": [ 00:32:28.396 { 00:32:28.396 "name": null, 00:32:28.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:28.396 "is_configured": false, 00:32:28.396 "data_offset": 0, 00:32:28.396 "data_size": 65536 00:32:28.396 }, 00:32:28.396 { 00:32:28.396 "name": "BaseBdev2", 00:32:28.396 "uuid": "f6a995de-d582-4f28-8d6b-6840fcf30b85", 00:32:28.396 "is_configured": true, 00:32:28.396 "data_offset": 0, 00:32:28.396 "data_size": 65536 00:32:28.396 }, 00:32:28.396 { 00:32:28.396 "name": "BaseBdev3", 00:32:28.396 "uuid": "1a6b7a17-de90-491c-89f5-16181e55b72b", 00:32:28.396 "is_configured": true, 00:32:28.396 "data_offset": 0, 00:32:28.396 "data_size": 65536 00:32:28.396 }, 00:32:28.396 { 00:32:28.396 "name": "BaseBdev4", 00:32:28.396 "uuid": "c26f4165-3c72-474d-94fc-65a2c93b3ea7", 00:32:28.396 "is_configured": true, 00:32:28.396 "data_offset": 0, 00:32:28.396 "data_size": 65536 00:32:28.396 } 00:32:28.396 ] 00:32:28.396 }' 00:32:28.396 12:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:28.396 12:34:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:28.961 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:32:28.961 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:28.961 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:28.961 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:32:28.961 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:32:28.961 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:32:28.961 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:32:29.218 [2024-06-07 12:34:52.784992] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:32:29.218 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:32:29.218 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:29.219 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:29.219 12:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:32:29.476 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:32:29.476 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:32:29.477 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:32:29.734 [2024-06-07 12:34:53.262880] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:32:29.734 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:32:29.734 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 
00:32:29.734 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:29.734 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:32:29.993 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:32:29.993 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:32:29.993 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:32:30.252 [2024-06-07 12:34:53.829410] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:32:30.252 [2024-06-07 12:34:53.829694] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:30.252 [2024-06-07 12:34:53.850994] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:30.252 [2024-06-07 12:34:53.851276] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:30.252 [2024-06-07 12:34:53.851370] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:32:30.252 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:32:30.252 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:30.252 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:30.252 12:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:32:30.510 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:32:30.510 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:32:30.510 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:32:30.510 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:32:30.510 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:32:30.510 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:32:30.770 BaseBdev2 00:32:30.770 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:32:30.770 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:32:30.770 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:30.770 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:30.770 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:30.770 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:30.770 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:31.030 12:34:54 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:32:31.289 [ 00:32:31.289 { 00:32:31.289 "name": "BaseBdev2", 00:32:31.289 "aliases": [ 00:32:31.289 "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6" 00:32:31.289 ], 00:32:31.289 "product_name": "Malloc disk", 00:32:31.289 "block_size": 512, 00:32:31.289 "num_blocks": 65536, 00:32:31.289 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:31.289 "assigned_rate_limits": { 00:32:31.289 "rw_ios_per_sec": 0, 00:32:31.289 "rw_mbytes_per_sec": 0, 00:32:31.289 "r_mbytes_per_sec": 0, 00:32:31.289 "w_mbytes_per_sec": 0 00:32:31.289 }, 00:32:31.289 "claimed": false, 00:32:31.289 "zoned": false, 00:32:31.289 "supported_io_types": { 00:32:31.289 "read": true, 00:32:31.289 "write": true, 00:32:31.289 "unmap": true, 00:32:31.289 "write_zeroes": true, 00:32:31.289 "flush": true, 00:32:31.289 "reset": true, 00:32:31.289 "compare": false, 00:32:31.289 "compare_and_write": false, 00:32:31.289 "abort": true, 00:32:31.289 "nvme_admin": false, 00:32:31.289 "nvme_io": false 00:32:31.289 }, 00:32:31.289 "memory_domains": [ 00:32:31.289 { 00:32:31.289 "dma_device_id": "system", 00:32:31.289 "dma_device_type": 1 00:32:31.289 }, 00:32:31.289 { 00:32:31.289 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:31.289 "dma_device_type": 2 00:32:31.289 } 00:32:31.289 ], 00:32:31.289 "driver_specific": {} 00:32:31.289 } 00:32:31.289 ] 00:32:31.289 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:31.289 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:32:31.289 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:32:31.289 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:32:31.548 BaseBdev3 00:32:31.548 12:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:32:31.548 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:32:31.548 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:31.548 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:31.548 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:31.548 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:31.548 12:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:31.807 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:32:31.807 [ 00:32:31.807 { 00:32:31.807 "name": "BaseBdev3", 00:32:31.807 "aliases": [ 00:32:31.807 "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3" 00:32:31.807 ], 00:32:31.807 "product_name": "Malloc disk", 00:32:31.807 "block_size": 512, 00:32:31.807 "num_blocks": 65536, 00:32:31.807 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:31.807 "assigned_rate_limits": { 00:32:31.807 "rw_ios_per_sec": 0, 00:32:31.807 "rw_mbytes_per_sec": 0, 00:32:31.807 "r_mbytes_per_sec": 0, 00:32:31.807 
"w_mbytes_per_sec": 0 00:32:31.807 }, 00:32:31.807 "claimed": false, 00:32:31.807 "zoned": false, 00:32:31.807 "supported_io_types": { 00:32:31.807 "read": true, 00:32:31.807 "write": true, 00:32:31.807 "unmap": true, 00:32:31.807 "write_zeroes": true, 00:32:31.807 "flush": true, 00:32:31.807 "reset": true, 00:32:31.807 "compare": false, 00:32:31.807 "compare_and_write": false, 00:32:31.807 "abort": true, 00:32:31.807 "nvme_admin": false, 00:32:31.807 "nvme_io": false 00:32:31.807 }, 00:32:31.807 "memory_domains": [ 00:32:31.807 { 00:32:31.807 "dma_device_id": "system", 00:32:31.807 "dma_device_type": 1 00:32:31.807 }, 00:32:31.807 { 00:32:31.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:31.807 "dma_device_type": 2 00:32:31.807 } 00:32:31.807 ], 00:32:31.807 "driver_specific": {} 00:32:31.807 } 00:32:31.807 ] 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:32:32.066 BaseBdev4 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:32.066 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:32.325 12:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:32:32.583 [ 00:32:32.583 { 00:32:32.583 "name": "BaseBdev4", 00:32:32.583 "aliases": [ 00:32:32.583 "d3a374c2-1c93-408d-84a2-7be86d27372f" 00:32:32.583 ], 00:32:32.583 "product_name": "Malloc disk", 00:32:32.583 "block_size": 512, 00:32:32.583 "num_blocks": 65536, 00:32:32.583 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:32.583 "assigned_rate_limits": { 00:32:32.583 "rw_ios_per_sec": 0, 00:32:32.583 "rw_mbytes_per_sec": 0, 00:32:32.583 "r_mbytes_per_sec": 0, 00:32:32.583 "w_mbytes_per_sec": 0 00:32:32.583 }, 00:32:32.583 "claimed": false, 00:32:32.583 "zoned": false, 00:32:32.583 "supported_io_types": { 00:32:32.583 "read": true, 00:32:32.583 "write": true, 00:32:32.583 "unmap": true, 00:32:32.583 "write_zeroes": true, 00:32:32.583 "flush": true, 00:32:32.583 "reset": true, 00:32:32.583 "compare": false, 00:32:32.583 "compare_and_write": false, 00:32:32.583 "abort": true, 00:32:32.583 "nvme_admin": false, 00:32:32.583 "nvme_io": false 00:32:32.583 }, 00:32:32.583 "memory_domains": [ 00:32:32.583 { 00:32:32.583 "dma_device_id": "system", 00:32:32.583 "dma_device_type": 1 00:32:32.583 }, 00:32:32.583 { 
00:32:32.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:32.583 "dma_device_type": 2 00:32:32.583 } 00:32:32.583 ], 00:32:32.583 "driver_specific": {} 00:32:32.583 } 00:32:32.583 ] 00:32:32.583 12:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:32.583 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:32:32.583 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:32:32.583 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:32.844 [2024-06-07 12:34:56.350129] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:32.844 [2024-06-07 12:34:56.350447] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:32.844 [2024-06-07 12:34:56.350605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:32.844 [2024-06-07 12:34:56.352690] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:32.844 [2024-06-07 12:34:56.352853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:32.844 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:33.103 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:33.103 "name": "Existed_Raid", 00:32:33.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:33.103 "strip_size_kb": 0, 00:32:33.103 "state": "configuring", 00:32:33.103 "raid_level": "raid1", 00:32:33.103 "superblock": false, 00:32:33.103 "num_base_bdevs": 4, 00:32:33.103 "num_base_bdevs_discovered": 3, 00:32:33.103 "num_base_bdevs_operational": 4, 00:32:33.103 "base_bdevs_list": [ 00:32:33.103 { 00:32:33.103 "name": "BaseBdev1", 00:32:33.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:33.103 "is_configured": false, 00:32:33.103 "data_offset": 0, 00:32:33.103 
"data_size": 0 00:32:33.103 }, 00:32:33.103 { 00:32:33.103 "name": "BaseBdev2", 00:32:33.103 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:33.103 "is_configured": true, 00:32:33.103 "data_offset": 0, 00:32:33.103 "data_size": 65536 00:32:33.103 }, 00:32:33.103 { 00:32:33.103 "name": "BaseBdev3", 00:32:33.103 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:33.103 "is_configured": true, 00:32:33.103 "data_offset": 0, 00:32:33.103 "data_size": 65536 00:32:33.103 }, 00:32:33.103 { 00:32:33.103 "name": "BaseBdev4", 00:32:33.103 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:33.103 "is_configured": true, 00:32:33.103 "data_offset": 0, 00:32:33.103 "data_size": 65536 00:32:33.103 } 00:32:33.103 ] 00:32:33.103 }' 00:32:33.103 12:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:33.103 12:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:34.037 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:32:34.038 [2024-06-07 12:34:57.602326] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:34.038 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:34.306 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:34.306 "name": "Existed_Raid", 00:32:34.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:34.306 "strip_size_kb": 0, 00:32:34.306 "state": "configuring", 00:32:34.306 "raid_level": "raid1", 00:32:34.306 "superblock": false, 00:32:34.306 "num_base_bdevs": 4, 00:32:34.306 "num_base_bdevs_discovered": 2, 00:32:34.306 "num_base_bdevs_operational": 4, 00:32:34.306 "base_bdevs_list": [ 00:32:34.306 { 00:32:34.306 "name": "BaseBdev1", 00:32:34.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:34.306 "is_configured": false, 00:32:34.306 "data_offset": 0, 00:32:34.306 "data_size": 0 00:32:34.306 }, 00:32:34.306 { 00:32:34.306 "name": null, 00:32:34.306 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:34.306 
"is_configured": false, 00:32:34.306 "data_offset": 0, 00:32:34.306 "data_size": 65536 00:32:34.306 }, 00:32:34.306 { 00:32:34.306 "name": "BaseBdev3", 00:32:34.306 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:34.306 "is_configured": true, 00:32:34.306 "data_offset": 0, 00:32:34.306 "data_size": 65536 00:32:34.306 }, 00:32:34.306 { 00:32:34.306 "name": "BaseBdev4", 00:32:34.306 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:34.306 "is_configured": true, 00:32:34.306 "data_offset": 0, 00:32:34.306 "data_size": 65536 00:32:34.306 } 00:32:34.306 ] 00:32:34.306 }' 00:32:34.306 12:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:34.306 12:34:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:35.292 12:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:32:35.292 12:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:35.292 12:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:32:35.292 12:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:32:35.550 [2024-06-07 12:34:59.050423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:35.550 BaseBdev1 00:32:35.550 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:32:35.550 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:32:35.550 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:35.550 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:32:35.550 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:35.550 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:35.550 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:35.808 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:32:36.067 [ 00:32:36.067 { 00:32:36.067 "name": "BaseBdev1", 00:32:36.067 "aliases": [ 00:32:36.067 "69d5442e-5b56-4287-bd4d-e290ea22f713" 00:32:36.067 ], 00:32:36.067 "product_name": "Malloc disk", 00:32:36.067 "block_size": 512, 00:32:36.067 "num_blocks": 65536, 00:32:36.067 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:36.067 "assigned_rate_limits": { 00:32:36.067 "rw_ios_per_sec": 0, 00:32:36.067 "rw_mbytes_per_sec": 0, 00:32:36.067 "r_mbytes_per_sec": 0, 00:32:36.067 "w_mbytes_per_sec": 0 00:32:36.067 }, 00:32:36.067 "claimed": true, 00:32:36.067 "claim_type": "exclusive_write", 00:32:36.067 "zoned": false, 00:32:36.067 "supported_io_types": { 00:32:36.067 "read": true, 00:32:36.067 "write": true, 00:32:36.067 "unmap": true, 00:32:36.067 "write_zeroes": true, 00:32:36.067 "flush": true, 00:32:36.068 "reset": true, 00:32:36.068 "compare": false, 00:32:36.068 "compare_and_write": false, 00:32:36.068 "abort": 
true, 00:32:36.068 "nvme_admin": false, 00:32:36.068 "nvme_io": false 00:32:36.068 }, 00:32:36.068 "memory_domains": [ 00:32:36.068 { 00:32:36.068 "dma_device_id": "system", 00:32:36.068 "dma_device_type": 1 00:32:36.068 }, 00:32:36.068 { 00:32:36.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:36.068 "dma_device_type": 2 00:32:36.068 } 00:32:36.068 ], 00:32:36.068 "driver_specific": {} 00:32:36.068 } 00:32:36.068 ] 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:36.068 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:36.326 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:36.326 "name": "Existed_Raid", 00:32:36.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:36.326 "strip_size_kb": 0, 00:32:36.326 "state": "configuring", 00:32:36.326 "raid_level": "raid1", 00:32:36.326 "superblock": false, 00:32:36.326 "num_base_bdevs": 4, 00:32:36.326 "num_base_bdevs_discovered": 3, 00:32:36.326 "num_base_bdevs_operational": 4, 00:32:36.326 "base_bdevs_list": [ 00:32:36.326 { 00:32:36.326 "name": "BaseBdev1", 00:32:36.326 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:36.326 "is_configured": true, 00:32:36.326 "data_offset": 0, 00:32:36.326 "data_size": 65536 00:32:36.326 }, 00:32:36.326 { 00:32:36.326 "name": null, 00:32:36.326 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:36.326 "is_configured": false, 00:32:36.326 "data_offset": 0, 00:32:36.326 "data_size": 65536 00:32:36.326 }, 00:32:36.326 { 00:32:36.326 "name": "BaseBdev3", 00:32:36.326 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:36.326 "is_configured": true, 00:32:36.326 "data_offset": 0, 00:32:36.326 "data_size": 65536 00:32:36.326 }, 00:32:36.326 { 00:32:36.326 "name": "BaseBdev4", 00:32:36.326 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:36.326 "is_configured": true, 00:32:36.326 "data_offset": 0, 00:32:36.326 "data_size": 65536 00:32:36.326 } 00:32:36.326 ] 00:32:36.326 }' 00:32:36.326 12:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:36.326 12:34:59 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:36.892 12:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:36.892 12:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:32:37.152 12:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:32:37.152 12:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:32:37.458 [2024-06-07 12:35:01.033720] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:37.459 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:37.717 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:37.717 "name": "Existed_Raid", 00:32:37.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:37.717 "strip_size_kb": 0, 00:32:37.717 "state": "configuring", 00:32:37.717 "raid_level": "raid1", 00:32:37.717 "superblock": false, 00:32:37.717 "num_base_bdevs": 4, 00:32:37.717 "num_base_bdevs_discovered": 2, 00:32:37.717 "num_base_bdevs_operational": 4, 00:32:37.717 "base_bdevs_list": [ 00:32:37.717 { 00:32:37.717 "name": "BaseBdev1", 00:32:37.717 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:37.717 "is_configured": true, 00:32:37.717 "data_offset": 0, 00:32:37.717 "data_size": 65536 00:32:37.717 }, 00:32:37.717 { 00:32:37.717 "name": null, 00:32:37.717 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:37.717 "is_configured": false, 00:32:37.717 "data_offset": 0, 00:32:37.717 "data_size": 65536 00:32:37.717 }, 00:32:37.717 { 00:32:37.717 "name": null, 00:32:37.717 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:37.717 "is_configured": false, 00:32:37.717 "data_offset": 0, 00:32:37.717 "data_size": 65536 00:32:37.717 }, 00:32:37.717 { 00:32:37.717 "name": "BaseBdev4", 00:32:37.717 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 
00:32:37.717 "is_configured": true, 00:32:37.717 "data_offset": 0, 00:32:37.717 "data_size": 65536 00:32:37.717 } 00:32:37.717 ] 00:32:37.717 }' 00:32:37.717 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:37.717 12:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:38.283 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:38.283 12:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:32:38.540 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:32:38.540 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:32:38.797 [2024-06-07 12:35:02.376879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:38.797 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:39.419 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:39.419 "name": "Existed_Raid", 00:32:39.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:39.419 "strip_size_kb": 0, 00:32:39.419 "state": "configuring", 00:32:39.419 "raid_level": "raid1", 00:32:39.419 "superblock": false, 00:32:39.419 "num_base_bdevs": 4, 00:32:39.419 "num_base_bdevs_discovered": 3, 00:32:39.419 "num_base_bdevs_operational": 4, 00:32:39.419 "base_bdevs_list": [ 00:32:39.419 { 00:32:39.419 "name": "BaseBdev1", 00:32:39.419 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:39.419 "is_configured": true, 00:32:39.419 "data_offset": 0, 00:32:39.419 "data_size": 65536 00:32:39.419 }, 00:32:39.419 { 00:32:39.419 "name": null, 00:32:39.419 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:39.419 "is_configured": false, 00:32:39.419 "data_offset": 0, 00:32:39.419 "data_size": 65536 00:32:39.419 }, 00:32:39.419 { 00:32:39.419 
"name": "BaseBdev3", 00:32:39.419 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:39.419 "is_configured": true, 00:32:39.419 "data_offset": 0, 00:32:39.419 "data_size": 65536 00:32:39.419 }, 00:32:39.419 { 00:32:39.419 "name": "BaseBdev4", 00:32:39.419 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:39.419 "is_configured": true, 00:32:39.419 "data_offset": 0, 00:32:39.419 "data_size": 65536 00:32:39.419 } 00:32:39.419 ] 00:32:39.419 }' 00:32:39.419 12:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:39.419 12:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:39.981 12:35:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:39.981 12:35:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:32:40.546 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:32:40.546 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:32:40.804 [2024-06-07 12:35:04.369784] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:40.804 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:41.368 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:41.368 "name": "Existed_Raid", 00:32:41.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:41.368 "strip_size_kb": 0, 00:32:41.368 "state": "configuring", 00:32:41.368 "raid_level": "raid1", 00:32:41.368 "superblock": false, 00:32:41.368 "num_base_bdevs": 4, 00:32:41.368 "num_base_bdevs_discovered": 2, 00:32:41.368 "num_base_bdevs_operational": 4, 00:32:41.368 "base_bdevs_list": [ 00:32:41.368 { 00:32:41.368 "name": null, 00:32:41.368 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:41.368 "is_configured": false, 00:32:41.368 "data_offset": 0, 00:32:41.368 "data_size": 65536 
00:32:41.368 }, 00:32:41.368 { 00:32:41.368 "name": null, 00:32:41.368 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:41.368 "is_configured": false, 00:32:41.368 "data_offset": 0, 00:32:41.368 "data_size": 65536 00:32:41.368 }, 00:32:41.368 { 00:32:41.368 "name": "BaseBdev3", 00:32:41.368 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:41.368 "is_configured": true, 00:32:41.368 "data_offset": 0, 00:32:41.368 "data_size": 65536 00:32:41.368 }, 00:32:41.368 { 00:32:41.368 "name": "BaseBdev4", 00:32:41.368 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:41.368 "is_configured": true, 00:32:41.368 "data_offset": 0, 00:32:41.368 "data_size": 65536 00:32:41.368 } 00:32:41.368 ] 00:32:41.368 }' 00:32:41.368 12:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:41.368 12:35:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:41.950 12:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:41.950 12:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:32:42.207 12:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:32:42.207 12:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:32:42.772 [2024-06-07 12:35:06.124151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:42.772 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:43.029 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:43.029 "name": "Existed_Raid", 00:32:43.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:43.029 "strip_size_kb": 0, 00:32:43.029 "state": "configuring", 00:32:43.029 "raid_level": "raid1", 00:32:43.029 "superblock": false, 00:32:43.029 "num_base_bdevs": 4, 00:32:43.029 
"num_base_bdevs_discovered": 3, 00:32:43.029 "num_base_bdevs_operational": 4, 00:32:43.029 "base_bdevs_list": [ 00:32:43.029 { 00:32:43.029 "name": null, 00:32:43.029 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:43.029 "is_configured": false, 00:32:43.029 "data_offset": 0, 00:32:43.029 "data_size": 65536 00:32:43.029 }, 00:32:43.029 { 00:32:43.029 "name": "BaseBdev2", 00:32:43.029 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:43.029 "is_configured": true, 00:32:43.029 "data_offset": 0, 00:32:43.029 "data_size": 65536 00:32:43.029 }, 00:32:43.029 { 00:32:43.029 "name": "BaseBdev3", 00:32:43.029 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:43.029 "is_configured": true, 00:32:43.029 "data_offset": 0, 00:32:43.029 "data_size": 65536 00:32:43.029 }, 00:32:43.029 { 00:32:43.029 "name": "BaseBdev4", 00:32:43.029 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:43.029 "is_configured": true, 00:32:43.029 "data_offset": 0, 00:32:43.029 "data_size": 65536 00:32:43.029 } 00:32:43.029 ] 00:32:43.029 }' 00:32:43.029 12:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:43.029 12:35:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:43.596 12:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:32:43.596 12:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:43.855 12:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:32:43.855 12:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:32:43.855 12:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:44.113 12:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 69d5442e-5b56-4287-bd4d-e290ea22f713 00:32:44.411 [2024-06-07 12:35:07.871366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:32:44.411 [2024-06-07 12:35:07.871718] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:32:44.411 [2024-06-07 12:35:07.871785] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:32:44.411 [2024-06-07 12:35:07.871982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:32:44.411 [2024-06-07 12:35:07.872323] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:32:44.411 [2024-06-07 12:35:07.872466] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000008180 00:32:44.411 [2024-06-07 12:35:07.872823] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:44.411 NewBaseBdev 00:32:44.411 12:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:32:44.411 12:35:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:32:44.411 12:35:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:44.411 12:35:07 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # local i 00:32:44.411 12:35:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:44.411 12:35:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:44.411 12:35:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:44.685 12:35:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:32:44.685 [ 00:32:44.685 { 00:32:44.685 "name": "NewBaseBdev", 00:32:44.685 "aliases": [ 00:32:44.685 "69d5442e-5b56-4287-bd4d-e290ea22f713" 00:32:44.685 ], 00:32:44.685 "product_name": "Malloc disk", 00:32:44.685 "block_size": 512, 00:32:44.685 "num_blocks": 65536, 00:32:44.685 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:44.685 "assigned_rate_limits": { 00:32:44.685 "rw_ios_per_sec": 0, 00:32:44.685 "rw_mbytes_per_sec": 0, 00:32:44.685 "r_mbytes_per_sec": 0, 00:32:44.685 "w_mbytes_per_sec": 0 00:32:44.685 }, 00:32:44.685 "claimed": true, 00:32:44.685 "claim_type": "exclusive_write", 00:32:44.685 "zoned": false, 00:32:44.685 "supported_io_types": { 00:32:44.685 "read": true, 00:32:44.685 "write": true, 00:32:44.685 "unmap": true, 00:32:44.685 "write_zeroes": true, 00:32:44.685 "flush": true, 00:32:44.685 "reset": true, 00:32:44.685 "compare": false, 00:32:44.685 "compare_and_write": false, 00:32:44.685 "abort": true, 00:32:44.685 "nvme_admin": false, 00:32:44.685 "nvme_io": false 00:32:44.685 }, 00:32:44.685 "memory_domains": [ 00:32:44.685 { 00:32:44.685 "dma_device_id": "system", 00:32:44.685 "dma_device_type": 1 00:32:44.685 }, 00:32:44.685 { 00:32:44.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:44.685 "dma_device_type": 2 00:32:44.685 } 00:32:44.685 ], 00:32:44.685 "driver_specific": {} 00:32:44.685 } 00:32:44.685 ] 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:44.945 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:32:45.204 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:45.205 "name": "Existed_Raid", 00:32:45.205 "uuid": "bbedd536-baf4-4fd3-ac43-bcdbc7132a43", 00:32:45.205 "strip_size_kb": 0, 00:32:45.205 "state": "online", 00:32:45.205 "raid_level": "raid1", 00:32:45.205 "superblock": false, 00:32:45.205 "num_base_bdevs": 4, 00:32:45.205 "num_base_bdevs_discovered": 4, 00:32:45.205 "num_base_bdevs_operational": 4, 00:32:45.205 "base_bdevs_list": [ 00:32:45.205 { 00:32:45.205 "name": "NewBaseBdev", 00:32:45.205 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:45.205 "is_configured": true, 00:32:45.205 "data_offset": 0, 00:32:45.205 "data_size": 65536 00:32:45.205 }, 00:32:45.205 { 00:32:45.205 "name": "BaseBdev2", 00:32:45.205 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:45.205 "is_configured": true, 00:32:45.205 "data_offset": 0, 00:32:45.205 "data_size": 65536 00:32:45.205 }, 00:32:45.205 { 00:32:45.205 "name": "BaseBdev3", 00:32:45.205 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:45.205 "is_configured": true, 00:32:45.205 "data_offset": 0, 00:32:45.205 "data_size": 65536 00:32:45.205 }, 00:32:45.205 { 00:32:45.205 "name": "BaseBdev4", 00:32:45.205 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:45.205 "is_configured": true, 00:32:45.205 "data_offset": 0, 00:32:45.205 "data_size": 65536 00:32:45.205 } 00:32:45.205 ] 00:32:45.205 }' 00:32:45.205 12:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:45.205 12:35:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:45.773 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:32:45.774 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:32:45.774 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:45.774 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:45.774 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:45.774 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:32:45.774 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:32:45.774 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:46.032 [2024-06-07 12:35:09.595854] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:46.032 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:46.032 "name": "Existed_Raid", 00:32:46.032 "aliases": [ 00:32:46.032 "bbedd536-baf4-4fd3-ac43-bcdbc7132a43" 00:32:46.032 ], 00:32:46.032 "product_name": "Raid Volume", 00:32:46.032 "block_size": 512, 00:32:46.032 "num_blocks": 65536, 00:32:46.032 "uuid": "bbedd536-baf4-4fd3-ac43-bcdbc7132a43", 00:32:46.032 "assigned_rate_limits": { 00:32:46.032 "rw_ios_per_sec": 0, 00:32:46.032 "rw_mbytes_per_sec": 0, 00:32:46.032 "r_mbytes_per_sec": 0, 00:32:46.032 "w_mbytes_per_sec": 0 00:32:46.032 }, 00:32:46.032 "claimed": false, 00:32:46.032 "zoned": false, 00:32:46.032 "supported_io_types": { 00:32:46.032 "read": true, 00:32:46.032 "write": true, 00:32:46.032 "unmap": false, 00:32:46.032 
"write_zeroes": true, 00:32:46.032 "flush": false, 00:32:46.032 "reset": true, 00:32:46.032 "compare": false, 00:32:46.032 "compare_and_write": false, 00:32:46.032 "abort": false, 00:32:46.032 "nvme_admin": false, 00:32:46.032 "nvme_io": false 00:32:46.032 }, 00:32:46.032 "memory_domains": [ 00:32:46.032 { 00:32:46.032 "dma_device_id": "system", 00:32:46.032 "dma_device_type": 1 00:32:46.032 }, 00:32:46.032 { 00:32:46.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:46.032 "dma_device_type": 2 00:32:46.032 }, 00:32:46.032 { 00:32:46.032 "dma_device_id": "system", 00:32:46.033 "dma_device_type": 1 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:46.033 "dma_device_type": 2 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "dma_device_id": "system", 00:32:46.033 "dma_device_type": 1 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:46.033 "dma_device_type": 2 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "dma_device_id": "system", 00:32:46.033 "dma_device_type": 1 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:46.033 "dma_device_type": 2 00:32:46.033 } 00:32:46.033 ], 00:32:46.033 "driver_specific": { 00:32:46.033 "raid": { 00:32:46.033 "uuid": "bbedd536-baf4-4fd3-ac43-bcdbc7132a43", 00:32:46.033 "strip_size_kb": 0, 00:32:46.033 "state": "online", 00:32:46.033 "raid_level": "raid1", 00:32:46.033 "superblock": false, 00:32:46.033 "num_base_bdevs": 4, 00:32:46.033 "num_base_bdevs_discovered": 4, 00:32:46.033 "num_base_bdevs_operational": 4, 00:32:46.033 "base_bdevs_list": [ 00:32:46.033 { 00:32:46.033 "name": "NewBaseBdev", 00:32:46.033 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:46.033 "is_configured": true, 00:32:46.033 "data_offset": 0, 00:32:46.033 "data_size": 65536 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "name": "BaseBdev2", 00:32:46.033 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:46.033 "is_configured": true, 00:32:46.033 "data_offset": 0, 00:32:46.033 "data_size": 65536 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "name": "BaseBdev3", 00:32:46.033 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:46.033 "is_configured": true, 00:32:46.033 "data_offset": 0, 00:32:46.033 "data_size": 65536 00:32:46.033 }, 00:32:46.033 { 00:32:46.033 "name": "BaseBdev4", 00:32:46.033 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:46.033 "is_configured": true, 00:32:46.033 "data_offset": 0, 00:32:46.033 "data_size": 65536 00:32:46.033 } 00:32:46.033 ] 00:32:46.033 } 00:32:46.033 } 00:32:46.033 }' 00:32:46.033 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:46.033 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:32:46.033 BaseBdev2 00:32:46.033 BaseBdev3 00:32:46.033 BaseBdev4' 00:32:46.033 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:46.033 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:32:46.033 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:46.291 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:46.291 "name": "NewBaseBdev", 00:32:46.291 "aliases": [ 00:32:46.291 
"69d5442e-5b56-4287-bd4d-e290ea22f713" 00:32:46.291 ], 00:32:46.291 "product_name": "Malloc disk", 00:32:46.291 "block_size": 512, 00:32:46.291 "num_blocks": 65536, 00:32:46.291 "uuid": "69d5442e-5b56-4287-bd4d-e290ea22f713", 00:32:46.291 "assigned_rate_limits": { 00:32:46.291 "rw_ios_per_sec": 0, 00:32:46.291 "rw_mbytes_per_sec": 0, 00:32:46.291 "r_mbytes_per_sec": 0, 00:32:46.291 "w_mbytes_per_sec": 0 00:32:46.291 }, 00:32:46.291 "claimed": true, 00:32:46.291 "claim_type": "exclusive_write", 00:32:46.291 "zoned": false, 00:32:46.291 "supported_io_types": { 00:32:46.291 "read": true, 00:32:46.291 "write": true, 00:32:46.291 "unmap": true, 00:32:46.291 "write_zeroes": true, 00:32:46.291 "flush": true, 00:32:46.291 "reset": true, 00:32:46.291 "compare": false, 00:32:46.291 "compare_and_write": false, 00:32:46.291 "abort": true, 00:32:46.291 "nvme_admin": false, 00:32:46.291 "nvme_io": false 00:32:46.291 }, 00:32:46.291 "memory_domains": [ 00:32:46.291 { 00:32:46.291 "dma_device_id": "system", 00:32:46.291 "dma_device_type": 1 00:32:46.291 }, 00:32:46.291 { 00:32:46.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:46.291 "dma_device_type": 2 00:32:46.291 } 00:32:46.291 ], 00:32:46.291 "driver_specific": {} 00:32:46.291 }' 00:32:46.291 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:46.573 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:46.573 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:46.573 12:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:46.573 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:46.573 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:46.573 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:46.573 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:46.573 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:46.573 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:46.573 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:46.844 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:46.844 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:46.844 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:32:46.844 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:46.844 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:46.844 "name": "BaseBdev2", 00:32:46.844 "aliases": [ 00:32:46.844 "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6" 00:32:46.844 ], 00:32:46.844 "product_name": "Malloc disk", 00:32:46.844 "block_size": 512, 00:32:46.844 "num_blocks": 65536, 00:32:46.844 "uuid": "6af100a5-1050-4e6f-94f4-f1fa8f4cbdf6", 00:32:46.844 "assigned_rate_limits": { 00:32:46.844 "rw_ios_per_sec": 0, 00:32:46.844 "rw_mbytes_per_sec": 0, 00:32:46.844 "r_mbytes_per_sec": 0, 00:32:46.844 "w_mbytes_per_sec": 0 00:32:46.844 }, 00:32:46.844 "claimed": true, 00:32:46.844 "claim_type": "exclusive_write", 
00:32:46.844 "zoned": false, 00:32:46.844 "supported_io_types": { 00:32:46.844 "read": true, 00:32:46.844 "write": true, 00:32:46.844 "unmap": true, 00:32:46.844 "write_zeroes": true, 00:32:46.844 "flush": true, 00:32:46.844 "reset": true, 00:32:46.844 "compare": false, 00:32:46.844 "compare_and_write": false, 00:32:46.844 "abort": true, 00:32:46.844 "nvme_admin": false, 00:32:46.844 "nvme_io": false 00:32:46.844 }, 00:32:46.844 "memory_domains": [ 00:32:46.844 { 00:32:46.844 "dma_device_id": "system", 00:32:46.844 "dma_device_type": 1 00:32:46.844 }, 00:32:46.844 { 00:32:46.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:46.844 "dma_device_type": 2 00:32:46.844 } 00:32:46.844 ], 00:32:46.844 "driver_specific": {} 00:32:46.844 }' 00:32:46.844 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:47.102 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:47.361 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:47.361 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:47.361 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:47.361 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:47.361 12:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:32:47.617 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:47.617 "name": "BaseBdev3", 00:32:47.618 "aliases": [ 00:32:47.618 "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3" 00:32:47.618 ], 00:32:47.618 "product_name": "Malloc disk", 00:32:47.618 "block_size": 512, 00:32:47.618 "num_blocks": 65536, 00:32:47.618 "uuid": "9936d6b0-1f91-45e5-a8e0-bd1b52f54be3", 00:32:47.618 "assigned_rate_limits": { 00:32:47.618 "rw_ios_per_sec": 0, 00:32:47.618 "rw_mbytes_per_sec": 0, 00:32:47.618 "r_mbytes_per_sec": 0, 00:32:47.618 "w_mbytes_per_sec": 0 00:32:47.618 }, 00:32:47.618 "claimed": true, 00:32:47.618 "claim_type": "exclusive_write", 00:32:47.618 "zoned": false, 00:32:47.618 "supported_io_types": { 00:32:47.618 "read": true, 00:32:47.618 "write": true, 00:32:47.618 "unmap": true, 00:32:47.618 "write_zeroes": true, 00:32:47.618 "flush": true, 00:32:47.618 "reset": true, 00:32:47.618 "compare": false, 00:32:47.618 "compare_and_write": false, 00:32:47.618 "abort": true, 00:32:47.618 "nvme_admin": false, 00:32:47.618 "nvme_io": false 00:32:47.618 }, 00:32:47.618 "memory_domains": [ 00:32:47.618 { 00:32:47.618 "dma_device_id": 
"system", 00:32:47.618 "dma_device_type": 1 00:32:47.618 }, 00:32:47.618 { 00:32:47.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:47.618 "dma_device_type": 2 00:32:47.618 } 00:32:47.618 ], 00:32:47.618 "driver_specific": {} 00:32:47.618 }' 00:32:47.618 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:47.618 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:47.618 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:47.618 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:47.874 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:47.874 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:47.874 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:47.874 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:47.874 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:47.874 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:47.874 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:48.131 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:48.131 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:48.131 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:48.131 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:32:48.389 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:48.389 "name": "BaseBdev4", 00:32:48.389 "aliases": [ 00:32:48.389 "d3a374c2-1c93-408d-84a2-7be86d27372f" 00:32:48.389 ], 00:32:48.389 "product_name": "Malloc disk", 00:32:48.389 "block_size": 512, 00:32:48.389 "num_blocks": 65536, 00:32:48.389 "uuid": "d3a374c2-1c93-408d-84a2-7be86d27372f", 00:32:48.389 "assigned_rate_limits": { 00:32:48.389 "rw_ios_per_sec": 0, 00:32:48.389 "rw_mbytes_per_sec": 0, 00:32:48.389 "r_mbytes_per_sec": 0, 00:32:48.389 "w_mbytes_per_sec": 0 00:32:48.389 }, 00:32:48.389 "claimed": true, 00:32:48.389 "claim_type": "exclusive_write", 00:32:48.389 "zoned": false, 00:32:48.389 "supported_io_types": { 00:32:48.389 "read": true, 00:32:48.389 "write": true, 00:32:48.389 "unmap": true, 00:32:48.389 "write_zeroes": true, 00:32:48.389 "flush": true, 00:32:48.389 "reset": true, 00:32:48.389 "compare": false, 00:32:48.389 "compare_and_write": false, 00:32:48.389 "abort": true, 00:32:48.389 "nvme_admin": false, 00:32:48.389 "nvme_io": false 00:32:48.389 }, 00:32:48.389 "memory_domains": [ 00:32:48.389 { 00:32:48.389 "dma_device_id": "system", 00:32:48.389 "dma_device_type": 1 00:32:48.389 }, 00:32:48.389 { 00:32:48.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:48.389 "dma_device_type": 2 00:32:48.389 } 00:32:48.389 ], 00:32:48.389 "driver_specific": {} 00:32:48.389 }' 00:32:48.389 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:48.389 12:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:48.389 12:35:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:32:48.389 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:32:48.647 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:48.906 [2024-06-07 12:35:12.440088] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:48.906 [2024-06-07 12:35:12.440405] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:48.906 [2024-06-07 12:35:12.440585] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:48.906 [2024-06-07 12:35:12.440868] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:48.906 [2024-06-07 12:35:12.440968] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name Existed_Raid, state offline 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 217809 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 217809 ']' 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 217809 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 217809 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:48.906 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:48.907 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 217809' 00:32:48.907 killing process with pid 217809 00:32:48.907 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 217809 00:32:48.907 [2024-06-07 12:35:12.486550] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:48.907 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 217809 00:32:49.165 [2024-06-07 12:35:12.566813] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:32:49.424 00:32:49.424 real 0m34.214s 00:32:49.424 user 1m2.808s 
00:32:49.424 sys 0m5.537s 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:32:49.424 ************************************ 00:32:49.424 END TEST raid_state_function_test 00:32:49.424 ************************************ 00:32:49.424 12:35:12 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:32:49.424 12:35:12 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:49.424 12:35:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:49.424 12:35:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:49.424 ************************************ 00:32:49.424 START TEST raid_state_function_test_sb 00:32:49.424 ************************************ 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 true 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev3 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # echo BaseBdev4 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:32:49.424 12:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=218911 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 218911' 00:32:49.424 Process raid pid: 218911 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 218911 /var/tmp/spdk-raid.sock 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 218911 ']' 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:49.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:49.424 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:49.424 [2024-06-07 12:35:13.034257] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
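The verify_raid_bdev_state traces that recur throughout these tests reduce to a fetch-and-compare over the RPC socket. A minimal bash sketch of that pattern, assuming the rpc.py path and socket shown in the traces above (an illustrative condensation, not the upstream bdev_raid.sh helper):

    # Sketch only: condensed form of the verify_raid_bdev_state pattern traced
    # in this log. Socket path, RPC method and jq filter are taken verbatim
    # from the traces; the expected values mirror one call site above.
    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "Existed_Raid")')
    [ "$(jq -r '.state' <<<"$info")" = "configuring" ] &&
    [ "$(jq -r '.raid_level' <<<"$info")" = "raid1" ] &&
    [ "$(jq -r '.num_base_bdevs' <<<"$info")" = "4" ]

The _sb variant that starts here exercises the same state machine but passes -s to bdev_raid_create so the array carries an on-disk superblock; that is consistent with the base bdevs in the dumps below reporting data_offset 2048 and data_size 63488 instead of the 0/65536 seen in the non-superblock run.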
00:32:49.424 [2024-06-07 12:35:13.034703] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:49.683 [2024-06-07 12:35:13.177057] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:49.683 [2024-06-07 12:35:13.277497] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:49.942 [2024-06-07 12:35:13.361786] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:49.942 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:49.942 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:32:49.942 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:50.202 [2024-06-07 12:35:13.625542] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:50.202 [2024-06-07 12:35:13.625887] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:50.202 [2024-06-07 12:35:13.625993] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:50.202 [2024-06-07 12:35:13.626058] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:50.202 [2024-06-07 12:35:13.626154] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:50.202 [2024-06-07 12:35:13.626263] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:50.202 [2024-06-07 12:35:13.626306] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:32:50.202 [2024-06-07 12:35:13.626462] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:50.202 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:50.461 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:50.461 "name": "Existed_Raid", 00:32:50.461 "uuid": "9eab1ece-2405-4662-b3b8-5d2428f7dc71", 00:32:50.461 "strip_size_kb": 0, 00:32:50.461 "state": "configuring", 00:32:50.461 "raid_level": "raid1", 00:32:50.461 "superblock": true, 00:32:50.461 "num_base_bdevs": 4, 00:32:50.461 "num_base_bdevs_discovered": 0, 00:32:50.461 "num_base_bdevs_operational": 4, 00:32:50.461 "base_bdevs_list": [ 00:32:50.461 { 00:32:50.461 "name": "BaseBdev1", 00:32:50.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:50.461 "is_configured": false, 00:32:50.461 "data_offset": 0, 00:32:50.461 "data_size": 0 00:32:50.461 }, 00:32:50.461 { 00:32:50.461 "name": "BaseBdev2", 00:32:50.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:50.461 "is_configured": false, 00:32:50.461 "data_offset": 0, 00:32:50.461 "data_size": 0 00:32:50.461 }, 00:32:50.461 { 00:32:50.461 "name": "BaseBdev3", 00:32:50.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:50.461 "is_configured": false, 00:32:50.461 "data_offset": 0, 00:32:50.461 "data_size": 0 00:32:50.461 }, 00:32:50.461 { 00:32:50.461 "name": "BaseBdev4", 00:32:50.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:50.461 "is_configured": false, 00:32:50.461 "data_offset": 0, 00:32:50.461 "data_size": 0 00:32:50.461 } 00:32:50.461 ] 00:32:50.461 }' 00:32:50.461 12:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:50.461 12:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:51.028 12:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:51.286 [2024-06-07 12:35:14.757579] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:51.286 [2024-06-07 12:35:14.757840] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:32:51.286 12:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:51.544 [2024-06-07 12:35:14.981672] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:51.544 [2024-06-07 12:35:14.982005] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:51.544 [2024-06-07 12:35:14.982120] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:51.544 [2024-06-07 12:35:14.982206] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:51.544 [2024-06-07 12:35:14.982343] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:51.544 [2024-06-07 12:35:14.982415] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:51.544 [2024-06-07 12:35:14.982526] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:32:51.544 [2024-06-07 12:35:14.982601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:32:51.544 12:35:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:32:51.802 [2024-06-07 12:35:15.197224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:51.802 BaseBdev1 00:32:51.802 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:32:51.802 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:32:51.802 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:51.802 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:51.802 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:51.802 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:51.802 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:52.061 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:32:52.320 [ 00:32:52.320 { 00:32:52.320 "name": "BaseBdev1", 00:32:52.320 "aliases": [ 00:32:52.320 "017dae86-dba1-46b4-a366-d5c7264fde36" 00:32:52.320 ], 00:32:52.320 "product_name": "Malloc disk", 00:32:52.320 "block_size": 512, 00:32:52.320 "num_blocks": 65536, 00:32:52.320 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:32:52.320 "assigned_rate_limits": { 00:32:52.320 "rw_ios_per_sec": 0, 00:32:52.320 "rw_mbytes_per_sec": 0, 00:32:52.320 "r_mbytes_per_sec": 0, 00:32:52.320 "w_mbytes_per_sec": 0 00:32:52.320 }, 00:32:52.320 "claimed": true, 00:32:52.320 "claim_type": "exclusive_write", 00:32:52.320 "zoned": false, 00:32:52.320 "supported_io_types": { 00:32:52.320 "read": true, 00:32:52.320 "write": true, 00:32:52.320 "unmap": true, 00:32:52.320 "write_zeroes": true, 00:32:52.320 "flush": true, 00:32:52.320 "reset": true, 00:32:52.320 "compare": false, 00:32:52.320 "compare_and_write": false, 00:32:52.320 "abort": true, 00:32:52.320 "nvme_admin": false, 00:32:52.320 "nvme_io": false 00:32:52.320 }, 00:32:52.320 "memory_domains": [ 00:32:52.320 { 00:32:52.320 "dma_device_id": "system", 00:32:52.320 "dma_device_type": 1 00:32:52.320 }, 00:32:52.320 { 00:32:52.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:52.320 "dma_device_type": 2 00:32:52.320 } 00:32:52.320 ], 00:32:52.320 "driver_specific": {} 00:32:52.320 } 00:32:52.320 ] 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:52.320 12:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:52.579 12:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:52.579 "name": "Existed_Raid", 00:32:52.579 "uuid": "b5b75a47-448a-456a-b4d7-694da337da5c", 00:32:52.579 "strip_size_kb": 0, 00:32:52.579 "state": "configuring", 00:32:52.579 "raid_level": "raid1", 00:32:52.579 "superblock": true, 00:32:52.579 "num_base_bdevs": 4, 00:32:52.579 "num_base_bdevs_discovered": 1, 00:32:52.579 "num_base_bdevs_operational": 4, 00:32:52.579 "base_bdevs_list": [ 00:32:52.579 { 00:32:52.579 "name": "BaseBdev1", 00:32:52.579 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:32:52.579 "is_configured": true, 00:32:52.579 "data_offset": 2048, 00:32:52.579 "data_size": 63488 00:32:52.579 }, 00:32:52.579 { 00:32:52.579 "name": "BaseBdev2", 00:32:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:52.579 "is_configured": false, 00:32:52.579 "data_offset": 0, 00:32:52.579 "data_size": 0 00:32:52.579 }, 00:32:52.579 { 00:32:52.579 "name": "BaseBdev3", 00:32:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:52.579 "is_configured": false, 00:32:52.579 "data_offset": 0, 00:32:52.579 "data_size": 0 00:32:52.579 }, 00:32:52.579 { 00:32:52.579 "name": "BaseBdev4", 00:32:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:52.579 "is_configured": false, 00:32:52.579 "data_offset": 0, 00:32:52.579 "data_size": 0 00:32:52.579 } 00:32:52.579 ] 00:32:52.579 }' 00:32:52.579 12:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:52.579 12:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:53.146 12:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:53.403 [2024-06-07 12:35:16.989574] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:53.403 [2024-06-07 12:35:16.989958] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:32:53.403 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:32:53.661 [2024-06-07 12:35:17.253724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:53.661 [2024-06-07 12:35:17.256155] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:53.661 [2024-06-07 12:35:17.256402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:53.662 
[2024-06-07 12:35:17.256509] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:32:53.662 [2024-06-07 12:35:17.256576] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:32:53.662 [2024-06-07 12:35:17.256652] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:32:53.662 [2024-06-07 12:35:17.256707] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:53.662 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:54.229 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:54.230 "name": "Existed_Raid", 00:32:54.230 "uuid": "aa8979da-d8b9-4d6c-851a-ab52076e661f", 00:32:54.230 "strip_size_kb": 0, 00:32:54.230 "state": "configuring", 00:32:54.230 "raid_level": "raid1", 00:32:54.230 "superblock": true, 00:32:54.230 "num_base_bdevs": 4, 00:32:54.230 "num_base_bdevs_discovered": 1, 00:32:54.230 "num_base_bdevs_operational": 4, 00:32:54.230 "base_bdevs_list": [ 00:32:54.230 { 00:32:54.230 "name": "BaseBdev1", 00:32:54.230 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:32:54.230 "is_configured": true, 00:32:54.230 "data_offset": 2048, 00:32:54.230 "data_size": 63488 00:32:54.230 }, 00:32:54.230 { 00:32:54.230 "name": "BaseBdev2", 00:32:54.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:54.230 "is_configured": false, 00:32:54.230 "data_offset": 0, 00:32:54.230 "data_size": 0 00:32:54.230 }, 00:32:54.230 { 00:32:54.230 "name": "BaseBdev3", 00:32:54.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:54.230 "is_configured": false, 00:32:54.230 "data_offset": 0, 00:32:54.230 "data_size": 0 00:32:54.230 }, 00:32:54.230 { 00:32:54.230 "name": "BaseBdev4", 00:32:54.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:54.230 "is_configured": 
false, 00:32:54.230 "data_offset": 0, 00:32:54.230 "data_size": 0 00:32:54.230 } 00:32:54.230 ] 00:32:54.230 }' 00:32:54.230 12:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:54.230 12:35:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:54.796 12:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:32:55.053 [2024-06-07 12:35:18.643871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:55.053 BaseBdev2 00:32:55.053 12:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:32:55.053 12:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:32:55.053 12:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:55.053 12:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:55.053 12:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:55.053 12:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:55.053 12:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:55.619 12:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:32:55.877 [ 00:32:55.877 { 00:32:55.877 "name": "BaseBdev2", 00:32:55.877 "aliases": [ 00:32:55.877 "a294ee78-a01f-41a9-a5d5-2b74c8507861" 00:32:55.877 ], 00:32:55.877 "product_name": "Malloc disk", 00:32:55.877 "block_size": 512, 00:32:55.877 "num_blocks": 65536, 00:32:55.877 "uuid": "a294ee78-a01f-41a9-a5d5-2b74c8507861", 00:32:55.877 "assigned_rate_limits": { 00:32:55.877 "rw_ios_per_sec": 0, 00:32:55.877 "rw_mbytes_per_sec": 0, 00:32:55.877 "r_mbytes_per_sec": 0, 00:32:55.877 "w_mbytes_per_sec": 0 00:32:55.877 }, 00:32:55.877 "claimed": true, 00:32:55.877 "claim_type": "exclusive_write", 00:32:55.877 "zoned": false, 00:32:55.877 "supported_io_types": { 00:32:55.877 "read": true, 00:32:55.877 "write": true, 00:32:55.877 "unmap": true, 00:32:55.877 "write_zeroes": true, 00:32:55.877 "flush": true, 00:32:55.877 "reset": true, 00:32:55.877 "compare": false, 00:32:55.877 "compare_and_write": false, 00:32:55.877 "abort": true, 00:32:55.877 "nvme_admin": false, 00:32:55.877 "nvme_io": false 00:32:55.877 }, 00:32:55.877 "memory_domains": [ 00:32:55.877 { 00:32:55.877 "dma_device_id": "system", 00:32:55.877 "dma_device_type": 1 00:32:55.877 }, 00:32:55.877 { 00:32:55.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:55.877 "dma_device_type": 2 00:32:55.877 } 00:32:55.877 ], 00:32:55.877 "driver_specific": {} 00:32:55.877 } 00:32:55.877 ] 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state 
Existed_Raid configuring raid1 0 4 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:55.877 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:56.136 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:56.136 "name": "Existed_Raid", 00:32:56.136 "uuid": "aa8979da-d8b9-4d6c-851a-ab52076e661f", 00:32:56.136 "strip_size_kb": 0, 00:32:56.136 "state": "configuring", 00:32:56.136 "raid_level": "raid1", 00:32:56.136 "superblock": true, 00:32:56.136 "num_base_bdevs": 4, 00:32:56.136 "num_base_bdevs_discovered": 2, 00:32:56.136 "num_base_bdevs_operational": 4, 00:32:56.136 "base_bdevs_list": [ 00:32:56.136 { 00:32:56.136 "name": "BaseBdev1", 00:32:56.136 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:32:56.136 "is_configured": true, 00:32:56.136 "data_offset": 2048, 00:32:56.136 "data_size": 63488 00:32:56.136 }, 00:32:56.136 { 00:32:56.136 "name": "BaseBdev2", 00:32:56.136 "uuid": "a294ee78-a01f-41a9-a5d5-2b74c8507861", 00:32:56.136 "is_configured": true, 00:32:56.136 "data_offset": 2048, 00:32:56.136 "data_size": 63488 00:32:56.136 }, 00:32:56.136 { 00:32:56.136 "name": "BaseBdev3", 00:32:56.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:56.136 "is_configured": false, 00:32:56.136 "data_offset": 0, 00:32:56.136 "data_size": 0 00:32:56.136 }, 00:32:56.136 { 00:32:56.136 "name": "BaseBdev4", 00:32:56.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:56.136 "is_configured": false, 00:32:56.136 "data_offset": 0, 00:32:56.136 "data_size": 0 00:32:56.136 } 00:32:56.136 ] 00:32:56.136 }' 00:32:56.136 12:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:56.136 12:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:57.071 12:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:32:57.071 [2024-06-07 12:35:20.666806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:32:57.071 BaseBdev3 00:32:57.071 12:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:32:57.071 12:35:20 
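
The same create-then-wait idiom repeats for every member bdev; waitforbdev is a helper from common/autotest_common.sh. Condensed to the RPCs it is seen issuing in this trace (a sketch of the visible shape, not the helper itself):

  RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  $RPC bdev_malloc_create 32 512 -b BaseBdev3   # 32 MB volume, 512-byte blocks
  $RPC bdev_wait_for_examine                    # let pending examine callbacks finish
  $RPC bdev_get_bdevs -b BaseBdev3 -t 2000      # wait up to 2000 ms for the bdev
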
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:32:57.071 12:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:57.071 12:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:57.071 12:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:57.071 12:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:57.071 12:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:57.638 12:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:32:57.897 [ 00:32:57.897 { 00:32:57.897 "name": "BaseBdev3", 00:32:57.897 "aliases": [ 00:32:57.897 "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b" 00:32:57.897 ], 00:32:57.897 "product_name": "Malloc disk", 00:32:57.897 "block_size": 512, 00:32:57.897 "num_blocks": 65536, 00:32:57.897 "uuid": "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b", 00:32:57.897 "assigned_rate_limits": { 00:32:57.897 "rw_ios_per_sec": 0, 00:32:57.897 "rw_mbytes_per_sec": 0, 00:32:57.897 "r_mbytes_per_sec": 0, 00:32:57.897 "w_mbytes_per_sec": 0 00:32:57.897 }, 00:32:57.897 "claimed": true, 00:32:57.897 "claim_type": "exclusive_write", 00:32:57.897 "zoned": false, 00:32:57.897 "supported_io_types": { 00:32:57.897 "read": true, 00:32:57.897 "write": true, 00:32:57.897 "unmap": true, 00:32:57.897 "write_zeroes": true, 00:32:57.897 "flush": true, 00:32:57.897 "reset": true, 00:32:57.897 "compare": false, 00:32:57.897 "compare_and_write": false, 00:32:57.897 "abort": true, 00:32:57.897 "nvme_admin": false, 00:32:57.897 "nvme_io": false 00:32:57.897 }, 00:32:57.897 "memory_domains": [ 00:32:57.897 { 00:32:57.897 "dma_device_id": "system", 00:32:57.897 "dma_device_type": 1 00:32:57.897 }, 00:32:57.897 { 00:32:57.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:57.897 "dma_device_type": 2 00:32:57.897 } 00:32:57.897 ], 00:32:57.897 "driver_specific": {} 00:32:57.897 } 00:32:57.897 ] 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:57.897 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:57.898 12:35:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:57.898 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:57.898 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:57.898 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:57.898 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:58.157 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:58.157 "name": "Existed_Raid", 00:32:58.157 "uuid": "aa8979da-d8b9-4d6c-851a-ab52076e661f", 00:32:58.157 "strip_size_kb": 0, 00:32:58.157 "state": "configuring", 00:32:58.157 "raid_level": "raid1", 00:32:58.157 "superblock": true, 00:32:58.157 "num_base_bdevs": 4, 00:32:58.157 "num_base_bdevs_discovered": 3, 00:32:58.157 "num_base_bdevs_operational": 4, 00:32:58.157 "base_bdevs_list": [ 00:32:58.157 { 00:32:58.157 "name": "BaseBdev1", 00:32:58.157 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:32:58.157 "is_configured": true, 00:32:58.157 "data_offset": 2048, 00:32:58.157 "data_size": 63488 00:32:58.157 }, 00:32:58.157 { 00:32:58.157 "name": "BaseBdev2", 00:32:58.157 "uuid": "a294ee78-a01f-41a9-a5d5-2b74c8507861", 00:32:58.157 "is_configured": true, 00:32:58.157 "data_offset": 2048, 00:32:58.157 "data_size": 63488 00:32:58.157 }, 00:32:58.157 { 00:32:58.157 "name": "BaseBdev3", 00:32:58.157 "uuid": "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b", 00:32:58.157 "is_configured": true, 00:32:58.157 "data_offset": 2048, 00:32:58.157 "data_size": 63488 00:32:58.157 }, 00:32:58.157 { 00:32:58.157 "name": "BaseBdev4", 00:32:58.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:58.157 "is_configured": false, 00:32:58.157 "data_offset": 0, 00:32:58.157 "data_size": 0 00:32:58.157 } 00:32:58.157 ] 00:32:58.157 }' 00:32:58.157 12:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:58.157 12:35:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:32:58.767 12:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:32:59.030 [2024-06-07 12:35:22.541206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:32:59.030 [2024-06-07 12:35:22.541789] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:32:59.030 [2024-06-07 12:35:22.541962] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:32:59.030 [2024-06-07 12:35:22.542148] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:32:59.030 [2024-06-07 12:35:22.542680] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:32:59.030 [2024-06-07 12:35:22.542794] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:32:59.030 [2024-06-07 12:35:22.543010] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:59.030 BaseBdev4 00:32:59.030 12:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:32:59.030 12:35:22 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:32:59.030 12:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:32:59.030 12:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:32:59.030 12:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:32:59.030 12:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:32:59.030 12:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:59.288 12:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:32:59.551 [ 00:32:59.551 { 00:32:59.551 "name": "BaseBdev4", 00:32:59.551 "aliases": [ 00:32:59.551 "23909905-8fb0-4a97-9374-d9cee1c386ca" 00:32:59.551 ], 00:32:59.551 "product_name": "Malloc disk", 00:32:59.551 "block_size": 512, 00:32:59.551 "num_blocks": 65536, 00:32:59.551 "uuid": "23909905-8fb0-4a97-9374-d9cee1c386ca", 00:32:59.551 "assigned_rate_limits": { 00:32:59.551 "rw_ios_per_sec": 0, 00:32:59.551 "rw_mbytes_per_sec": 0, 00:32:59.551 "r_mbytes_per_sec": 0, 00:32:59.551 "w_mbytes_per_sec": 0 00:32:59.551 }, 00:32:59.551 "claimed": true, 00:32:59.551 "claim_type": "exclusive_write", 00:32:59.551 "zoned": false, 00:32:59.551 "supported_io_types": { 00:32:59.551 "read": true, 00:32:59.551 "write": true, 00:32:59.551 "unmap": true, 00:32:59.551 "write_zeroes": true, 00:32:59.551 "flush": true, 00:32:59.551 "reset": true, 00:32:59.551 "compare": false, 00:32:59.551 "compare_and_write": false, 00:32:59.551 "abort": true, 00:32:59.551 "nvme_admin": false, 00:32:59.551 "nvme_io": false 00:32:59.551 }, 00:32:59.551 "memory_domains": [ 00:32:59.551 { 00:32:59.551 "dma_device_id": "system", 00:32:59.551 "dma_device_type": 1 00:32:59.551 }, 00:32:59.551 { 00:32:59.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:59.551 "dma_device_type": 2 00:32:59.551 } 00:32:59.551 ], 00:32:59.551 "driver_specific": {} 00:32:59.551 } 00:32:59.551 ] 00:32:59.551 12:35:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:32:59.551 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:59.551 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:59.551 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:32:59.551 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:59.551 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:59.551 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
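
verify_raid_bdev_state boils down to fetching the raid bdev's JSON and comparing fields against the expected values. Roughly, under the same socket assumption as above (a sketch, not the exact helper from bdev_raid.sh):

  RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  [[ $(jq -r '.state'      <<< "$info") == online ]]
  [[ $(jq -r '.raid_level' <<< "$info") == raid1  ]]
  [[ $(jq -r '.num_base_bdevs_discovered' <<< "$info") -eq 4 ]]

With all four base bdevs claimed, the expected state flips from configuring to online, which the dump that follows confirms.
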
00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:59.552 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:00.119 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:00.119 "name": "Existed_Raid", 00:33:00.119 "uuid": "aa8979da-d8b9-4d6c-851a-ab52076e661f", 00:33:00.119 "strip_size_kb": 0, 00:33:00.119 "state": "online", 00:33:00.119 "raid_level": "raid1", 00:33:00.119 "superblock": true, 00:33:00.119 "num_base_bdevs": 4, 00:33:00.119 "num_base_bdevs_discovered": 4, 00:33:00.119 "num_base_bdevs_operational": 4, 00:33:00.119 "base_bdevs_list": [ 00:33:00.119 { 00:33:00.119 "name": "BaseBdev1", 00:33:00.119 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:33:00.119 "is_configured": true, 00:33:00.119 "data_offset": 2048, 00:33:00.119 "data_size": 63488 00:33:00.119 }, 00:33:00.119 { 00:33:00.119 "name": "BaseBdev2", 00:33:00.119 "uuid": "a294ee78-a01f-41a9-a5d5-2b74c8507861", 00:33:00.119 "is_configured": true, 00:33:00.119 "data_offset": 2048, 00:33:00.119 "data_size": 63488 00:33:00.119 }, 00:33:00.119 { 00:33:00.119 "name": "BaseBdev3", 00:33:00.119 "uuid": "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b", 00:33:00.119 "is_configured": true, 00:33:00.119 "data_offset": 2048, 00:33:00.119 "data_size": 63488 00:33:00.119 }, 00:33:00.119 { 00:33:00.119 "name": "BaseBdev4", 00:33:00.119 "uuid": "23909905-8fb0-4a97-9374-d9cee1c386ca", 00:33:00.119 "is_configured": true, 00:33:00.119 "data_offset": 2048, 00:33:00.119 "data_size": 63488 00:33:00.119 } 00:33:00.119 ] 00:33:00.119 }' 00:33:00.119 12:35:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:00.119 12:35:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:33:00.686 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:00.686 [2024-06-07 12:35:24.311825] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:00.944 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:00.944 "name": "Existed_Raid", 00:33:00.944 "aliases": [ 00:33:00.944 "aa8979da-d8b9-4d6c-851a-ab52076e661f" 00:33:00.944 ], 00:33:00.944 
"product_name": "Raid Volume", 00:33:00.944 "block_size": 512, 00:33:00.944 "num_blocks": 63488, 00:33:00.944 "uuid": "aa8979da-d8b9-4d6c-851a-ab52076e661f", 00:33:00.944 "assigned_rate_limits": { 00:33:00.944 "rw_ios_per_sec": 0, 00:33:00.944 "rw_mbytes_per_sec": 0, 00:33:00.944 "r_mbytes_per_sec": 0, 00:33:00.944 "w_mbytes_per_sec": 0 00:33:00.944 }, 00:33:00.944 "claimed": false, 00:33:00.944 "zoned": false, 00:33:00.944 "supported_io_types": { 00:33:00.944 "read": true, 00:33:00.944 "write": true, 00:33:00.944 "unmap": false, 00:33:00.944 "write_zeroes": true, 00:33:00.944 "flush": false, 00:33:00.944 "reset": true, 00:33:00.944 "compare": false, 00:33:00.944 "compare_and_write": false, 00:33:00.944 "abort": false, 00:33:00.944 "nvme_admin": false, 00:33:00.944 "nvme_io": false 00:33:00.944 }, 00:33:00.944 "memory_domains": [ 00:33:00.944 { 00:33:00.944 "dma_device_id": "system", 00:33:00.944 "dma_device_type": 1 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:00.944 "dma_device_type": 2 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "dma_device_id": "system", 00:33:00.944 "dma_device_type": 1 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:00.944 "dma_device_type": 2 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "dma_device_id": "system", 00:33:00.944 "dma_device_type": 1 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:00.944 "dma_device_type": 2 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "dma_device_id": "system", 00:33:00.944 "dma_device_type": 1 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:00.944 "dma_device_type": 2 00:33:00.944 } 00:33:00.944 ], 00:33:00.944 "driver_specific": { 00:33:00.944 "raid": { 00:33:00.944 "uuid": "aa8979da-d8b9-4d6c-851a-ab52076e661f", 00:33:00.944 "strip_size_kb": 0, 00:33:00.944 "state": "online", 00:33:00.944 "raid_level": "raid1", 00:33:00.944 "superblock": true, 00:33:00.944 "num_base_bdevs": 4, 00:33:00.944 "num_base_bdevs_discovered": 4, 00:33:00.944 "num_base_bdevs_operational": 4, 00:33:00.944 "base_bdevs_list": [ 00:33:00.944 { 00:33:00.944 "name": "BaseBdev1", 00:33:00.944 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:33:00.944 "is_configured": true, 00:33:00.944 "data_offset": 2048, 00:33:00.944 "data_size": 63488 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "name": "BaseBdev2", 00:33:00.944 "uuid": "a294ee78-a01f-41a9-a5d5-2b74c8507861", 00:33:00.944 "is_configured": true, 00:33:00.944 "data_offset": 2048, 00:33:00.944 "data_size": 63488 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "name": "BaseBdev3", 00:33:00.944 "uuid": "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b", 00:33:00.944 "is_configured": true, 00:33:00.944 "data_offset": 2048, 00:33:00.944 "data_size": 63488 00:33:00.944 }, 00:33:00.944 { 00:33:00.944 "name": "BaseBdev4", 00:33:00.944 "uuid": "23909905-8fb0-4a97-9374-d9cee1c386ca", 00:33:00.944 "is_configured": true, 00:33:00.944 "data_offset": 2048, 00:33:00.944 "data_size": 63488 00:33:00.944 } 00:33:00.944 ] 00:33:00.944 } 00:33:00.944 } 00:33:00.944 }' 00:33:00.944 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:00.944 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:33:00.944 BaseBdev2 00:33:00.944 BaseBdev3 00:33:00.944 BaseBdev4' 00:33:00.944 12:35:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:00.945 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:33:00.945 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:01.202 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:01.202 "name": "BaseBdev1", 00:33:01.202 "aliases": [ 00:33:01.202 "017dae86-dba1-46b4-a366-d5c7264fde36" 00:33:01.202 ], 00:33:01.202 "product_name": "Malloc disk", 00:33:01.202 "block_size": 512, 00:33:01.202 "num_blocks": 65536, 00:33:01.202 "uuid": "017dae86-dba1-46b4-a366-d5c7264fde36", 00:33:01.202 "assigned_rate_limits": { 00:33:01.202 "rw_ios_per_sec": 0, 00:33:01.202 "rw_mbytes_per_sec": 0, 00:33:01.202 "r_mbytes_per_sec": 0, 00:33:01.202 "w_mbytes_per_sec": 0 00:33:01.202 }, 00:33:01.202 "claimed": true, 00:33:01.202 "claim_type": "exclusive_write", 00:33:01.202 "zoned": false, 00:33:01.202 "supported_io_types": { 00:33:01.202 "read": true, 00:33:01.202 "write": true, 00:33:01.202 "unmap": true, 00:33:01.202 "write_zeroes": true, 00:33:01.202 "flush": true, 00:33:01.202 "reset": true, 00:33:01.202 "compare": false, 00:33:01.202 "compare_and_write": false, 00:33:01.202 "abort": true, 00:33:01.202 "nvme_admin": false, 00:33:01.202 "nvme_io": false 00:33:01.202 }, 00:33:01.202 "memory_domains": [ 00:33:01.202 { 00:33:01.202 "dma_device_id": "system", 00:33:01.202 "dma_device_type": 1 00:33:01.202 }, 00:33:01.202 { 00:33:01.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:01.202 "dma_device_type": 2 00:33:01.202 } 00:33:01.202 ], 00:33:01.202 "driver_specific": {} 00:33:01.202 }' 00:33:01.202 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:01.202 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:01.202 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:01.202 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:01.460 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:01.460 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:01.460 12:35:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:01.460 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:01.460 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:01.460 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:01.774 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:01.774 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:01.774 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:01.774 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:33:01.774 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:02.033 12:35:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:02.033 "name": "BaseBdev2", 00:33:02.033 "aliases": [ 00:33:02.033 "a294ee78-a01f-41a9-a5d5-2b74c8507861" 00:33:02.033 ], 00:33:02.033 "product_name": "Malloc disk", 00:33:02.033 "block_size": 512, 00:33:02.033 "num_blocks": 65536, 00:33:02.033 "uuid": "a294ee78-a01f-41a9-a5d5-2b74c8507861", 00:33:02.033 "assigned_rate_limits": { 00:33:02.033 "rw_ios_per_sec": 0, 00:33:02.033 "rw_mbytes_per_sec": 0, 00:33:02.033 "r_mbytes_per_sec": 0, 00:33:02.033 "w_mbytes_per_sec": 0 00:33:02.033 }, 00:33:02.033 "claimed": true, 00:33:02.033 "claim_type": "exclusive_write", 00:33:02.033 "zoned": false, 00:33:02.033 "supported_io_types": { 00:33:02.033 "read": true, 00:33:02.033 "write": true, 00:33:02.033 "unmap": true, 00:33:02.033 "write_zeroes": true, 00:33:02.033 "flush": true, 00:33:02.033 "reset": true, 00:33:02.033 "compare": false, 00:33:02.033 "compare_and_write": false, 00:33:02.033 "abort": true, 00:33:02.033 "nvme_admin": false, 00:33:02.033 "nvme_io": false 00:33:02.033 }, 00:33:02.033 "memory_domains": [ 00:33:02.033 { 00:33:02.033 "dma_device_id": "system", 00:33:02.033 "dma_device_type": 1 00:33:02.033 }, 00:33:02.033 { 00:33:02.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:02.033 "dma_device_type": 2 00:33:02.033 } 00:33:02.033 ], 00:33:02.033 "driver_specific": {} 00:33:02.033 }' 00:33:02.033 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:02.033 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:02.291 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:02.548 12:35:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:02.548 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:02.548 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:02.548 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:33:02.548 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:02.805 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:02.805 "name": "BaseBdev3", 00:33:02.805 "aliases": [ 00:33:02.805 "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b" 00:33:02.805 ], 00:33:02.805 "product_name": "Malloc disk", 00:33:02.805 "block_size": 512, 00:33:02.805 "num_blocks": 65536, 00:33:02.805 "uuid": "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b", 00:33:02.805 "assigned_rate_limits": { 00:33:02.805 "rw_ios_per_sec": 0, 00:33:02.805 "rw_mbytes_per_sec": 0, 
00:33:02.805 "r_mbytes_per_sec": 0, 00:33:02.805 "w_mbytes_per_sec": 0 00:33:02.805 }, 00:33:02.805 "claimed": true, 00:33:02.805 "claim_type": "exclusive_write", 00:33:02.805 "zoned": false, 00:33:02.805 "supported_io_types": { 00:33:02.805 "read": true, 00:33:02.805 "write": true, 00:33:02.805 "unmap": true, 00:33:02.805 "write_zeroes": true, 00:33:02.805 "flush": true, 00:33:02.805 "reset": true, 00:33:02.805 "compare": false, 00:33:02.805 "compare_and_write": false, 00:33:02.805 "abort": true, 00:33:02.805 "nvme_admin": false, 00:33:02.805 "nvme_io": false 00:33:02.805 }, 00:33:02.805 "memory_domains": [ 00:33:02.805 { 00:33:02.805 "dma_device_id": "system", 00:33:02.805 "dma_device_type": 1 00:33:02.805 }, 00:33:02.805 { 00:33:02.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:02.805 "dma_device_type": 2 00:33:02.805 } 00:33:02.805 ], 00:33:02.805 "driver_specific": {} 00:33:02.805 }' 00:33:02.805 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:03.063 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:03.063 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:03.063 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:03.063 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:03.063 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:03.063 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:03.063 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:03.320 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:03.320 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:03.320 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:03.320 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:03.320 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:03.320 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:33:03.320 12:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:03.578 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:03.578 "name": "BaseBdev4", 00:33:03.578 "aliases": [ 00:33:03.578 "23909905-8fb0-4a97-9374-d9cee1c386ca" 00:33:03.578 ], 00:33:03.578 "product_name": "Malloc disk", 00:33:03.578 "block_size": 512, 00:33:03.578 "num_blocks": 65536, 00:33:03.578 "uuid": "23909905-8fb0-4a97-9374-d9cee1c386ca", 00:33:03.578 "assigned_rate_limits": { 00:33:03.578 "rw_ios_per_sec": 0, 00:33:03.578 "rw_mbytes_per_sec": 0, 00:33:03.578 "r_mbytes_per_sec": 0, 00:33:03.578 "w_mbytes_per_sec": 0 00:33:03.578 }, 00:33:03.578 "claimed": true, 00:33:03.578 "claim_type": "exclusive_write", 00:33:03.578 "zoned": false, 00:33:03.578 "supported_io_types": { 00:33:03.578 "read": true, 00:33:03.578 "write": true, 00:33:03.578 "unmap": true, 00:33:03.578 "write_zeroes": true, 00:33:03.578 "flush": true, 00:33:03.578 "reset": true, 00:33:03.578 "compare": false, 00:33:03.578 
"compare_and_write": false, 00:33:03.578 "abort": true, 00:33:03.578 "nvme_admin": false, 00:33:03.578 "nvme_io": false 00:33:03.578 }, 00:33:03.578 "memory_domains": [ 00:33:03.578 { 00:33:03.578 "dma_device_id": "system", 00:33:03.578 "dma_device_type": 1 00:33:03.578 }, 00:33:03.578 { 00:33:03.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:03.578 "dma_device_type": 2 00:33:03.578 } 00:33:03.578 ], 00:33:03.578 "driver_specific": {} 00:33:03.578 }' 00:33:03.578 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:03.578 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:03.835 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:04.092 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:04.093 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:04.093 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:33:04.351 [2024-06-07 12:35:27.768209] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:04.351 12:35:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:04.351 12:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:04.610 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:04.610 "name": "Existed_Raid", 00:33:04.610 "uuid": "aa8979da-d8b9-4d6c-851a-ab52076e661f", 00:33:04.610 "strip_size_kb": 0, 00:33:04.610 "state": "online", 00:33:04.610 "raid_level": "raid1", 00:33:04.610 "superblock": true, 00:33:04.610 "num_base_bdevs": 4, 00:33:04.610 "num_base_bdevs_discovered": 3, 00:33:04.610 "num_base_bdevs_operational": 3, 00:33:04.610 "base_bdevs_list": [ 00:33:04.610 { 00:33:04.610 "name": null, 00:33:04.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:04.610 "is_configured": false, 00:33:04.610 "data_offset": 2048, 00:33:04.610 "data_size": 63488 00:33:04.610 }, 00:33:04.610 { 00:33:04.610 "name": "BaseBdev2", 00:33:04.610 "uuid": "a294ee78-a01f-41a9-a5d5-2b74c8507861", 00:33:04.610 "is_configured": true, 00:33:04.610 "data_offset": 2048, 00:33:04.610 "data_size": 63488 00:33:04.610 }, 00:33:04.610 { 00:33:04.610 "name": "BaseBdev3", 00:33:04.610 "uuid": "ece7cc63-6f6f-45a6-809b-5bf0ba83f94b", 00:33:04.610 "is_configured": true, 00:33:04.610 "data_offset": 2048, 00:33:04.610 "data_size": 63488 00:33:04.610 }, 00:33:04.610 { 00:33:04.610 "name": "BaseBdev4", 00:33:04.610 "uuid": "23909905-8fb0-4a97-9374-d9cee1c386ca", 00:33:04.610 "is_configured": true, 00:33:04.610 "data_offset": 2048, 00:33:04.610 "data_size": 63488 00:33:04.610 } 00:33:04.610 ] 00:33:04.610 }' 00:33:04.610 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:04.610 12:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:05.176 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:33:05.176 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:05.176 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:05.176 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:05.176 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:05.176 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:05.176 12:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:33:05.434 [2024-06-07 12:35:28.992369] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:33:05.434 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:05.434 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:05.434 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:05.434 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:05.692 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:05.692 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:05.692 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:33:05.951 [2024-06-07 12:35:29.510643] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:33:05.952 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:05.952 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:05.952 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:05.952 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:06.210 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:06.210 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:06.210 12:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:33:06.778 [2024-06-07 12:35:30.124086] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:33:06.778 [2024-06-07 12:35:30.124454] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:06.778 [2024-06-07 12:35:30.147925] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:06.778 [2024-06-07 12:35:30.161348] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:06.778 [2024-06-07 12:35:30.161559] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:33:06.778 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:06.778 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:06.778 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:33:06.778 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:07.036 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:33:07.036 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:33:07.036 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:33:07.036 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:33:07.036 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:07.036 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:33:07.293 BaseBdev2 00:33:07.293 12:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:33:07.293 12:35:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:33:07.293 12:35:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:07.293 12:35:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:07.293 12:35:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:07.293 12:35:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:07.293 12:35:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:07.551 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:33:07.808 [ 00:33:07.808 { 00:33:07.808 "name": "BaseBdev2", 00:33:07.808 "aliases": [ 00:33:07.808 "e32e08ad-961b-49ed-845a-9e4a09513e14" 00:33:07.808 ], 00:33:07.808 "product_name": "Malloc disk", 00:33:07.808 "block_size": 512, 00:33:07.808 "num_blocks": 65536, 00:33:07.808 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:07.808 "assigned_rate_limits": { 00:33:07.808 "rw_ios_per_sec": 0, 00:33:07.808 "rw_mbytes_per_sec": 0, 00:33:07.808 "r_mbytes_per_sec": 0, 00:33:07.808 "w_mbytes_per_sec": 0 00:33:07.808 }, 00:33:07.808 "claimed": false, 00:33:07.808 "zoned": false, 00:33:07.808 "supported_io_types": { 00:33:07.808 "read": true, 00:33:07.808 "write": true, 00:33:07.808 "unmap": true, 00:33:07.808 "write_zeroes": true, 00:33:07.808 "flush": true, 00:33:07.808 "reset": true, 00:33:07.808 "compare": false, 00:33:07.808 "compare_and_write": false, 00:33:07.808 "abort": true, 00:33:07.808 "nvme_admin": false, 00:33:07.808 "nvme_io": false 00:33:07.808 }, 00:33:07.808 "memory_domains": [ 00:33:07.808 { 00:33:07.808 "dma_device_id": "system", 00:33:07.808 "dma_device_type": 1 00:33:07.808 }, 00:33:07.808 { 00:33:07.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:07.808 "dma_device_type": 2 00:33:07.808 } 00:33:07.808 ], 00:33:07.808 "driver_specific": {} 00:33:07.808 } 00:33:07.808 ] 00:33:07.808 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:07.808 12:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:07.808 12:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:07.808 12:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:33:08.065 BaseBdev3 00:33:08.065 12:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:33:08.065 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:33:08.065 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:08.065 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:08.065 12:35:31 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:08.065 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:08.065 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:08.630 12:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:33:08.630 [ 00:33:08.630 { 00:33:08.630 "name": "BaseBdev3", 00:33:08.630 "aliases": [ 00:33:08.630 "75fcd703-e312-4b1b-9f90-60c051f9ed12" 00:33:08.630 ], 00:33:08.630 "product_name": "Malloc disk", 00:33:08.630 "block_size": 512, 00:33:08.630 "num_blocks": 65536, 00:33:08.630 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:08.630 "assigned_rate_limits": { 00:33:08.630 "rw_ios_per_sec": 0, 00:33:08.630 "rw_mbytes_per_sec": 0, 00:33:08.630 "r_mbytes_per_sec": 0, 00:33:08.630 "w_mbytes_per_sec": 0 00:33:08.630 }, 00:33:08.630 "claimed": false, 00:33:08.630 "zoned": false, 00:33:08.630 "supported_io_types": { 00:33:08.630 "read": true, 00:33:08.630 "write": true, 00:33:08.630 "unmap": true, 00:33:08.630 "write_zeroes": true, 00:33:08.630 "flush": true, 00:33:08.630 "reset": true, 00:33:08.630 "compare": false, 00:33:08.630 "compare_and_write": false, 00:33:08.630 "abort": true, 00:33:08.630 "nvme_admin": false, 00:33:08.630 "nvme_io": false 00:33:08.630 }, 00:33:08.630 "memory_domains": [ 00:33:08.630 { 00:33:08.630 "dma_device_id": "system", 00:33:08.630 "dma_device_type": 1 00:33:08.630 }, 00:33:08.630 { 00:33:08.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:08.630 "dma_device_type": 2 00:33:08.630 } 00:33:08.630 ], 00:33:08.630 "driver_specific": {} 00:33:08.630 } 00:33:08.630 ] 00:33:08.630 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:08.630 12:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:08.630 12:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:08.630 12:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:33:08.887 BaseBdev4 00:33:08.887 12:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:33:08.887 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:33:08.887 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:08.887 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:08.888 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:08.888 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:08.888 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:09.146 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:33:09.403 [ 00:33:09.403 { 00:33:09.403 "name": 
"BaseBdev4", 00:33:09.403 "aliases": [ 00:33:09.403 "4944b0af-eefc-4169-9f0b-7e9735009f7e" 00:33:09.403 ], 00:33:09.403 "product_name": "Malloc disk", 00:33:09.403 "block_size": 512, 00:33:09.403 "num_blocks": 65536, 00:33:09.403 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:09.403 "assigned_rate_limits": { 00:33:09.403 "rw_ios_per_sec": 0, 00:33:09.403 "rw_mbytes_per_sec": 0, 00:33:09.403 "r_mbytes_per_sec": 0, 00:33:09.403 "w_mbytes_per_sec": 0 00:33:09.403 }, 00:33:09.403 "claimed": false, 00:33:09.403 "zoned": false, 00:33:09.403 "supported_io_types": { 00:33:09.403 "read": true, 00:33:09.403 "write": true, 00:33:09.403 "unmap": true, 00:33:09.403 "write_zeroes": true, 00:33:09.403 "flush": true, 00:33:09.403 "reset": true, 00:33:09.403 "compare": false, 00:33:09.403 "compare_and_write": false, 00:33:09.403 "abort": true, 00:33:09.403 "nvme_admin": false, 00:33:09.403 "nvme_io": false 00:33:09.403 }, 00:33:09.403 "memory_domains": [ 00:33:09.403 { 00:33:09.403 "dma_device_id": "system", 00:33:09.403 "dma_device_type": 1 00:33:09.403 }, 00:33:09.403 { 00:33:09.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:09.403 "dma_device_type": 2 00:33:09.403 } 00:33:09.403 ], 00:33:09.403 "driver_specific": {} 00:33:09.403 } 00:33:09.403 ] 00:33:09.403 12:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:09.403 12:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:33:09.403 12:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:33:09.403 12:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:33:09.661 [2024-06-07 12:35:33.223331] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:09.661 [2024-06-07 12:35:33.223638] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:09.661 [2024-06-07 12:35:33.223771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:09.661 [2024-06-07 12:35:33.225996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:09.661 [2024-06-07 12:35:33.226177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:09.661 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:09.919 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:09.919 "name": "Existed_Raid", 00:33:09.919 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:09.919 "strip_size_kb": 0, 00:33:09.919 "state": "configuring", 00:33:09.919 "raid_level": "raid1", 00:33:09.919 "superblock": true, 00:33:09.919 "num_base_bdevs": 4, 00:33:09.919 "num_base_bdevs_discovered": 3, 00:33:09.919 "num_base_bdevs_operational": 4, 00:33:09.919 "base_bdevs_list": [ 00:33:09.919 { 00:33:09.919 "name": "BaseBdev1", 00:33:09.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:09.919 "is_configured": false, 00:33:09.919 "data_offset": 0, 00:33:09.919 "data_size": 0 00:33:09.919 }, 00:33:09.919 { 00:33:09.919 "name": "BaseBdev2", 00:33:09.919 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:09.919 "is_configured": true, 00:33:09.919 "data_offset": 2048, 00:33:09.919 "data_size": 63488 00:33:09.919 }, 00:33:09.919 { 00:33:09.919 "name": "BaseBdev3", 00:33:09.919 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:09.919 "is_configured": true, 00:33:09.919 "data_offset": 2048, 00:33:09.919 "data_size": 63488 00:33:09.919 }, 00:33:09.919 { 00:33:09.919 "name": "BaseBdev4", 00:33:09.919 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:09.919 "is_configured": true, 00:33:09.919 "data_offset": 2048, 00:33:09.919 "data_size": 63488 00:33:09.919 } 00:33:09.919 ] 00:33:09.919 }' 00:33:09.919 12:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:09.919 12:35:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:10.490 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:33:10.748 [2024-06-07 12:35:34.315483] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:10.748 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:11.006 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:11.006 "name": "Existed_Raid", 00:33:11.006 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:11.006 "strip_size_kb": 0, 00:33:11.006 "state": "configuring", 00:33:11.006 "raid_level": "raid1", 00:33:11.006 "superblock": true, 00:33:11.006 "num_base_bdevs": 4, 00:33:11.006 "num_base_bdevs_discovered": 2, 00:33:11.006 "num_base_bdevs_operational": 4, 00:33:11.006 "base_bdevs_list": [ 00:33:11.006 { 00:33:11.006 "name": "BaseBdev1", 00:33:11.006 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:11.006 "is_configured": false, 00:33:11.006 "data_offset": 0, 00:33:11.006 "data_size": 0 00:33:11.006 }, 00:33:11.006 { 00:33:11.006 "name": null, 00:33:11.006 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:11.006 "is_configured": false, 00:33:11.006 "data_offset": 2048, 00:33:11.006 "data_size": 63488 00:33:11.006 }, 00:33:11.006 { 00:33:11.006 "name": "BaseBdev3", 00:33:11.006 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:11.006 "is_configured": true, 00:33:11.006 "data_offset": 2048, 00:33:11.006 "data_size": 63488 00:33:11.006 }, 00:33:11.006 { 00:33:11.006 "name": "BaseBdev4", 00:33:11.006 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:11.006 "is_configured": true, 00:33:11.006 "data_offset": 2048, 00:33:11.006 "data_size": 63488 00:33:11.006 } 00:33:11.006 ] 00:33:11.006 }' 00:33:11.006 12:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:11.006 12:35:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:11.572 12:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:11.572 12:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:33:11.830 12:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:33:11.830 12:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:33:12.088 [2024-06-07 12:35:35.637267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:12.088 BaseBdev1 00:33:12.088 12:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:33:12.088 12:35:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:33:12.088 12:35:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:12.088 12:35:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:12.088 12:35:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:12.088 12:35:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:12.088 12:35:35 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:12.346 12:35:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:33:12.605 [ 00:33:12.605 { 00:33:12.605 "name": "BaseBdev1", 00:33:12.605 "aliases": [ 00:33:12.605 "57e1ed4d-cb60-4d6a-86b9-38049a748415" 00:33:12.605 ], 00:33:12.605 "product_name": "Malloc disk", 00:33:12.605 "block_size": 512, 00:33:12.605 "num_blocks": 65536, 00:33:12.605 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:12.605 "assigned_rate_limits": { 00:33:12.605 "rw_ios_per_sec": 0, 00:33:12.605 "rw_mbytes_per_sec": 0, 00:33:12.605 "r_mbytes_per_sec": 0, 00:33:12.605 "w_mbytes_per_sec": 0 00:33:12.605 }, 00:33:12.605 "claimed": true, 00:33:12.605 "claim_type": "exclusive_write", 00:33:12.605 "zoned": false, 00:33:12.605 "supported_io_types": { 00:33:12.605 "read": true, 00:33:12.605 "write": true, 00:33:12.605 "unmap": true, 00:33:12.605 "write_zeroes": true, 00:33:12.605 "flush": true, 00:33:12.605 "reset": true, 00:33:12.605 "compare": false, 00:33:12.605 "compare_and_write": false, 00:33:12.605 "abort": true, 00:33:12.605 "nvme_admin": false, 00:33:12.605 "nvme_io": false 00:33:12.605 }, 00:33:12.605 "memory_domains": [ 00:33:12.605 { 00:33:12.605 "dma_device_id": "system", 00:33:12.605 "dma_device_type": 1 00:33:12.605 }, 00:33:12.605 { 00:33:12.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:12.605 "dma_device_type": 2 00:33:12.605 } 00:33:12.605 ], 00:33:12.605 "driver_specific": {} 00:33:12.605 } 00:33:12.605 ] 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:12.605 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:13.173 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:13.173 "name": "Existed_Raid", 00:33:13.173 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:13.173 "strip_size_kb": 0, 
00:33:13.173 "state": "configuring", 00:33:13.173 "raid_level": "raid1", 00:33:13.173 "superblock": true, 00:33:13.173 "num_base_bdevs": 4, 00:33:13.173 "num_base_bdevs_discovered": 3, 00:33:13.173 "num_base_bdevs_operational": 4, 00:33:13.173 "base_bdevs_list": [ 00:33:13.173 { 00:33:13.173 "name": "BaseBdev1", 00:33:13.173 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:13.173 "is_configured": true, 00:33:13.173 "data_offset": 2048, 00:33:13.173 "data_size": 63488 00:33:13.173 }, 00:33:13.173 { 00:33:13.173 "name": null, 00:33:13.173 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:13.173 "is_configured": false, 00:33:13.173 "data_offset": 2048, 00:33:13.173 "data_size": 63488 00:33:13.173 }, 00:33:13.173 { 00:33:13.173 "name": "BaseBdev3", 00:33:13.173 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:13.173 "is_configured": true, 00:33:13.173 "data_offset": 2048, 00:33:13.173 "data_size": 63488 00:33:13.173 }, 00:33:13.173 { 00:33:13.173 "name": "BaseBdev4", 00:33:13.173 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:13.173 "is_configured": true, 00:33:13.173 "data_offset": 2048, 00:33:13.173 "data_size": 63488 00:33:13.173 } 00:33:13.173 ] 00:33:13.173 }' 00:33:13.173 12:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:13.173 12:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:13.740 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:13.740 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:33:13.998 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:33:13.998 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:33:14.257 [2024-06-07 12:35:37.853638] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:14.257 12:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:14.257 12:35:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:14.824 12:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:14.824 "name": "Existed_Raid", 00:33:14.824 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:14.824 "strip_size_kb": 0, 00:33:14.824 "state": "configuring", 00:33:14.824 "raid_level": "raid1", 00:33:14.824 "superblock": true, 00:33:14.824 "num_base_bdevs": 4, 00:33:14.824 "num_base_bdevs_discovered": 2, 00:33:14.824 "num_base_bdevs_operational": 4, 00:33:14.824 "base_bdevs_list": [ 00:33:14.824 { 00:33:14.824 "name": "BaseBdev1", 00:33:14.824 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:14.824 "is_configured": true, 00:33:14.824 "data_offset": 2048, 00:33:14.824 "data_size": 63488 00:33:14.824 }, 00:33:14.824 { 00:33:14.824 "name": null, 00:33:14.824 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:14.824 "is_configured": false, 00:33:14.824 "data_offset": 2048, 00:33:14.824 "data_size": 63488 00:33:14.824 }, 00:33:14.824 { 00:33:14.824 "name": null, 00:33:14.824 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:14.824 "is_configured": false, 00:33:14.824 "data_offset": 2048, 00:33:14.824 "data_size": 63488 00:33:14.824 }, 00:33:14.824 { 00:33:14.824 "name": "BaseBdev4", 00:33:14.824 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:14.824 "is_configured": true, 00:33:14.824 "data_offset": 2048, 00:33:14.824 "data_size": 63488 00:33:14.824 } 00:33:14.824 ] 00:33:14.824 }' 00:33:14.824 12:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:14.824 12:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:15.391 12:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:15.391 12:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:33:15.650 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:33:15.650 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:33:15.970 [2024-06-07 12:35:39.397830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:15.970 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:16.230 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:16.230 "name": "Existed_Raid", 00:33:16.230 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:16.230 "strip_size_kb": 0, 00:33:16.230 "state": "configuring", 00:33:16.230 "raid_level": "raid1", 00:33:16.230 "superblock": true, 00:33:16.230 "num_base_bdevs": 4, 00:33:16.230 "num_base_bdevs_discovered": 3, 00:33:16.230 "num_base_bdevs_operational": 4, 00:33:16.230 "base_bdevs_list": [ 00:33:16.230 { 00:33:16.230 "name": "BaseBdev1", 00:33:16.230 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:16.230 "is_configured": true, 00:33:16.230 "data_offset": 2048, 00:33:16.230 "data_size": 63488 00:33:16.230 }, 00:33:16.230 { 00:33:16.230 "name": null, 00:33:16.230 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:16.230 "is_configured": false, 00:33:16.230 "data_offset": 2048, 00:33:16.230 "data_size": 63488 00:33:16.230 }, 00:33:16.230 { 00:33:16.230 "name": "BaseBdev3", 00:33:16.230 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:16.230 "is_configured": true, 00:33:16.230 "data_offset": 2048, 00:33:16.230 "data_size": 63488 00:33:16.230 }, 00:33:16.230 { 00:33:16.230 "name": "BaseBdev4", 00:33:16.230 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:16.230 "is_configured": true, 00:33:16.230 "data_offset": 2048, 00:33:16.230 "data_size": 63488 00:33:16.230 } 00:33:16.230 ] 00:33:16.230 }' 00:33:16.230 12:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:16.230 12:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:16.797 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:33:16.797 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:17.054 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:33:17.054 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:33:17.312 [2024-06-07 12:35:40.938043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:17.571 12:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:17.830 12:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:17.830 "name": "Existed_Raid", 00:33:17.830 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:17.830 "strip_size_kb": 0, 00:33:17.830 "state": "configuring", 00:33:17.830 "raid_level": "raid1", 00:33:17.830 "superblock": true, 00:33:17.830 "num_base_bdevs": 4, 00:33:17.830 "num_base_bdevs_discovered": 2, 00:33:17.830 "num_base_bdevs_operational": 4, 00:33:17.830 "base_bdevs_list": [ 00:33:17.830 { 00:33:17.830 "name": null, 00:33:17.830 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:17.830 "is_configured": false, 00:33:17.830 "data_offset": 2048, 00:33:17.830 "data_size": 63488 00:33:17.830 }, 00:33:17.830 { 00:33:17.830 "name": null, 00:33:17.830 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:17.830 "is_configured": false, 00:33:17.830 "data_offset": 2048, 00:33:17.830 "data_size": 63488 00:33:17.830 }, 00:33:17.830 { 00:33:17.830 "name": "BaseBdev3", 00:33:17.830 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:17.830 "is_configured": true, 00:33:17.830 "data_offset": 2048, 00:33:17.830 "data_size": 63488 00:33:17.830 }, 00:33:17.830 { 00:33:17.830 "name": "BaseBdev4", 00:33:17.830 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:17.830 "is_configured": true, 00:33:17.830 "data_offset": 2048, 00:33:17.830 "data_size": 63488 00:33:17.830 } 00:33:17.830 ] 00:33:17.830 }' 00:33:17.830 12:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:17.830 12:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:18.089 12:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:18.089 12:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:33:18.661 [2024-06-07 12:35:42.190706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:18.661 12:35:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:18.661 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:18.927 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:18.927 "name": "Existed_Raid", 00:33:18.927 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:18.927 "strip_size_kb": 0, 00:33:18.927 "state": "configuring", 00:33:18.927 "raid_level": "raid1", 00:33:18.927 "superblock": true, 00:33:18.927 "num_base_bdevs": 4, 00:33:18.927 "num_base_bdevs_discovered": 3, 00:33:18.927 "num_base_bdevs_operational": 4, 00:33:18.927 "base_bdevs_list": [ 00:33:18.927 { 00:33:18.927 "name": null, 00:33:18.927 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:18.927 "is_configured": false, 00:33:18.927 "data_offset": 2048, 00:33:18.927 "data_size": 63488 00:33:18.927 }, 00:33:18.927 { 00:33:18.927 "name": "BaseBdev2", 00:33:18.927 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:18.927 "is_configured": true, 00:33:18.927 "data_offset": 2048, 00:33:18.927 "data_size": 63488 00:33:18.927 }, 00:33:18.927 { 00:33:18.927 "name": "BaseBdev3", 00:33:18.927 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:18.927 "is_configured": true, 00:33:18.927 "data_offset": 2048, 00:33:18.927 "data_size": 63488 00:33:18.927 }, 00:33:18.927 { 00:33:18.927 "name": "BaseBdev4", 00:33:18.927 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:18.927 "is_configured": true, 00:33:18.927 "data_offset": 2048, 00:33:18.927 "data_size": 63488 00:33:18.927 } 00:33:18.927 ] 00:33:18.927 }' 00:33:18.927 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:18.927 12:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:19.491 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:33:19.491 12:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:19.748 12:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:33:19.748 12:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:19.748 12:35:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:33:20.005 12:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 57e1ed4d-cb60-4d6a-86b9-38049a748415 00:33:20.263 [2024-06-07 12:35:43.719753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:33:20.263 [2024-06-07 12:35:43.720162] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008180 00:33:20.263 [2024-06-07 12:35:43.720284] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:33:20.263 [2024-06-07 12:35:43.720381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:33:20.263 [2024-06-07 12:35:43.720674] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008180 00:33:20.263 [2024-06-07 12:35:43.720777] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000008180 00:33:20.263 [2024-06-07 12:35:43.720925] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:20.263 NewBaseBdev 00:33:20.263 12:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:33:20.263 12:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:33:20.263 12:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:33:20.263 12:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:33:20.263 12:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:33:20.263 12:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:33:20.263 12:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:20.520 12:35:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:33:20.520 [ 00:33:20.520 { 00:33:20.520 "name": "NewBaseBdev", 00:33:20.520 "aliases": [ 00:33:20.520 "57e1ed4d-cb60-4d6a-86b9-38049a748415" 00:33:20.520 ], 00:33:20.520 "product_name": "Malloc disk", 00:33:20.520 "block_size": 512, 00:33:20.520 "num_blocks": 65536, 00:33:20.520 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:20.520 "assigned_rate_limits": { 00:33:20.520 "rw_ios_per_sec": 0, 00:33:20.520 "rw_mbytes_per_sec": 0, 00:33:20.520 "r_mbytes_per_sec": 0, 00:33:20.520 "w_mbytes_per_sec": 0 00:33:20.520 }, 00:33:20.520 "claimed": true, 00:33:20.520 "claim_type": "exclusive_write", 00:33:20.520 "zoned": false, 00:33:20.520 "supported_io_types": { 00:33:20.520 "read": true, 00:33:20.520 "write": true, 00:33:20.520 "unmap": true, 00:33:20.520 "write_zeroes": true, 00:33:20.520 "flush": true, 00:33:20.520 "reset": true, 00:33:20.520 "compare": false, 00:33:20.520 "compare_and_write": false, 00:33:20.520 "abort": true, 00:33:20.520 "nvme_admin": false, 00:33:20.520 "nvme_io": false 00:33:20.520 }, 00:33:20.520 "memory_domains": [ 00:33:20.520 { 00:33:20.520 "dma_device_id": "system", 00:33:20.520 "dma_device_type": 1 00:33:20.520 }, 00:33:20.520 { 00:33:20.520 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:33:20.520 "dma_device_type": 2 00:33:20.520 } 00:33:20.520 ], 00:33:20.520 "driver_specific": {} 00:33:20.520 } 00:33:20.520 ] 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:20.778 "name": "Existed_Raid", 00:33:20.778 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:20.778 "strip_size_kb": 0, 00:33:20.778 "state": "online", 00:33:20.778 "raid_level": "raid1", 00:33:20.778 "superblock": true, 00:33:20.778 "num_base_bdevs": 4, 00:33:20.778 "num_base_bdevs_discovered": 4, 00:33:20.778 "num_base_bdevs_operational": 4, 00:33:20.778 "base_bdevs_list": [ 00:33:20.778 { 00:33:20.778 "name": "NewBaseBdev", 00:33:20.778 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:20.778 "is_configured": true, 00:33:20.778 "data_offset": 2048, 00:33:20.778 "data_size": 63488 00:33:20.778 }, 00:33:20.778 { 00:33:20.778 "name": "BaseBdev2", 00:33:20.778 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:20.778 "is_configured": true, 00:33:20.778 "data_offset": 2048, 00:33:20.778 "data_size": 63488 00:33:20.778 }, 00:33:20.778 { 00:33:20.778 "name": "BaseBdev3", 00:33:20.778 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:20.778 "is_configured": true, 00:33:20.778 "data_offset": 2048, 00:33:20.778 "data_size": 63488 00:33:20.778 }, 00:33:20.778 { 00:33:20.778 "name": "BaseBdev4", 00:33:20.778 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:20.778 "is_configured": true, 00:33:20.778 "data_offset": 2048, 00:33:20.778 "data_size": 63488 00:33:20.778 } 00:33:20.778 ] 00:33:20.778 }' 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:20.778 12:35:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:21.344 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:33:21.344 
12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:33:21.344 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:21.344 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:21.344 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:21.344 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:33:21.344 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:33:21.344 12:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:21.604 [2024-06-07 12:35:45.224176] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:21.604 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:21.604 "name": "Existed_Raid", 00:33:21.604 "aliases": [ 00:33:21.604 "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e" 00:33:21.604 ], 00:33:21.604 "product_name": "Raid Volume", 00:33:21.604 "block_size": 512, 00:33:21.604 "num_blocks": 63488, 00:33:21.604 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:21.604 "assigned_rate_limits": { 00:33:21.604 "rw_ios_per_sec": 0, 00:33:21.604 "rw_mbytes_per_sec": 0, 00:33:21.604 "r_mbytes_per_sec": 0, 00:33:21.604 "w_mbytes_per_sec": 0 00:33:21.604 }, 00:33:21.604 "claimed": false, 00:33:21.604 "zoned": false, 00:33:21.604 "supported_io_types": { 00:33:21.604 "read": true, 00:33:21.604 "write": true, 00:33:21.604 "unmap": false, 00:33:21.604 "write_zeroes": true, 00:33:21.604 "flush": false, 00:33:21.604 "reset": true, 00:33:21.604 "compare": false, 00:33:21.604 "compare_and_write": false, 00:33:21.604 "abort": false, 00:33:21.604 "nvme_admin": false, 00:33:21.604 "nvme_io": false 00:33:21.604 }, 00:33:21.604 "memory_domains": [ 00:33:21.604 { 00:33:21.604 "dma_device_id": "system", 00:33:21.604 "dma_device_type": 1 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:21.604 "dma_device_type": 2 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "dma_device_id": "system", 00:33:21.604 "dma_device_type": 1 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:21.604 "dma_device_type": 2 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "dma_device_id": "system", 00:33:21.604 "dma_device_type": 1 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:21.604 "dma_device_type": 2 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "dma_device_id": "system", 00:33:21.604 "dma_device_type": 1 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:21.604 "dma_device_type": 2 00:33:21.604 } 00:33:21.604 ], 00:33:21.604 "driver_specific": { 00:33:21.604 "raid": { 00:33:21.604 "uuid": "e7fa1c4f-99e5-4b2a-8331-7d6ca38e029e", 00:33:21.604 "strip_size_kb": 0, 00:33:21.604 "state": "online", 00:33:21.604 "raid_level": "raid1", 00:33:21.604 "superblock": true, 00:33:21.604 "num_base_bdevs": 4, 00:33:21.604 "num_base_bdevs_discovered": 4, 00:33:21.604 "num_base_bdevs_operational": 4, 00:33:21.604 "base_bdevs_list": [ 00:33:21.604 { 00:33:21.604 "name": "NewBaseBdev", 00:33:21.604 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:21.604 "is_configured": true, 00:33:21.604 "data_offset": 
2048, 00:33:21.604 "data_size": 63488 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "name": "BaseBdev2", 00:33:21.604 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:21.604 "is_configured": true, 00:33:21.604 "data_offset": 2048, 00:33:21.604 "data_size": 63488 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "name": "BaseBdev3", 00:33:21.604 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:21.604 "is_configured": true, 00:33:21.604 "data_offset": 2048, 00:33:21.604 "data_size": 63488 00:33:21.604 }, 00:33:21.604 { 00:33:21.604 "name": "BaseBdev4", 00:33:21.604 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:21.604 "is_configured": true, 00:33:21.604 "data_offset": 2048, 00:33:21.604 "data_size": 63488 00:33:21.604 } 00:33:21.604 ] 00:33:21.604 } 00:33:21.604 } 00:33:21.604 }' 00:33:21.863 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:21.863 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:33:21.863 BaseBdev2 00:33:21.863 BaseBdev3 00:33:21.863 BaseBdev4' 00:33:21.863 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:21.863 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:33:21.863 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:22.121 "name": "NewBaseBdev", 00:33:22.121 "aliases": [ 00:33:22.121 "57e1ed4d-cb60-4d6a-86b9-38049a748415" 00:33:22.121 ], 00:33:22.121 "product_name": "Malloc disk", 00:33:22.121 "block_size": 512, 00:33:22.121 "num_blocks": 65536, 00:33:22.121 "uuid": "57e1ed4d-cb60-4d6a-86b9-38049a748415", 00:33:22.121 "assigned_rate_limits": { 00:33:22.121 "rw_ios_per_sec": 0, 00:33:22.121 "rw_mbytes_per_sec": 0, 00:33:22.121 "r_mbytes_per_sec": 0, 00:33:22.121 "w_mbytes_per_sec": 0 00:33:22.121 }, 00:33:22.121 "claimed": true, 00:33:22.121 "claim_type": "exclusive_write", 00:33:22.121 "zoned": false, 00:33:22.121 "supported_io_types": { 00:33:22.121 "read": true, 00:33:22.121 "write": true, 00:33:22.121 "unmap": true, 00:33:22.121 "write_zeroes": true, 00:33:22.121 "flush": true, 00:33:22.121 "reset": true, 00:33:22.121 "compare": false, 00:33:22.121 "compare_and_write": false, 00:33:22.121 "abort": true, 00:33:22.121 "nvme_admin": false, 00:33:22.121 "nvme_io": false 00:33:22.121 }, 00:33:22.121 "memory_domains": [ 00:33:22.121 { 00:33:22.121 "dma_device_id": "system", 00:33:22.121 "dma_device_type": 1 00:33:22.121 }, 00:33:22.121 { 00:33:22.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:22.121 "dma_device_type": 2 00:33:22.121 } 00:33:22.121 ], 00:33:22.121 "driver_specific": {} 00:33:22.121 }' 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:22.121 12:35:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:22.121 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:22.379 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:22.379 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:22.379 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:22.379 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:22.379 12:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:33:22.638 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:22.638 "name": "BaseBdev2", 00:33:22.638 "aliases": [ 00:33:22.638 "e32e08ad-961b-49ed-845a-9e4a09513e14" 00:33:22.638 ], 00:33:22.638 "product_name": "Malloc disk", 00:33:22.638 "block_size": 512, 00:33:22.638 "num_blocks": 65536, 00:33:22.638 "uuid": "e32e08ad-961b-49ed-845a-9e4a09513e14", 00:33:22.638 "assigned_rate_limits": { 00:33:22.638 "rw_ios_per_sec": 0, 00:33:22.638 "rw_mbytes_per_sec": 0, 00:33:22.638 "r_mbytes_per_sec": 0, 00:33:22.638 "w_mbytes_per_sec": 0 00:33:22.638 }, 00:33:22.638 "claimed": true, 00:33:22.638 "claim_type": "exclusive_write", 00:33:22.638 "zoned": false, 00:33:22.638 "supported_io_types": { 00:33:22.638 "read": true, 00:33:22.638 "write": true, 00:33:22.638 "unmap": true, 00:33:22.638 "write_zeroes": true, 00:33:22.638 "flush": true, 00:33:22.638 "reset": true, 00:33:22.638 "compare": false, 00:33:22.638 "compare_and_write": false, 00:33:22.638 "abort": true, 00:33:22.638 "nvme_admin": false, 00:33:22.638 "nvme_io": false 00:33:22.638 }, 00:33:22.638 "memory_domains": [ 00:33:22.638 { 00:33:22.638 "dma_device_id": "system", 00:33:22.638 "dma_device_type": 1 00:33:22.638 }, 00:33:22.638 { 00:33:22.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:22.638 "dma_device_type": 2 00:33:22.638 } 00:33:22.638 ], 00:33:22.638 "driver_specific": {} 00:33:22.638 }' 00:33:22.638 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:22.638 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:22.638 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:22.638 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:22.638 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:22.897 12:35:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:33:22.897 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:23.155 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:23.155 "name": "BaseBdev3", 00:33:23.155 "aliases": [ 00:33:23.155 "75fcd703-e312-4b1b-9f90-60c051f9ed12" 00:33:23.155 ], 00:33:23.155 "product_name": "Malloc disk", 00:33:23.155 "block_size": 512, 00:33:23.155 "num_blocks": 65536, 00:33:23.155 "uuid": "75fcd703-e312-4b1b-9f90-60c051f9ed12", 00:33:23.155 "assigned_rate_limits": { 00:33:23.155 "rw_ios_per_sec": 0, 00:33:23.155 "rw_mbytes_per_sec": 0, 00:33:23.155 "r_mbytes_per_sec": 0, 00:33:23.155 "w_mbytes_per_sec": 0 00:33:23.155 }, 00:33:23.155 "claimed": true, 00:33:23.155 "claim_type": "exclusive_write", 00:33:23.155 "zoned": false, 00:33:23.155 "supported_io_types": { 00:33:23.155 "read": true, 00:33:23.155 "write": true, 00:33:23.155 "unmap": true, 00:33:23.155 "write_zeroes": true, 00:33:23.155 "flush": true, 00:33:23.155 "reset": true, 00:33:23.155 "compare": false, 00:33:23.155 "compare_and_write": false, 00:33:23.155 "abort": true, 00:33:23.155 "nvme_admin": false, 00:33:23.155 "nvme_io": false 00:33:23.155 }, 00:33:23.155 "memory_domains": [ 00:33:23.155 { 00:33:23.155 "dma_device_id": "system", 00:33:23.155 "dma_device_type": 1 00:33:23.155 }, 00:33:23.155 { 00:33:23.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:23.155 "dma_device_type": 2 00:33:23.155 } 00:33:23.155 ], 00:33:23.155 "driver_specific": {} 00:33:23.155 }' 00:33:23.155 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:23.155 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:23.155 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:23.155 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:23.155 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:23.413 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:23.413 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:23.413 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:23.413 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:23.413 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:23.413 12:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:23.413 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:23.413 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:23.413 12:35:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:33:23.413 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:23.671 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:23.671 "name": "BaseBdev4", 00:33:23.671 "aliases": [ 00:33:23.671 "4944b0af-eefc-4169-9f0b-7e9735009f7e" 00:33:23.671 ], 00:33:23.671 "product_name": "Malloc disk", 00:33:23.671 "block_size": 512, 00:33:23.671 "num_blocks": 65536, 00:33:23.671 "uuid": "4944b0af-eefc-4169-9f0b-7e9735009f7e", 00:33:23.671 "assigned_rate_limits": { 00:33:23.671 "rw_ios_per_sec": 0, 00:33:23.671 "rw_mbytes_per_sec": 0, 00:33:23.671 "r_mbytes_per_sec": 0, 00:33:23.671 "w_mbytes_per_sec": 0 00:33:23.671 }, 00:33:23.671 "claimed": true, 00:33:23.671 "claim_type": "exclusive_write", 00:33:23.671 "zoned": false, 00:33:23.671 "supported_io_types": { 00:33:23.671 "read": true, 00:33:23.671 "write": true, 00:33:23.671 "unmap": true, 00:33:23.671 "write_zeroes": true, 00:33:23.671 "flush": true, 00:33:23.671 "reset": true, 00:33:23.671 "compare": false, 00:33:23.671 "compare_and_write": false, 00:33:23.671 "abort": true, 00:33:23.671 "nvme_admin": false, 00:33:23.671 "nvme_io": false 00:33:23.671 }, 00:33:23.671 "memory_domains": [ 00:33:23.671 { 00:33:23.671 "dma_device_id": "system", 00:33:23.671 "dma_device_type": 1 00:33:23.671 }, 00:33:23.671 { 00:33:23.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:23.671 "dma_device_type": 2 00:33:23.671 } 00:33:23.671 ], 00:33:23.671 "driver_specific": {} 00:33:23.671 }' 00:33:23.672 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:23.930 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:24.189 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:24.189 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:24.189 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:24.448 [2024-06-07 12:35:47.920349] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:24.448 [2024-06-07 12:35:47.920625] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:24.448 [2024-06-07 12:35:47.920798] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:24.449 [2024-06-07 
12:35:47.921116] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:24.449 [2024-06-07 12:35:47.921214] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008180 name Existed_Raid, state offline 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 218911 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 218911 ']' 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 218911 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 218911 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 218911' 00:33:24.449 killing process with pid 218911 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 218911 00:33:24.449 [2024-06-07 12:35:47.978802] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:24.449 12:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 218911 00:33:24.449 [2024-06-07 12:35:48.059279] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:25.015 12:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:33:25.015 00:33:25.015 real 0m35.409s 00:33:25.015 user 1m5.299s 00:33:25.015 sys 0m5.789s 00:33:25.015 12:35:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:25.015 12:35:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:33:25.015 ************************************ 00:33:25.015 END TEST raid_state_function_test_sb 00:33:25.015 ************************************ 00:33:25.015 12:35:48 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:33:25.015 12:35:48 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:33:25.015 12:35:48 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:25.015 12:35:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:33:25.015 ************************************ 00:33:25.015 START TEST raid_superblock_test 00:33:25.015 ************************************ 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 4 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # 
base_bdevs_pt=() 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=220008 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 220008 /var/tmp/spdk-raid.sock 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 220008 ']' 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:25.015 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:25.015 12:35:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:25.015 [2024-06-07 12:35:48.518717] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
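Both tests in this trace drive SPDK entirely through scripts/rpc.py against the application socket at /var/tmp/spdk-raid.sock: malloc bdevs are created, wrapped in passthru bdevs with pinned UUIDs, assembled into a superblock-enabled raid1 volume, and inspected with bdev_raid_get_bdevs. A condensed, hand-runnable sketch of that flow, using only RPC forms that appear verbatim in the surrounding records; it assumes a bdev_svc app is already listening on the socket (as launched just above with -L bdev_raid), and the pt1..pt4 / raid_bdev1 names follow the variables raid_superblock_test sets up rather than a create call shown in this excerpt:

    #!/usr/bin/env bash
    # Sketch only: assumes an SPDK app (e.g. test/app/bdev_svc/bdev_svc) is
    # already listening on the RPC socket below, as in the trace above.
    RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # 32 MiB malloc bdev with 512-byte blocks -> 65536 blocks, matching the
    # "num_blocks": 65536 that bdev_get_bdevs reports earlier in the trace.
    $RPC bdev_malloc_create 32 512 -b malloc1

    # Wrap it in a passthru bdev with a pinned UUID (same call as the trace);
    # repeated for malloc2/pt2 and so on, since this test uses 4 base bdevs.
    $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001

    # Assemble a superblock-enabled (-s) raid1 volume over the base bdevs.
    $RPC bdev_raid_create -s -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1

    # Dump raid bdev state: "state" stays "configuring" until every base
    # bdev is claimed, then flips to "online".
    $RPC bdev_raid_get_bdevs all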
00:33:25.015 [2024-06-07 12:35:48.519427] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid220008 ] 00:33:25.273 [2024-06-07 12:35:48.683374] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:25.273 [2024-06-07 12:35:48.777182] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:25.273 [2024-06-07 12:35:48.861064] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:33:26.208 malloc1 00:33:26.208 12:35:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:33:26.467 [2024-06-07 12:35:50.068629] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:33:26.467 [2024-06-07 12:35:50.068984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:26.467 [2024-06-07 12:35:50.069182] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:33:26.467 [2024-06-07 12:35:50.069401] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:26.467 [2024-06-07 12:35:50.072216] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:26.467 [2024-06-07 12:35:50.072444] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:33:26.467 pt1 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:26.467 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:33:26.725 malloc2 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:26.983 [2024-06-07 12:35:50.572197] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:26.983 [2024-06-07 12:35:50.572536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:26.983 [2024-06-07 12:35:50.572621] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:33:26.983 [2024-06-07 12:35:50.572851] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:26.983 [2024-06-07 12:35:50.575122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:26.983 [2024-06-07 12:35:50.575304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:26.983 pt2 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:26.983 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:33:27.241 malloc3 00:33:27.241 12:35:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:33:27.499 [2024-06-07 12:35:51.110208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:33:27.499 [2024-06-07 12:35:51.110734] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:27.499 [2024-06-07 12:35:51.111207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:33:27.499 [2024-06-07 12:35:51.111742] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:27.499 [2024-06-07 12:35:51.117065] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:27.499 [2024-06-07 12:35:51.117529] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:33:27.499 pt3 00:33:27.499 12:35:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:27.499 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:33:27.757 malloc4 00:33:27.757 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:33:28.015 [2024-06-07 12:35:51.553034] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:33:28.015 [2024-06-07 12:35:51.553415] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:28.015 [2024-06-07 12:35:51.553641] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007e80 00:33:28.015 [2024-06-07 12:35:51.553813] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:28.015 [2024-06-07 12:35:51.556373] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:28.015 [2024-06-07 12:35:51.556572] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:33:28.015 pt4 00:33:28.015 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:28.015 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:28.015 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:33:28.308 [2024-06-07 12:35:51.761370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:33:28.308 [2024-06-07 12:35:51.763461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:28.308 [2024-06-07 12:35:51.763647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:33:28.308 [2024-06-07 12:35:51.763732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:33:28.308 [2024-06-07 12:35:51.764002] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008480 00:33:28.308 [2024-06-07 12:35:51.764108] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:33:28.308 [2024-06-07 12:35:51.764351] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:33:28.308 [2024-06-07 12:35:51.764795] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008480 00:33:28.308 [2024-06-07 12:35:51.764904] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x616000008480 00:33:28.308 [2024-06-07 12:35:51.765121] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:28.308 12:35:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:28.566 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:28.566 "name": "raid_bdev1", 00:33:28.566 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:28.566 "strip_size_kb": 0, 00:33:28.566 "state": "online", 00:33:28.566 "raid_level": "raid1", 00:33:28.566 "superblock": true, 00:33:28.566 "num_base_bdevs": 4, 00:33:28.566 "num_base_bdevs_discovered": 4, 00:33:28.566 "num_base_bdevs_operational": 4, 00:33:28.566 "base_bdevs_list": [ 00:33:28.566 { 00:33:28.566 "name": "pt1", 00:33:28.566 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:28.566 "is_configured": true, 00:33:28.566 "data_offset": 2048, 00:33:28.566 "data_size": 63488 00:33:28.566 }, 00:33:28.566 { 00:33:28.566 "name": "pt2", 00:33:28.566 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:28.566 "is_configured": true, 00:33:28.566 "data_offset": 2048, 00:33:28.566 "data_size": 63488 00:33:28.566 }, 00:33:28.566 { 00:33:28.566 "name": "pt3", 00:33:28.566 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:28.566 "is_configured": true, 00:33:28.566 "data_offset": 2048, 00:33:28.566 "data_size": 63488 00:33:28.566 }, 00:33:28.566 { 00:33:28.566 "name": "pt4", 00:33:28.566 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:28.566 "is_configured": true, 00:33:28.566 "data_offset": 2048, 00:33:28.566 "data_size": 63488 00:33:28.566 } 00:33:28.566 ] 00:33:28.566 }' 00:33:28.566 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:28.566 12:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:29.132 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:33:29.132 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:33:29.132 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:29.132 12:35:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:29.132 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:29.132 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:33:29.132 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:29.132 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:29.390 [2024-06-07 12:35:52.801637] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:29.390 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:29.390 "name": "raid_bdev1", 00:33:29.390 "aliases": [ 00:33:29.390 "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d" 00:33:29.390 ], 00:33:29.390 "product_name": "Raid Volume", 00:33:29.391 "block_size": 512, 00:33:29.391 "num_blocks": 63488, 00:33:29.391 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:29.391 "assigned_rate_limits": { 00:33:29.391 "rw_ios_per_sec": 0, 00:33:29.391 "rw_mbytes_per_sec": 0, 00:33:29.391 "r_mbytes_per_sec": 0, 00:33:29.391 "w_mbytes_per_sec": 0 00:33:29.391 }, 00:33:29.391 "claimed": false, 00:33:29.391 "zoned": false, 00:33:29.391 "supported_io_types": { 00:33:29.391 "read": true, 00:33:29.391 "write": true, 00:33:29.391 "unmap": false, 00:33:29.391 "write_zeroes": true, 00:33:29.391 "flush": false, 00:33:29.391 "reset": true, 00:33:29.391 "compare": false, 00:33:29.391 "compare_and_write": false, 00:33:29.391 "abort": false, 00:33:29.391 "nvme_admin": false, 00:33:29.391 "nvme_io": false 00:33:29.391 }, 00:33:29.391 "memory_domains": [ 00:33:29.391 { 00:33:29.391 "dma_device_id": "system", 00:33:29.391 "dma_device_type": 1 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:29.391 "dma_device_type": 2 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "dma_device_id": "system", 00:33:29.391 "dma_device_type": 1 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:29.391 "dma_device_type": 2 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "dma_device_id": "system", 00:33:29.391 "dma_device_type": 1 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:29.391 "dma_device_type": 2 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "dma_device_id": "system", 00:33:29.391 "dma_device_type": 1 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:29.391 "dma_device_type": 2 00:33:29.391 } 00:33:29.391 ], 00:33:29.391 "driver_specific": { 00:33:29.391 "raid": { 00:33:29.391 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:29.391 "strip_size_kb": 0, 00:33:29.391 "state": "online", 00:33:29.391 "raid_level": "raid1", 00:33:29.391 "superblock": true, 00:33:29.391 "num_base_bdevs": 4, 00:33:29.391 "num_base_bdevs_discovered": 4, 00:33:29.391 "num_base_bdevs_operational": 4, 00:33:29.391 "base_bdevs_list": [ 00:33:29.391 { 00:33:29.391 "name": "pt1", 00:33:29.391 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:29.391 "is_configured": true, 00:33:29.391 "data_offset": 2048, 00:33:29.391 "data_size": 63488 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "name": "pt2", 00:33:29.391 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:29.391 "is_configured": true, 00:33:29.391 "data_offset": 2048, 00:33:29.391 "data_size": 63488 00:33:29.391 }, 00:33:29.391 { 
00:33:29.391 "name": "pt3", 00:33:29.391 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:29.391 "is_configured": true, 00:33:29.391 "data_offset": 2048, 00:33:29.391 "data_size": 63488 00:33:29.391 }, 00:33:29.391 { 00:33:29.391 "name": "pt4", 00:33:29.391 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:29.391 "is_configured": true, 00:33:29.391 "data_offset": 2048, 00:33:29.391 "data_size": 63488 00:33:29.391 } 00:33:29.391 ] 00:33:29.391 } 00:33:29.391 } 00:33:29.391 }' 00:33:29.391 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:29.391 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:33:29.391 pt2 00:33:29.391 pt3 00:33:29.391 pt4' 00:33:29.391 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:29.391 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:33:29.391 12:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:29.648 "name": "pt1", 00:33:29.648 "aliases": [ 00:33:29.648 "00000000-0000-0000-0000-000000000001" 00:33:29.648 ], 00:33:29.648 "product_name": "passthru", 00:33:29.648 "block_size": 512, 00:33:29.648 "num_blocks": 65536, 00:33:29.648 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:29.648 "assigned_rate_limits": { 00:33:29.648 "rw_ios_per_sec": 0, 00:33:29.648 "rw_mbytes_per_sec": 0, 00:33:29.648 "r_mbytes_per_sec": 0, 00:33:29.648 "w_mbytes_per_sec": 0 00:33:29.648 }, 00:33:29.648 "claimed": true, 00:33:29.648 "claim_type": "exclusive_write", 00:33:29.648 "zoned": false, 00:33:29.648 "supported_io_types": { 00:33:29.648 "read": true, 00:33:29.648 "write": true, 00:33:29.648 "unmap": true, 00:33:29.648 "write_zeroes": true, 00:33:29.648 "flush": true, 00:33:29.648 "reset": true, 00:33:29.648 "compare": false, 00:33:29.648 "compare_and_write": false, 00:33:29.648 "abort": true, 00:33:29.648 "nvme_admin": false, 00:33:29.648 "nvme_io": false 00:33:29.648 }, 00:33:29.648 "memory_domains": [ 00:33:29.648 { 00:33:29.648 "dma_device_id": "system", 00:33:29.648 "dma_device_type": 1 00:33:29.648 }, 00:33:29.648 { 00:33:29.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:29.648 "dma_device_type": 2 00:33:29.648 } 00:33:29.648 ], 00:33:29.648 "driver_specific": { 00:33:29.648 "passthru": { 00:33:29.648 "name": "pt1", 00:33:29.648 "base_bdev_name": "malloc1" 00:33:29.648 } 00:33:29.648 } 00:33:29.648 }' 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:29.648 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:29.906 12:35:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:29.906 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:29.906 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:29.906 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:29.906 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:29.906 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:33:29.906 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:30.164 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:30.164 "name": "pt2", 00:33:30.164 "aliases": [ 00:33:30.164 "00000000-0000-0000-0000-000000000002" 00:33:30.164 ], 00:33:30.164 "product_name": "passthru", 00:33:30.164 "block_size": 512, 00:33:30.164 "num_blocks": 65536, 00:33:30.164 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:30.164 "assigned_rate_limits": { 00:33:30.164 "rw_ios_per_sec": 0, 00:33:30.164 "rw_mbytes_per_sec": 0, 00:33:30.164 "r_mbytes_per_sec": 0, 00:33:30.164 "w_mbytes_per_sec": 0 00:33:30.164 }, 00:33:30.164 "claimed": true, 00:33:30.164 "claim_type": "exclusive_write", 00:33:30.164 "zoned": false, 00:33:30.164 "supported_io_types": { 00:33:30.164 "read": true, 00:33:30.164 "write": true, 00:33:30.164 "unmap": true, 00:33:30.164 "write_zeroes": true, 00:33:30.164 "flush": true, 00:33:30.164 "reset": true, 00:33:30.164 "compare": false, 00:33:30.164 "compare_and_write": false, 00:33:30.164 "abort": true, 00:33:30.164 "nvme_admin": false, 00:33:30.164 "nvme_io": false 00:33:30.164 }, 00:33:30.164 "memory_domains": [ 00:33:30.164 { 00:33:30.164 "dma_device_id": "system", 00:33:30.164 "dma_device_type": 1 00:33:30.164 }, 00:33:30.164 { 00:33:30.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:30.164 "dma_device_type": 2 00:33:30.164 } 00:33:30.164 ], 00:33:30.164 "driver_specific": { 00:33:30.164 "passthru": { 00:33:30.164 "name": "pt2", 00:33:30.164 "base_bdev_name": "malloc2" 00:33:30.164 } 00:33:30.164 } 00:33:30.164 }' 00:33:30.164 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:30.164 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:30.164 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:30.164 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:30.164 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:30.422 12:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:33:30.680 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:30.680 "name": "pt3", 00:33:30.680 "aliases": [ 00:33:30.680 "00000000-0000-0000-0000-000000000003" 00:33:30.680 ], 00:33:30.680 "product_name": "passthru", 00:33:30.680 "block_size": 512, 00:33:30.680 "num_blocks": 65536, 00:33:30.680 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:30.680 "assigned_rate_limits": { 00:33:30.680 "rw_ios_per_sec": 0, 00:33:30.680 "rw_mbytes_per_sec": 0, 00:33:30.680 "r_mbytes_per_sec": 0, 00:33:30.680 "w_mbytes_per_sec": 0 00:33:30.680 }, 00:33:30.680 "claimed": true, 00:33:30.680 "claim_type": "exclusive_write", 00:33:30.680 "zoned": false, 00:33:30.680 "supported_io_types": { 00:33:30.680 "read": true, 00:33:30.680 "write": true, 00:33:30.680 "unmap": true, 00:33:30.680 "write_zeroes": true, 00:33:30.680 "flush": true, 00:33:30.680 "reset": true, 00:33:30.680 "compare": false, 00:33:30.680 "compare_and_write": false, 00:33:30.680 "abort": true, 00:33:30.680 "nvme_admin": false, 00:33:30.680 "nvme_io": false 00:33:30.680 }, 00:33:30.680 "memory_domains": [ 00:33:30.680 { 00:33:30.680 "dma_device_id": "system", 00:33:30.680 "dma_device_type": 1 00:33:30.680 }, 00:33:30.680 { 00:33:30.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:30.680 "dma_device_type": 2 00:33:30.680 } 00:33:30.680 ], 00:33:30.680 "driver_specific": { 00:33:30.680 "passthru": { 00:33:30.680 "name": "pt3", 00:33:30.680 "base_bdev_name": "malloc3" 00:33:30.680 } 00:33:30.680 } 00:33:30.680 }' 00:33:30.680 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:30.680 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:30.681 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:30.681 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:30.681 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:33:30.939 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:31.196 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:31.196 "name": "pt4", 00:33:31.196 "aliases": [ 
00:33:31.196 "00000000-0000-0000-0000-000000000004" 00:33:31.196 ], 00:33:31.196 "product_name": "passthru", 00:33:31.196 "block_size": 512, 00:33:31.196 "num_blocks": 65536, 00:33:31.196 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:31.196 "assigned_rate_limits": { 00:33:31.196 "rw_ios_per_sec": 0, 00:33:31.196 "rw_mbytes_per_sec": 0, 00:33:31.196 "r_mbytes_per_sec": 0, 00:33:31.196 "w_mbytes_per_sec": 0 00:33:31.196 }, 00:33:31.196 "claimed": true, 00:33:31.196 "claim_type": "exclusive_write", 00:33:31.196 "zoned": false, 00:33:31.196 "supported_io_types": { 00:33:31.196 "read": true, 00:33:31.196 "write": true, 00:33:31.196 "unmap": true, 00:33:31.196 "write_zeroes": true, 00:33:31.196 "flush": true, 00:33:31.196 "reset": true, 00:33:31.196 "compare": false, 00:33:31.196 "compare_and_write": false, 00:33:31.196 "abort": true, 00:33:31.196 "nvme_admin": false, 00:33:31.196 "nvme_io": false 00:33:31.196 }, 00:33:31.196 "memory_domains": [ 00:33:31.196 { 00:33:31.196 "dma_device_id": "system", 00:33:31.196 "dma_device_type": 1 00:33:31.196 }, 00:33:31.197 { 00:33:31.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:31.197 "dma_device_type": 2 00:33:31.197 } 00:33:31.197 ], 00:33:31.197 "driver_specific": { 00:33:31.197 "passthru": { 00:33:31.197 "name": "pt4", 00:33:31.197 "base_bdev_name": "malloc4" 00:33:31.197 } 00:33:31.197 } 00:33:31.197 }' 00:33:31.197 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:31.197 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:31.197 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:31.197 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:31.454 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:31.454 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:31.454 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:31.454 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:31.454 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:31.454 12:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:31.454 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:31.454 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:31.454 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:31.454 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:33:31.713 [2024-06-07 12:35:55.245905] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:31.713 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0a569c85-d4ff-43c1-a046-ff2ee94b5c9d 00:33:31.713 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0a569c85-d4ff-43c1-a046-ff2ee94b5c9d ']' 00:33:31.713 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:31.972 [2024-06-07 12:35:55.541733] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:31.972 
[2024-06-07 12:35:55.542003] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:31.972 [2024-06-07 12:35:55.542211] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:31.972 [2024-06-07 12:35:55.542396] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:31.972 [2024-06-07 12:35:55.542492] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008480 name raid_bdev1, state offline 00:33:31.972 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:31.972 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:33:32.230 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:33:32.230 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:33:32.231 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:32.231 12:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:33:32.489 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:32.489 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:33:32.748 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:32.748 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:33:33.006 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:33.006 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:33:33.264 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:33:33.264 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:33:33.523 12:35:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:33:33.523 [2024-06-07 12:35:57.141897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:33:33.523 [2024-06-07 12:35:57.144069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:33:33.523 [2024-06-07 12:35:57.144268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:33:33.523 [2024-06-07 12:35:57.144334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:33:33.523 [2024-06-07 12:35:57.144474] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:33:33.523 [2024-06-07 12:35:57.144659] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:33:33.523 [2024-06-07 12:35:57.144754] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:33:33.523 [2024-06-07 12:35:57.144929] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:33:33.523 [2024-06-07 12:35:57.145093] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:33.523 [2024-06-07 12:35:57.145132] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state configuring 00:33:33.523 request: 00:33:33.523 { 00:33:33.523 "name": "raid_bdev1", 00:33:33.523 "raid_level": "raid1", 00:33:33.523 "base_bdevs": [ 00:33:33.523 "malloc1", 00:33:33.523 "malloc2", 00:33:33.523 "malloc3", 00:33:33.523 "malloc4" 00:33:33.523 ], 00:33:33.523 "superblock": false, 00:33:33.523 "method": "bdev_raid_create", 00:33:33.523 "req_id": 1 00:33:33.523 } 00:33:33.523 Got JSON-RPC error response 00:33:33.523 response: 00:33:33.523 { 00:33:33.523 "code": -17, 00:33:33.523 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:33:33.523 } 00:33:33.523 12:35:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:33:33.523 12:35:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:33:33.523 12:35:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:33:33.523 12:35:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:33:33.782 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:33:33.782 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:33.782 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:33:33.782 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:33:33.783 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:33:34.040 [2024-06-07 12:35:57.569922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:33:34.040 [2024-06-07 12:35:57.570276] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:34.040 [2024-06-07 12:35:57.570454] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:33:34.040 [2024-06-07 12:35:57.570585] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:34.040 [2024-06-07 12:35:57.572837] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:34.040 [2024-06-07 12:35:57.573024] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:33:34.040 [2024-06-07 12:35:57.573265] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:33:34.040 [2024-06-07 12:35:57.573403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:33:34.040 pt1 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:34.040 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:34.298 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:34.298 "name": "raid_bdev1", 00:33:34.298 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:34.298 "strip_size_kb": 0, 00:33:34.298 "state": "configuring", 00:33:34.298 "raid_level": "raid1", 00:33:34.298 "superblock": true, 00:33:34.298 "num_base_bdevs": 4, 00:33:34.298 "num_base_bdevs_discovered": 1, 00:33:34.298 "num_base_bdevs_operational": 4, 00:33:34.298 "base_bdevs_list": [ 00:33:34.298 { 00:33:34.298 "name": "pt1", 00:33:34.298 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:34.298 "is_configured": 
true, 00:33:34.298 "data_offset": 2048, 00:33:34.298 "data_size": 63488 00:33:34.298 }, 00:33:34.298 { 00:33:34.298 "name": null, 00:33:34.298 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:34.298 "is_configured": false, 00:33:34.298 "data_offset": 2048, 00:33:34.298 "data_size": 63488 00:33:34.298 }, 00:33:34.298 { 00:33:34.298 "name": null, 00:33:34.298 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:34.298 "is_configured": false, 00:33:34.298 "data_offset": 2048, 00:33:34.298 "data_size": 63488 00:33:34.298 }, 00:33:34.298 { 00:33:34.298 "name": null, 00:33:34.298 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:34.298 "is_configured": false, 00:33:34.298 "data_offset": 2048, 00:33:34.298 "data_size": 63488 00:33:34.298 } 00:33:34.298 ] 00:33:34.298 }' 00:33:34.298 12:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:34.298 12:35:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:34.865 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:33:34.865 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:35.124 [2024-06-07 12:35:58.594060] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:35.124 [2024-06-07 12:35:58.594413] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:35.124 [2024-06-07 12:35:58.594560] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009980 00:33:35.124 [2024-06-07 12:35:58.594674] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:35.124 [2024-06-07 12:35:58.595103] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:35.124 [2024-06-07 12:35:58.595267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:35.124 [2024-06-07 12:35:58.595437] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:33:35.124 [2024-06-07 12:35:58.595558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:35.124 pt2 00:33:35.124 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:33:35.382 [2024-06-07 12:35:58.802095] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:35.382 12:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:35.711 12:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:35.711 "name": "raid_bdev1", 00:33:35.711 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:35.711 "strip_size_kb": 0, 00:33:35.711 "state": "configuring", 00:33:35.711 "raid_level": "raid1", 00:33:35.711 "superblock": true, 00:33:35.711 "num_base_bdevs": 4, 00:33:35.711 "num_base_bdevs_discovered": 1, 00:33:35.711 "num_base_bdevs_operational": 4, 00:33:35.711 "base_bdevs_list": [ 00:33:35.711 { 00:33:35.711 "name": "pt1", 00:33:35.711 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:35.711 "is_configured": true, 00:33:35.711 "data_offset": 2048, 00:33:35.711 "data_size": 63488 00:33:35.711 }, 00:33:35.711 { 00:33:35.711 "name": null, 00:33:35.711 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:35.711 "is_configured": false, 00:33:35.711 "data_offset": 2048, 00:33:35.711 "data_size": 63488 00:33:35.711 }, 00:33:35.711 { 00:33:35.711 "name": null, 00:33:35.711 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:35.711 "is_configured": false, 00:33:35.711 "data_offset": 2048, 00:33:35.711 "data_size": 63488 00:33:35.711 }, 00:33:35.711 { 00:33:35.711 "name": null, 00:33:35.711 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:35.711 "is_configured": false, 00:33:35.711 "data_offset": 2048, 00:33:35.711 "data_size": 63488 00:33:35.711 } 00:33:35.711 ] 00:33:35.711 }' 00:33:35.711 12:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:35.711 12:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:36.277 12:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:33:36.277 12:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:36.277 12:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:36.277 [2024-06-07 12:35:59.828694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:36.277 [2024-06-07 12:35:59.829063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:36.277 [2024-06-07 12:35:59.829145] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009c80 00:33:36.277 [2024-06-07 12:35:59.829262] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:36.277 [2024-06-07 12:35:59.829715] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:36.277 [2024-06-07 12:35:59.829882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:36.277 [2024-06-07 12:35:59.830102] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:33:36.277 [2024-06-07 12:35:59.830218] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:36.277 pt2 00:33:36.277 12:35:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:33:36.277 12:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:36.277 12:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:33:36.536 [2024-06-07 12:36:00.116780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:33:36.536 [2024-06-07 12:36:00.117131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:36.536 [2024-06-07 12:36:00.117205] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009f80 00:33:36.536 [2024-06-07 12:36:00.117348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:36.536 [2024-06-07 12:36:00.117761] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:36.536 [2024-06-07 12:36:00.117941] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:33:36.536 [2024-06-07 12:36:00.118141] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:33:36.536 [2024-06-07 12:36:00.118271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:33:36.536 pt3 00:33:36.536 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:33:36.536 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:36.536 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:33:36.795 [2024-06-07 12:36:00.340776] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:33:36.795 [2024-06-07 12:36:00.341099] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:36.795 [2024-06-07 12:36:00.341167] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a280 00:33:36.795 [2024-06-07 12:36:00.341280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:36.795 [2024-06-07 12:36:00.341681] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:36.795 [2024-06-07 12:36:00.341839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:33:36.795 [2024-06-07 12:36:00.341993] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:33:36.795 [2024-06-07 12:36:00.342117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:33:36.795 [2024-06-07 12:36:00.342267] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:33:36.795 [2024-06-07 12:36:00.342340] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:33:36.795 [2024-06-07 12:36:00.342438] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002ae0 00:33:36.795 [2024-06-07 12:36:00.342789] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:33:36.795 [2024-06-07 12:36:00.342884] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:33:36.795 [2024-06-07 12:36:00.343018] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:33:36.795 pt4 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:36.795 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:36.796 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:36.796 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:36.796 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:36.796 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:37.054 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:37.054 "name": "raid_bdev1", 00:33:37.054 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:37.054 "strip_size_kb": 0, 00:33:37.054 "state": "online", 00:33:37.054 "raid_level": "raid1", 00:33:37.054 "superblock": true, 00:33:37.054 "num_base_bdevs": 4, 00:33:37.054 "num_base_bdevs_discovered": 4, 00:33:37.054 "num_base_bdevs_operational": 4, 00:33:37.054 "base_bdevs_list": [ 00:33:37.054 { 00:33:37.054 "name": "pt1", 00:33:37.054 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:37.054 "is_configured": true, 00:33:37.054 "data_offset": 2048, 00:33:37.054 "data_size": 63488 00:33:37.054 }, 00:33:37.054 { 00:33:37.054 "name": "pt2", 00:33:37.054 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:37.054 "is_configured": true, 00:33:37.054 "data_offset": 2048, 00:33:37.054 "data_size": 63488 00:33:37.054 }, 00:33:37.054 { 00:33:37.054 "name": "pt3", 00:33:37.054 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:37.054 "is_configured": true, 00:33:37.054 "data_offset": 2048, 00:33:37.054 "data_size": 63488 00:33:37.054 }, 00:33:37.054 { 00:33:37.054 "name": "pt4", 00:33:37.054 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:37.055 "is_configured": true, 00:33:37.055 "data_offset": 2048, 00:33:37.055 "data_size": 63488 00:33:37.055 } 00:33:37.055 ] 00:33:37.055 }' 00:33:37.055 12:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:37.055 12:36:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:37.620 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:37.878 [2024-06-07 12:36:01.285109] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:37.878 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:37.878 "name": "raid_bdev1", 00:33:37.878 "aliases": [ 00:33:37.878 "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d" 00:33:37.878 ], 00:33:37.878 "product_name": "Raid Volume", 00:33:37.878 "block_size": 512, 00:33:37.878 "num_blocks": 63488, 00:33:37.878 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:37.878 "assigned_rate_limits": { 00:33:37.878 "rw_ios_per_sec": 0, 00:33:37.878 "rw_mbytes_per_sec": 0, 00:33:37.878 "r_mbytes_per_sec": 0, 00:33:37.878 "w_mbytes_per_sec": 0 00:33:37.878 }, 00:33:37.878 "claimed": false, 00:33:37.878 "zoned": false, 00:33:37.878 "supported_io_types": { 00:33:37.878 "read": true, 00:33:37.878 "write": true, 00:33:37.878 "unmap": false, 00:33:37.878 "write_zeroes": true, 00:33:37.878 "flush": false, 00:33:37.878 "reset": true, 00:33:37.878 "compare": false, 00:33:37.878 "compare_and_write": false, 00:33:37.878 "abort": false, 00:33:37.878 "nvme_admin": false, 00:33:37.878 "nvme_io": false 00:33:37.878 }, 00:33:37.878 "memory_domains": [ 00:33:37.878 { 00:33:37.878 "dma_device_id": "system", 00:33:37.878 "dma_device_type": 1 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:37.878 "dma_device_type": 2 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "dma_device_id": "system", 00:33:37.878 "dma_device_type": 1 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:37.878 "dma_device_type": 2 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "dma_device_id": "system", 00:33:37.878 "dma_device_type": 1 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:37.878 "dma_device_type": 2 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "dma_device_id": "system", 00:33:37.878 "dma_device_type": 1 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:37.878 "dma_device_type": 2 00:33:37.878 } 00:33:37.878 ], 00:33:37.878 "driver_specific": { 00:33:37.878 "raid": { 00:33:37.878 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:37.878 "strip_size_kb": 0, 00:33:37.878 "state": "online", 00:33:37.878 "raid_level": "raid1", 00:33:37.878 "superblock": true, 00:33:37.878 "num_base_bdevs": 4, 00:33:37.878 "num_base_bdevs_discovered": 4, 00:33:37.878 "num_base_bdevs_operational": 4, 00:33:37.878 "base_bdevs_list": [ 00:33:37.878 { 00:33:37.878 "name": "pt1", 00:33:37.878 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:37.878 "is_configured": true, 00:33:37.878 "data_offset": 2048, 00:33:37.878 "data_size": 63488 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "name": "pt2", 00:33:37.878 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:37.878 "is_configured": true, 00:33:37.878 "data_offset": 2048, 00:33:37.878 "data_size": 63488 
00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "name": "pt3", 00:33:37.878 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:37.878 "is_configured": true, 00:33:37.878 "data_offset": 2048, 00:33:37.878 "data_size": 63488 00:33:37.878 }, 00:33:37.878 { 00:33:37.878 "name": "pt4", 00:33:37.878 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:37.878 "is_configured": true, 00:33:37.878 "data_offset": 2048, 00:33:37.878 "data_size": 63488 00:33:37.878 } 00:33:37.878 ] 00:33:37.878 } 00:33:37.878 } 00:33:37.878 }' 00:33:37.878 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:37.878 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:33:37.878 pt2 00:33:37.878 pt3 00:33:37.878 pt4' 00:33:37.878 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:37.878 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:33:37.878 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:38.135 "name": "pt1", 00:33:38.135 "aliases": [ 00:33:38.135 "00000000-0000-0000-0000-000000000001" 00:33:38.135 ], 00:33:38.135 "product_name": "passthru", 00:33:38.135 "block_size": 512, 00:33:38.135 "num_blocks": 65536, 00:33:38.135 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:38.135 "assigned_rate_limits": { 00:33:38.135 "rw_ios_per_sec": 0, 00:33:38.135 "rw_mbytes_per_sec": 0, 00:33:38.135 "r_mbytes_per_sec": 0, 00:33:38.135 "w_mbytes_per_sec": 0 00:33:38.135 }, 00:33:38.135 "claimed": true, 00:33:38.135 "claim_type": "exclusive_write", 00:33:38.135 "zoned": false, 00:33:38.135 "supported_io_types": { 00:33:38.135 "read": true, 00:33:38.135 "write": true, 00:33:38.135 "unmap": true, 00:33:38.135 "write_zeroes": true, 00:33:38.135 "flush": true, 00:33:38.135 "reset": true, 00:33:38.135 "compare": false, 00:33:38.135 "compare_and_write": false, 00:33:38.135 "abort": true, 00:33:38.135 "nvme_admin": false, 00:33:38.135 "nvme_io": false 00:33:38.135 }, 00:33:38.135 "memory_domains": [ 00:33:38.135 { 00:33:38.135 "dma_device_id": "system", 00:33:38.135 "dma_device_type": 1 00:33:38.135 }, 00:33:38.135 { 00:33:38.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:38.135 "dma_device_type": 2 00:33:38.135 } 00:33:38.135 ], 00:33:38.135 "driver_specific": { 00:33:38.135 "passthru": { 00:33:38.135 "name": "pt1", 00:33:38.135 "base_bdev_name": "malloc1" 00:33:38.135 } 00:33:38.135 } 00:33:38.135 }' 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:38.135 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:33:38.393 12:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:38.651 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:38.651 "name": "pt2", 00:33:38.651 "aliases": [ 00:33:38.651 "00000000-0000-0000-0000-000000000002" 00:33:38.651 ], 00:33:38.651 "product_name": "passthru", 00:33:38.651 "block_size": 512, 00:33:38.651 "num_blocks": 65536, 00:33:38.651 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:38.651 "assigned_rate_limits": { 00:33:38.651 "rw_ios_per_sec": 0, 00:33:38.651 "rw_mbytes_per_sec": 0, 00:33:38.651 "r_mbytes_per_sec": 0, 00:33:38.651 "w_mbytes_per_sec": 0 00:33:38.651 }, 00:33:38.651 "claimed": true, 00:33:38.651 "claim_type": "exclusive_write", 00:33:38.651 "zoned": false, 00:33:38.651 "supported_io_types": { 00:33:38.651 "read": true, 00:33:38.651 "write": true, 00:33:38.651 "unmap": true, 00:33:38.651 "write_zeroes": true, 00:33:38.651 "flush": true, 00:33:38.651 "reset": true, 00:33:38.651 "compare": false, 00:33:38.651 "compare_and_write": false, 00:33:38.651 "abort": true, 00:33:38.651 "nvme_admin": false, 00:33:38.651 "nvme_io": false 00:33:38.651 }, 00:33:38.651 "memory_domains": [ 00:33:38.651 { 00:33:38.651 "dma_device_id": "system", 00:33:38.651 "dma_device_type": 1 00:33:38.651 }, 00:33:38.651 { 00:33:38.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:38.651 "dma_device_type": 2 00:33:38.651 } 00:33:38.651 ], 00:33:38.651 "driver_specific": { 00:33:38.651 "passthru": { 00:33:38.651 "name": "pt2", 00:33:38.651 "base_bdev_name": "malloc2" 00:33:38.651 } 00:33:38.651 } 00:33:38.651 }' 00:33:38.651 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:38.651 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:38.651 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:38.651 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:33:38.910 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:39.169 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:39.169 "name": "pt3", 00:33:39.169 "aliases": [ 00:33:39.169 "00000000-0000-0000-0000-000000000003" 00:33:39.169 ], 00:33:39.169 "product_name": "passthru", 00:33:39.169 "block_size": 512, 00:33:39.169 "num_blocks": 65536, 00:33:39.169 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:39.169 "assigned_rate_limits": { 00:33:39.169 "rw_ios_per_sec": 0, 00:33:39.169 "rw_mbytes_per_sec": 0, 00:33:39.169 "r_mbytes_per_sec": 0, 00:33:39.169 "w_mbytes_per_sec": 0 00:33:39.169 }, 00:33:39.169 "claimed": true, 00:33:39.169 "claim_type": "exclusive_write", 00:33:39.169 "zoned": false, 00:33:39.169 "supported_io_types": { 00:33:39.169 "read": true, 00:33:39.169 "write": true, 00:33:39.169 "unmap": true, 00:33:39.169 "write_zeroes": true, 00:33:39.169 "flush": true, 00:33:39.169 "reset": true, 00:33:39.169 "compare": false, 00:33:39.169 "compare_and_write": false, 00:33:39.169 "abort": true, 00:33:39.169 "nvme_admin": false, 00:33:39.169 "nvme_io": false 00:33:39.169 }, 00:33:39.169 "memory_domains": [ 00:33:39.169 { 00:33:39.169 "dma_device_id": "system", 00:33:39.169 "dma_device_type": 1 00:33:39.169 }, 00:33:39.169 { 00:33:39.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:39.169 "dma_device_type": 2 00:33:39.169 } 00:33:39.169 ], 00:33:39.169 "driver_specific": { 00:33:39.169 "passthru": { 00:33:39.169 "name": "pt3", 00:33:39.169 "base_bdev_name": "malloc3" 00:33:39.169 } 00:33:39.169 } 00:33:39.169 }' 00:33:39.169 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:39.169 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:39.169 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:39.169 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:39.429 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:39.429 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:39.429 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:39.429 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:39.429 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:39.429 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:39.429 12:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:39.429 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:39.429 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:39.429 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:33:39.429 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:39.720 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:39.720 "name": "pt4", 
00:33:39.720 "aliases": [ 00:33:39.720 "00000000-0000-0000-0000-000000000004" 00:33:39.720 ], 00:33:39.720 "product_name": "passthru", 00:33:39.720 "block_size": 512, 00:33:39.720 "num_blocks": 65536, 00:33:39.720 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:39.720 "assigned_rate_limits": { 00:33:39.720 "rw_ios_per_sec": 0, 00:33:39.720 "rw_mbytes_per_sec": 0, 00:33:39.720 "r_mbytes_per_sec": 0, 00:33:39.720 "w_mbytes_per_sec": 0 00:33:39.720 }, 00:33:39.720 "claimed": true, 00:33:39.720 "claim_type": "exclusive_write", 00:33:39.720 "zoned": false, 00:33:39.720 "supported_io_types": { 00:33:39.720 "read": true, 00:33:39.720 "write": true, 00:33:39.720 "unmap": true, 00:33:39.720 "write_zeroes": true, 00:33:39.720 "flush": true, 00:33:39.720 "reset": true, 00:33:39.720 "compare": false, 00:33:39.720 "compare_and_write": false, 00:33:39.720 "abort": true, 00:33:39.720 "nvme_admin": false, 00:33:39.720 "nvme_io": false 00:33:39.720 }, 00:33:39.720 "memory_domains": [ 00:33:39.720 { 00:33:39.720 "dma_device_id": "system", 00:33:39.720 "dma_device_type": 1 00:33:39.720 }, 00:33:39.720 { 00:33:39.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:39.720 "dma_device_type": 2 00:33:39.720 } 00:33:39.720 ], 00:33:39.720 "driver_specific": { 00:33:39.720 "passthru": { 00:33:39.720 "name": "pt4", 00:33:39.720 "base_bdev_name": "malloc4" 00:33:39.720 } 00:33:39.720 } 00:33:39.720 }' 00:33:39.720 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:33:39.978 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:40.236 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:40.236 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:33:40.237 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:40.237 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:33:40.237 [2024-06-07 12:36:03.877435] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:40.495 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0a569c85-d4ff-43c1-a046-ff2ee94b5c9d '!=' 0a569c85-d4ff-43c1-a046-ff2ee94b5c9d ']' 00:33:40.495 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:33:40.495 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:40.495 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:33:40.495 12:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:33:40.754 [2024-06-07 12:36:04.153320] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:40.754 "name": "raid_bdev1", 00:33:40.754 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:40.754 "strip_size_kb": 0, 00:33:40.754 "state": "online", 00:33:40.754 "raid_level": "raid1", 00:33:40.754 "superblock": true, 00:33:40.754 "num_base_bdevs": 4, 00:33:40.754 "num_base_bdevs_discovered": 3, 00:33:40.754 "num_base_bdevs_operational": 3, 00:33:40.754 "base_bdevs_list": [ 00:33:40.754 { 00:33:40.754 "name": null, 00:33:40.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:40.754 "is_configured": false, 00:33:40.754 "data_offset": 2048, 00:33:40.754 "data_size": 63488 00:33:40.754 }, 00:33:40.754 { 00:33:40.754 "name": "pt2", 00:33:40.754 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:40.754 "is_configured": true, 00:33:40.754 "data_offset": 2048, 00:33:40.754 "data_size": 63488 00:33:40.754 }, 00:33:40.754 { 00:33:40.754 "name": "pt3", 00:33:40.754 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:40.754 "is_configured": true, 00:33:40.754 "data_offset": 2048, 00:33:40.754 "data_size": 63488 00:33:40.754 }, 00:33:40.754 { 00:33:40.754 "name": "pt4", 00:33:40.754 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:40.754 "is_configured": true, 00:33:40.754 "data_offset": 2048, 00:33:40.754 "data_size": 63488 00:33:40.754 } 00:33:40.754 ] 00:33:40.754 }' 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:40.754 12:36:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:41.321 12:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:41.580 [2024-06-07 12:36:05.209378] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:41.580 [2024-06-07 12:36:05.209430] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:41.580 [2024-06-07 12:36:05.209497] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:41.580 [2024-06-07 12:36:05.209558] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:41.580 [2024-06-07 12:36:05.209568] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:33:41.838 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:41.838 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:33:41.838 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:33:41.838 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:33:41.838 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:33:41.838 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:33:42.096 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:33:42.355 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:33:42.355 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:33:42.355 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:33:42.355 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:33:42.355 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:33:42.355 12:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:33:42.614 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:33:42.614 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:33:42.614 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:33:42.614 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:33:42.614 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:42.872 [2024-06-07 12:36:06.337471] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:42.872 [2024-06-07 12:36:06.337599] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:42.872 [2024-06-07 12:36:06.337634] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a580 00:33:42.872 [2024-06-07 12:36:06.337663] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:42.872 [2024-06-07 12:36:06.339812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:42.872 [2024-06-07 12:36:06.339900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:42.872 [2024-06-07 12:36:06.339978] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:33:42.872 [2024-06-07 12:36:06.340009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:42.872 pt2 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:42.872 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:43.130 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:43.130 "name": "raid_bdev1", 00:33:43.130 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:43.130 "strip_size_kb": 0, 00:33:43.130 "state": "configuring", 00:33:43.130 "raid_level": "raid1", 00:33:43.130 "superblock": true, 00:33:43.130 "num_base_bdevs": 4, 00:33:43.130 "num_base_bdevs_discovered": 1, 00:33:43.130 "num_base_bdevs_operational": 3, 00:33:43.130 "base_bdevs_list": [ 00:33:43.130 { 00:33:43.130 "name": null, 00:33:43.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:43.130 "is_configured": false, 00:33:43.130 "data_offset": 2048, 00:33:43.130 "data_size": 63488 00:33:43.130 }, 00:33:43.130 { 00:33:43.130 "name": "pt2", 00:33:43.130 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:43.130 "is_configured": true, 00:33:43.130 "data_offset": 2048, 00:33:43.130 "data_size": 63488 00:33:43.130 }, 00:33:43.130 { 00:33:43.130 "name": null, 00:33:43.130 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:43.130 "is_configured": false, 00:33:43.130 "data_offset": 2048, 00:33:43.130 "data_size": 63488 00:33:43.130 }, 00:33:43.130 { 00:33:43.130 "name": null, 00:33:43.130 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:43.130 "is_configured": false, 00:33:43.130 "data_offset": 2048, 00:33:43.130 "data_size": 63488 00:33:43.130 } 00:33:43.130 ] 00:33:43.130 }' 00:33:43.130 12:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:43.130 12:36:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:43.696 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:33:43.696 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:33:43.696 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:33:43.955 [2024-06-07 12:36:07.377670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:33:43.955 [2024-06-07 12:36:07.378031] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:43.955 [2024-06-07 12:36:07.378115] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000ae80 00:33:43.955 [2024-06-07 12:36:07.378239] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:43.955 [2024-06-07 12:36:07.378690] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:43.955 [2024-06-07 12:36:07.378889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:33:43.955 [2024-06-07 12:36:07.379078] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:33:43.955 [2024-06-07 12:36:07.379137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:33:43.955 pt3 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:43.955 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:44.214 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:44.214 "name": "raid_bdev1", 00:33:44.214 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:44.214 "strip_size_kb": 0, 00:33:44.214 "state": "configuring", 00:33:44.214 "raid_level": "raid1", 00:33:44.214 "superblock": true, 00:33:44.214 "num_base_bdevs": 4, 00:33:44.214 "num_base_bdevs_discovered": 2, 00:33:44.214 "num_base_bdevs_operational": 3, 00:33:44.214 "base_bdevs_list": [ 00:33:44.214 { 00:33:44.214 "name": null, 00:33:44.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:44.214 "is_configured": false, 00:33:44.214 "data_offset": 2048, 00:33:44.214 "data_size": 63488 00:33:44.214 }, 00:33:44.214 { 00:33:44.214 "name": "pt2", 00:33:44.214 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:44.214 "is_configured": true, 00:33:44.214 "data_offset": 2048, 00:33:44.214 "data_size": 63488 00:33:44.214 }, 00:33:44.214 { 00:33:44.214 "name": "pt3", 00:33:44.214 
"uuid": "00000000-0000-0000-0000-000000000003", 00:33:44.214 "is_configured": true, 00:33:44.214 "data_offset": 2048, 00:33:44.214 "data_size": 63488 00:33:44.214 }, 00:33:44.214 { 00:33:44.214 "name": null, 00:33:44.214 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:44.214 "is_configured": false, 00:33:44.214 "data_offset": 2048, 00:33:44.214 "data_size": 63488 00:33:44.214 } 00:33:44.214 ] 00:33:44.214 }' 00:33:44.214 12:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:44.214 12:36:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:44.779 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:33:44.779 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:33:44.779 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:33:44.779 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:33:45.036 [2024-06-07 12:36:08.445811] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:33:45.036 [2024-06-07 12:36:08.446152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:45.036 [2024-06-07 12:36:08.446271] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000b180 00:33:45.036 [2024-06-07 12:36:08.446374] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:45.036 [2024-06-07 12:36:08.446799] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:45.036 [2024-06-07 12:36:08.446951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:33:45.036 [2024-06-07 12:36:08.447108] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:33:45.036 [2024-06-07 12:36:08.447162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:33:45.036 [2024-06-07 12:36:08.447454] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600000ab80 00:33:45.036 [2024-06-07 12:36:08.447565] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:33:45.036 [2024-06-07 12:36:08.447685] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002e20 00:33:45.036 [2024-06-07 12:36:08.448038] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600000ab80 00:33:45.037 [2024-06-07 12:36:08.448147] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600000ab80 00:33:45.037 [2024-06-07 12:36:08.448328] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:45.037 pt4 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=3 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:45.037 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:45.294 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:45.294 "name": "raid_bdev1", 00:33:45.294 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:45.294 "strip_size_kb": 0, 00:33:45.294 "state": "online", 00:33:45.294 "raid_level": "raid1", 00:33:45.294 "superblock": true, 00:33:45.294 "num_base_bdevs": 4, 00:33:45.294 "num_base_bdevs_discovered": 3, 00:33:45.294 "num_base_bdevs_operational": 3, 00:33:45.294 "base_bdevs_list": [ 00:33:45.294 { 00:33:45.294 "name": null, 00:33:45.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:45.294 "is_configured": false, 00:33:45.294 "data_offset": 2048, 00:33:45.294 "data_size": 63488 00:33:45.294 }, 00:33:45.294 { 00:33:45.294 "name": "pt2", 00:33:45.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:45.294 "is_configured": true, 00:33:45.294 "data_offset": 2048, 00:33:45.294 "data_size": 63488 00:33:45.294 }, 00:33:45.294 { 00:33:45.294 "name": "pt3", 00:33:45.294 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:45.294 "is_configured": true, 00:33:45.294 "data_offset": 2048, 00:33:45.294 "data_size": 63488 00:33:45.294 }, 00:33:45.294 { 00:33:45.294 "name": "pt4", 00:33:45.294 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:45.294 "is_configured": true, 00:33:45.294 "data_offset": 2048, 00:33:45.294 "data_size": 63488 00:33:45.294 } 00:33:45.294 ] 00:33:45.294 }' 00:33:45.294 12:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:45.294 12:36:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:45.858 12:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:46.116 [2024-06-07 12:36:09.557941] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:46.116 [2024-06-07 12:36:09.558182] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:46.116 [2024-06-07 12:36:09.558346] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:46.116 [2024-06-07 12:36:09.558434] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:46.116 [2024-06-07 12:36:09.558698] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600000ab80 name raid_bdev1, state offline 00:33:46.116 12:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:46.116 12:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:33:46.374 12:36:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:33:46.374 12:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:33:46.374 12:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:33:46.374 12:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:33:46.374 12:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:33:46.632 [2024-06-07 12:36:10.210042] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:33:46.632 [2024-06-07 12:36:10.210365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:46.632 [2024-06-07 12:36:10.210511] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000b480 00:33:46.632 [2024-06-07 12:36:10.210623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:46.632 [2024-06-07 12:36:10.213148] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:46.632 [2024-06-07 12:36:10.213374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:33:46.632 [2024-06-07 12:36:10.213572] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:33:46.632 [2024-06-07 12:36:10.213726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:33:46.632 [2024-06-07 12:36:10.213947] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:33:46.632 [2024-06-07 12:36:10.214051] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:46.632 [2024-06-07 12:36:10.214111] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600000ba80 name raid_bdev1, state configuring 00:33:46.632 [2024-06-07 12:36:10.214269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:46.632 [2024-06-07 12:36:10.214499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:33:46.632 pt1 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:46.632 12:36:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:46.632 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:46.958 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:46.958 "name": "raid_bdev1", 00:33:46.958 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:46.958 "strip_size_kb": 0, 00:33:46.958 "state": "configuring", 00:33:46.958 "raid_level": "raid1", 00:33:46.958 "superblock": true, 00:33:46.958 "num_base_bdevs": 4, 00:33:46.958 "num_base_bdevs_discovered": 2, 00:33:46.958 "num_base_bdevs_operational": 3, 00:33:46.958 "base_bdevs_list": [ 00:33:46.958 { 00:33:46.958 "name": null, 00:33:46.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:46.958 "is_configured": false, 00:33:46.958 "data_offset": 2048, 00:33:46.958 "data_size": 63488 00:33:46.958 }, 00:33:46.958 { 00:33:46.958 "name": "pt2", 00:33:46.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:46.958 "is_configured": true, 00:33:46.958 "data_offset": 2048, 00:33:46.958 "data_size": 63488 00:33:46.958 }, 00:33:46.958 { 00:33:46.958 "name": "pt3", 00:33:46.958 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:46.958 "is_configured": true, 00:33:46.958 "data_offset": 2048, 00:33:46.958 "data_size": 63488 00:33:46.958 }, 00:33:46.958 { 00:33:46.958 "name": null, 00:33:46.958 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:46.958 "is_configured": false, 00:33:46.958 "data_offset": 2048, 00:33:46.958 "data_size": 63488 00:33:46.958 } 00:33:46.958 ] 00:33:46.958 }' 00:33:46.958 12:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:46.958 12:36:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:47.533 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:33:47.533 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:33:47.792 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:33:47.792 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:33:48.050 [2024-06-07 12:36:11.602235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:33:48.050 [2024-06-07 12:36:11.602644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:48.050 [2024-06-07 12:36:11.602751] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000c080 00:33:48.050 [2024-06-07 12:36:11.602871] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:48.050 [2024-06-07 12:36:11.603345] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:48.050 [2024-06-07 12:36:11.603552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:33:48.050 [2024-06-07 12:36:11.603702] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:33:48.050 [2024-06-07 12:36:11.603778] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:33:48.050 [2024-06-07 12:36:11.604009] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600000bd80 00:33:48.050 [2024-06-07 12:36:11.604053] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:33:48.050 [2024-06-07 12:36:11.604159] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000003090 00:33:48.050 [2024-06-07 12:36:11.604467] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600000bd80 00:33:48.050 [2024-06-07 12:36:11.604521] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600000bd80 00:33:48.050 [2024-06-07 12:36:11.604720] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:48.050 pt4 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:48.050 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:48.308 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:48.308 "name": "raid_bdev1", 00:33:48.308 "uuid": "0a569c85-d4ff-43c1-a046-ff2ee94b5c9d", 00:33:48.308 "strip_size_kb": 0, 00:33:48.308 "state": "online", 00:33:48.308 "raid_level": "raid1", 00:33:48.308 "superblock": true, 00:33:48.308 "num_base_bdevs": 4, 00:33:48.308 "num_base_bdevs_discovered": 3, 00:33:48.308 "num_base_bdevs_operational": 3, 00:33:48.308 "base_bdevs_list": [ 00:33:48.308 { 00:33:48.308 "name": null, 00:33:48.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:48.308 "is_configured": false, 00:33:48.308 "data_offset": 2048, 00:33:48.308 "data_size": 63488 00:33:48.308 }, 00:33:48.308 { 00:33:48.308 "name": "pt2", 00:33:48.308 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:48.308 "is_configured": true, 00:33:48.308 "data_offset": 2048, 00:33:48.308 "data_size": 63488 00:33:48.308 }, 00:33:48.308 { 00:33:48.308 "name": "pt3", 00:33:48.308 "uuid": "00000000-0000-0000-0000-000000000003", 00:33:48.308 "is_configured": true, 00:33:48.308 "data_offset": 2048, 00:33:48.308 "data_size": 63488 00:33:48.308 }, 00:33:48.308 { 00:33:48.308 "name": "pt4", 00:33:48.308 "uuid": "00000000-0000-0000-0000-000000000004", 00:33:48.308 
"is_configured": true, 00:33:48.308 "data_offset": 2048, 00:33:48.308 "data_size": 63488 00:33:48.308 } 00:33:48.308 ] 00:33:48.308 }' 00:33:48.308 12:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:48.308 12:36:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:49.240 12:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:33:49.240 12:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:33:49.240 12:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:33:49.240 12:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:49.240 12:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:33:49.497 [2024-06-07 12:36:12.942585] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 0a569c85-d4ff-43c1-a046-ff2ee94b5c9d '!=' 0a569c85-d4ff-43c1-a046-ff2ee94b5c9d ']' 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 220008 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 220008 ']' 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 220008 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 220008 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:49.497 12:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:49.497 12:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 220008' 00:33:49.497 killing process with pid 220008 00:33:49.498 12:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 220008 00:33:49.498 [2024-06-07 12:36:13.002576] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:49.498 12:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 220008 00:33:49.498 [2024-06-07 12:36:13.002797] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:49.498 [2024-06-07 12:36:13.002871] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:49.498 [2024-06-07 12:36:13.002882] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600000bd80 name raid_bdev1, state offline 00:33:49.498 [2024-06-07 12:36:13.087859] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:50.062 12:36:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:33:50.062 00:33:50.062 real 0m24.966s 00:33:50.062 user 0m45.415s 00:33:50.062 sys 0m4.302s 00:33:50.062 12:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:50.062 12:36:13 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:33:50.062 ************************************ 00:33:50.062 END TEST raid_superblock_test 00:33:50.062 ************************************ 00:33:50.062 12:36:13 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:33:50.063 12:36:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:33:50.063 12:36:13 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:50.063 12:36:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:33:50.063 ************************************ 00:33:50.063 START TEST raid_read_error_test 00:33:50.063 ************************************ 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 read 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev4 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 
00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.tRNlsAAMHg 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=220846 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 220846 /var/tmp/spdk-raid.sock 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 220846 ']' 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:50.063 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:50.063 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:33:50.063 [2024-06-07 12:36:13.570060] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:33:50.063 [2024-06-07 12:36:13.570995] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid220846 ] 00:33:50.063 [2024-06-07 12:36:13.705288] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:50.374 [2024-06-07 12:36:13.797535] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:50.374 [2024-06-07 12:36:13.878264] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:50.374 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:50.374 12:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:33:50.374 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:33:50.374 12:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:33:50.671 BaseBdev1_malloc 00:33:50.671 12:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:33:50.931 true 00:33:50.931 12:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:33:51.190 [2024-06-07 12:36:14.757728] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:33:51.190 [2024-06-07 12:36:14.758469] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:51.190 [2024-06-07 12:36:14.758706] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:33:51.190 [2024-06-07 12:36:14.758934] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:51.190 [2024-06-07 12:36:14.761423] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:51.190 [2024-06-07 12:36:14.761645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:33:51.190 BaseBdev1 00:33:51.190 12:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:33:51.190 12:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:33:51.448 BaseBdev2_malloc 00:33:51.449 12:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:33:51.708 true 00:33:51.708 12:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:33:51.966 [2024-06-07 12:36:15.385354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:33:51.966 [2024-06-07 12:36:15.385758] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:51.966 [2024-06-07 12:36:15.385972] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:33:51.966 [2024-06-07 12:36:15.386200] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:51.966 [2024-06-07 12:36:15.388685] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:51.966 [2024-06-07 12:36:15.388897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:33:51.966 BaseBdev2 00:33:51.966 12:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:33:51.966 12:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:33:52.224 BaseBdev3_malloc 00:33:52.224 12:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:33:52.224 true 00:33:52.482 12:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:33:52.482 [2024-06-07 12:36:16.116103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:33:52.482 [2024-06-07 12:36:16.116854] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:52.482 [2024-06-07 12:36:16.117095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:33:52.482 [2024-06-07 12:36:16.117330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:52.482 [2024-06-07 12:36:16.119581] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:52.482 [2024-06-07 12:36:16.119826] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:33:52.482 BaseBdev3 00:33:52.741 12:36:16 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:33:52.741 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:33:52.741 BaseBdev4_malloc 00:33:52.742 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:33:52.999 true 00:33:52.999 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:33:53.258 [2024-06-07 12:36:16.747877] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:33:53.258 [2024-06-07 12:36:16.748291] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:53.258 [2024-06-07 12:36:16.748507] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:33:53.258 [2024-06-07 12:36:16.748754] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:53.258 [2024-06-07 12:36:16.751050] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:53.258 [2024-06-07 12:36:16.751279] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:33:53.258 BaseBdev4 00:33:53.258 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:33:53.517 [2024-06-07 12:36:16.964131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:53.517 [2024-06-07 12:36:16.966377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:53.517 [2024-06-07 12:36:16.966549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:33:53.517 [2024-06-07 12:36:16.966626] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:33:53.517 [2024-06-07 12:36:16.966873] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009080 00:33:53.517 [2024-06-07 12:36:16.966969] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:33:53.517 [2024-06-07 12:36:16.967134] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:33:53.517 [2024-06-07 12:36:16.967603] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009080 00:33:53.517 [2024-06-07 12:36:16.967706] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009080 00:33:53.517 [2024-06-07 12:36:16.967970] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:53.517 12:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:53.776 12:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:53.776 "name": "raid_bdev1", 00:33:53.776 "uuid": "80f8dc70-4ccb-46bd-bdf8-864baeec94c6", 00:33:53.776 "strip_size_kb": 0, 00:33:53.776 "state": "online", 00:33:53.776 "raid_level": "raid1", 00:33:53.776 "superblock": true, 00:33:53.776 "num_base_bdevs": 4, 00:33:53.776 "num_base_bdevs_discovered": 4, 00:33:53.776 "num_base_bdevs_operational": 4, 00:33:53.776 "base_bdevs_list": [ 00:33:53.776 { 00:33:53.776 "name": "BaseBdev1", 00:33:53.776 "uuid": "342ce9c1-e5c4-5433-bf18-7cf83bc2d21e", 00:33:53.776 "is_configured": true, 00:33:53.776 "data_offset": 2048, 00:33:53.776 "data_size": 63488 00:33:53.776 }, 00:33:53.776 { 00:33:53.776 "name": "BaseBdev2", 00:33:53.776 "uuid": "8e0d5062-b18e-51c1-b0c2-eac4d31c8514", 00:33:53.776 "is_configured": true, 00:33:53.776 "data_offset": 2048, 00:33:53.776 "data_size": 63488 00:33:53.776 }, 00:33:53.776 { 00:33:53.776 "name": "BaseBdev3", 00:33:53.776 "uuid": "d8e14193-82ca-57d7-8ff0-aa4cd6560679", 00:33:53.776 "is_configured": true, 00:33:53.776 "data_offset": 2048, 00:33:53.776 "data_size": 63488 00:33:53.776 }, 00:33:53.776 { 00:33:53.776 "name": "BaseBdev4", 00:33:53.776 "uuid": "fa6b5eeb-7ee4-5b93-a1fe-9ad2d52d646a", 00:33:53.776 "is_configured": true, 00:33:53.776 "data_offset": 2048, 00:33:53.776 "data_size": 63488 00:33:53.776 } 00:33:53.776 ] 00:33:53.776 }' 00:33:53.776 12:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:53.776 12:36:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:33:54.344 12:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:33:54.344 12:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:33:54.344 [2024-06-07 12:36:17.804447] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:33:55.279 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # 
expected_num_base_bdevs=4 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:55.537 12:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:55.537 12:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:55.537 12:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:55.795 12:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:55.795 "name": "raid_bdev1", 00:33:55.795 "uuid": "80f8dc70-4ccb-46bd-bdf8-864baeec94c6", 00:33:55.795 "strip_size_kb": 0, 00:33:55.795 "state": "online", 00:33:55.795 "raid_level": "raid1", 00:33:55.795 "superblock": true, 00:33:55.795 "num_base_bdevs": 4, 00:33:55.795 "num_base_bdevs_discovered": 4, 00:33:55.795 "num_base_bdevs_operational": 4, 00:33:55.795 "base_bdevs_list": [ 00:33:55.795 { 00:33:55.795 "name": "BaseBdev1", 00:33:55.795 "uuid": "342ce9c1-e5c4-5433-bf18-7cf83bc2d21e", 00:33:55.795 "is_configured": true, 00:33:55.795 "data_offset": 2048, 00:33:55.795 "data_size": 63488 00:33:55.795 }, 00:33:55.795 { 00:33:55.795 "name": "BaseBdev2", 00:33:55.795 "uuid": "8e0d5062-b18e-51c1-b0c2-eac4d31c8514", 00:33:55.795 "is_configured": true, 00:33:55.795 "data_offset": 2048, 00:33:55.795 "data_size": 63488 00:33:55.795 }, 00:33:55.795 { 00:33:55.795 "name": "BaseBdev3", 00:33:55.795 "uuid": "d8e14193-82ca-57d7-8ff0-aa4cd6560679", 00:33:55.795 "is_configured": true, 00:33:55.795 "data_offset": 2048, 00:33:55.795 "data_size": 63488 00:33:55.795 }, 00:33:55.795 { 00:33:55.795 "name": "BaseBdev4", 00:33:55.795 "uuid": "fa6b5eeb-7ee4-5b93-a1fe-9ad2d52d646a", 00:33:55.795 "is_configured": true, 00:33:55.795 "data_offset": 2048, 00:33:55.795 "data_size": 63488 00:33:55.795 } 00:33:55.795 ] 00:33:55.795 }' 00:33:55.795 12:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:55.795 12:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:33:56.374 12:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:56.633 [2024-06-07 12:36:20.193162] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:56.633 [2024-06-07 12:36:20.193745] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:56.633 [2024-06-07 12:36:20.195080] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:56.633 [2024-06-07 12:36:20.195244] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:56.633 [2024-06-07 12:36:20.195465] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:56.633 [2024-06-07 12:36:20.195573] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009080 name raid_bdev1, state offline 00:33:56.633 0 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 220846 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 220846 ']' 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 220846 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 220846 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 220846' 00:33:56.633 killing process with pid 220846 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 220846 00:33:56.633 [2024-06-07 12:36:20.258547] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:56.633 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 220846 00:33:56.892 [2024-06-07 12:36:20.323968] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.tRNlsAAMHg 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:33:57.151 00:33:57.151 real 0m7.188s 00:33:57.151 user 0m11.486s 00:33:57.151 sys 0m1.272s 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:57.151 12:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:33:57.151 ************************************ 00:33:57.151 END TEST raid_read_error_test 00:33:57.151 ************************************ 00:33:57.151 12:36:20 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:33:57.151 12:36:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:33:57.151 12:36:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:57.151 12:36:20 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:33:57.151 ************************************ 00:33:57.151 START TEST raid_write_error_test 00:33:57.151 ************************************ 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 write 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev1 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev2 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev3 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # echo BaseBdev4 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.00Mlf6NjU7 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=221040 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 221040 /var/tmp/spdk-raid.sock 00:33:57.151 12:36:20 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@830 -- # '[' -z 221040 ']' 00:33:57.409 12:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:33:57.409 12:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:57.409 12:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:57.409 12:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:57.409 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:33:57.409 12:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:57.409 12:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:33:57.409 [2024-06-07 12:36:20.834483] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:33:57.409 [2024-06-07 12:36:20.834731] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid221040 ] 00:33:57.409 [2024-06-07 12:36:20.984841] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:57.668 [2024-06-07 12:36:21.083844] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:57.668 [2024-06-07 12:36:21.170240] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:58.236 12:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:58.236 12:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:33:58.236 12:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:33:58.236 12:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:33:58.494 BaseBdev1_malloc 00:33:58.494 12:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:33:58.752 true 00:33:58.752 12:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:33:58.752 [2024-06-07 12:36:22.360162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:33:58.752 [2024-06-07 12:36:22.360765] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:58.752 [2024-06-07 12:36:22.360891] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005d80 00:33:58.752 [2024-06-07 12:36:22.360996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:58.752 [2024-06-07 12:36:22.363335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:58.752 [2024-06-07 12:36:22.363473] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:33:58.752 BaseBdev1 00:33:58.752 12:36:22 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:33:58.752 12:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:33:59.011 BaseBdev2_malloc 00:33:59.011 12:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:33:59.276 true 00:33:59.276 12:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:33:59.534 [2024-06-07 12:36:22.951357] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:33:59.534 [2024-06-07 12:36:22.951619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:59.534 [2024-06-07 12:36:22.951717] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006c80 00:33:59.534 [2024-06-07 12:36:22.951806] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:59.534 [2024-06-07 12:36:22.953979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:59.534 [2024-06-07 12:36:22.954094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:33:59.534 BaseBdev2 00:33:59.534 12:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:33:59.534 12:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:33:59.534 BaseBdev3_malloc 00:33:59.795 12:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:33:59.795 true 00:33:59.795 12:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:34:00.056 [2024-06-07 12:36:23.557799] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:34:00.056 [2024-06-07 12:36:23.558389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:00.057 [2024-06-07 12:36:23.558513] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007b80 00:34:00.057 [2024-06-07 12:36:23.558618] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:00.057 [2024-06-07 12:36:23.561054] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:00.057 [2024-06-07 12:36:23.561170] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:34:00.057 BaseBdev3 00:34:00.057 12:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:34:00.057 12:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:34:00.315 BaseBdev4_malloc 00:34:00.315 12:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 
00:34:00.573 true 00:34:00.573 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:34:00.831 [2024-06-07 12:36:24.301536] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:34:00.831 [2024-06-07 12:36:24.303947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:00.831 [2024-06-07 12:36:24.304262] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:34:00.831 [2024-06-07 12:36:24.304551] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:00.831 [2024-06-07 12:36:24.307521] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:00.831 [2024-06-07 12:36:24.307901] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:34:00.831 BaseBdev4 00:34:00.831 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:34:01.090 [2024-06-07 12:36:24.504457] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:01.090 [2024-06-07 12:36:24.506555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:01.090 [2024-06-07 12:36:24.506719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:34:01.090 [2024-06-07 12:36:24.506795] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:34:01.090 [2024-06-07 12:36:24.507035] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009080 00:34:01.090 [2024-06-07 12:36:24.507136] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:34:01.090 [2024-06-07 12:36:24.507297] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:34:01.090 [2024-06-07 12:36:24.507664] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009080 00:34:01.090 [2024-06-07 12:36:24.507767] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009080 00:34:01.090 [2024-06-07 12:36:24.508005] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:01.090 12:36:24 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:01.090 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:01.348 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:01.348 "name": "raid_bdev1", 00:34:01.348 "uuid": "f84cb84f-9fc8-41f8-9d49-d12130f82328", 00:34:01.348 "strip_size_kb": 0, 00:34:01.348 "state": "online", 00:34:01.348 "raid_level": "raid1", 00:34:01.348 "superblock": true, 00:34:01.348 "num_base_bdevs": 4, 00:34:01.348 "num_base_bdevs_discovered": 4, 00:34:01.348 "num_base_bdevs_operational": 4, 00:34:01.348 "base_bdevs_list": [ 00:34:01.348 { 00:34:01.348 "name": "BaseBdev1", 00:34:01.348 "uuid": "e6939a06-aecf-55d5-8da0-59442696ae31", 00:34:01.348 "is_configured": true, 00:34:01.348 "data_offset": 2048, 00:34:01.348 "data_size": 63488 00:34:01.348 }, 00:34:01.348 { 00:34:01.348 "name": "BaseBdev2", 00:34:01.348 "uuid": "531ceadd-b559-517c-a9e9-17c53c294ba0", 00:34:01.348 "is_configured": true, 00:34:01.348 "data_offset": 2048, 00:34:01.348 "data_size": 63488 00:34:01.348 }, 00:34:01.348 { 00:34:01.348 "name": "BaseBdev3", 00:34:01.348 "uuid": "bf7b714d-55cd-5ea0-9492-af953658e47f", 00:34:01.348 "is_configured": true, 00:34:01.348 "data_offset": 2048, 00:34:01.348 "data_size": 63488 00:34:01.348 }, 00:34:01.348 { 00:34:01.348 "name": "BaseBdev4", 00:34:01.348 "uuid": "265eb257-1877-59f0-9372-c8767b6f446a", 00:34:01.348 "is_configured": true, 00:34:01.348 "data_offset": 2048, 00:34:01.348 "data_size": 63488 00:34:01.348 } 00:34:01.348 ] 00:34:01.348 }' 00:34:01.348 12:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:01.348 12:36:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:01.915 12:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:34:01.915 12:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:34:01.915 [2024-06-07 12:36:25.464818] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:34:02.850 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:34:03.108 [2024-06-07 12:36:26.664238] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:34:03.108 [2024-06-07 12:36:26.665057] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:03.108 [2024-06-07 12:36:26.665434] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d0000027a0 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:03.108 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:34:03.109 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:03.109 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:03.109 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:03.109 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:03.109 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:03.109 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:03.373 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:03.373 "name": "raid_bdev1", 00:34:03.373 "uuid": "f84cb84f-9fc8-41f8-9d49-d12130f82328", 00:34:03.373 "strip_size_kb": 0, 00:34:03.373 "state": "online", 00:34:03.373 "raid_level": "raid1", 00:34:03.373 "superblock": true, 00:34:03.373 "num_base_bdevs": 4, 00:34:03.373 "num_base_bdevs_discovered": 3, 00:34:03.373 "num_base_bdevs_operational": 3, 00:34:03.373 "base_bdevs_list": [ 00:34:03.373 { 00:34:03.373 "name": null, 00:34:03.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:03.373 "is_configured": false, 00:34:03.373 "data_offset": 2048, 00:34:03.373 "data_size": 63488 00:34:03.373 }, 00:34:03.373 { 00:34:03.373 "name": "BaseBdev2", 00:34:03.373 "uuid": "531ceadd-b559-517c-a9e9-17c53c294ba0", 00:34:03.373 "is_configured": true, 00:34:03.373 "data_offset": 2048, 00:34:03.373 "data_size": 63488 00:34:03.373 }, 00:34:03.373 { 00:34:03.373 "name": "BaseBdev3", 00:34:03.373 "uuid": "bf7b714d-55cd-5ea0-9492-af953658e47f", 00:34:03.373 "is_configured": true, 00:34:03.373 "data_offset": 2048, 00:34:03.373 "data_size": 63488 00:34:03.373 }, 00:34:03.373 { 00:34:03.373 "name": "BaseBdev4", 00:34:03.373 "uuid": "265eb257-1877-59f0-9372-c8767b6f446a", 00:34:03.373 "is_configured": true, 00:34:03.373 "data_offset": 2048, 00:34:03.373 "data_size": 63488 00:34:03.373 } 00:34:03.373 ] 00:34:03.373 }' 00:34:03.373 12:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:03.373 12:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:04.308 12:36:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:34:04.567 [2024-06-07 12:36:28.001429] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:34:04.567 [2024-06-07 12:36:28.002112] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:04.567 [2024-06-07 12:36:28.003851] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:04.567 
[2024-06-07 12:36:28.004047] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:04.567 [2024-06-07 12:36:28.004170] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:04.567 [2024-06-07 12:36:28.004378] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009080 name raid_bdev1, state offline 00:34:04.567 0 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 221040 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 221040 ']' 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 221040 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 221040 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 221040' 00:34:04.567 killing process with pid 221040 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 221040 00:34:04.567 [2024-06-07 12:36:28.064173] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:34:04.567 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 221040 00:34:04.567 [2024-06-07 12:36:28.134137] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.00Mlf6NjU7 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:34:05.135 00:34:05.135 real 0m7.746s 00:34:05.135 user 0m12.126s 00:34:05.135 sys 0m1.297s 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:34:05.135 12:36:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:34:05.135 ************************************ 00:34:05.135 END TEST raid_write_error_test 00:34:05.135 ************************************ 00:34:05.135 12:36:28 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:34:05.135 12:36:28 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:34:05.135 12:36:28 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:34:05.135 12:36:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 
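The write-error test that just finished mirrors the read-error test above, with one difference in the expected outcome: an injected read failure on raid1 is repaired from redundancy and leaves all 4 base bdevs operational, whereas an injected write failure causes raid1 to fail the base bdev out of the array, so the test expects num_base_bdevs_discovered to drop to 3 (the log shows "Failing base bdev in slot 0 ('BaseBdev1')" accordingly). The following is a minimal reconstruction of the RPC sequence for one base bdev stack plus the error injection, assuming a bdevperf instance already listening on /var/tmp/spdk-raid.sock; paths and sizes follow the log, and the trailing jq projection is an illustrative variant of the filter the script uses, not the test script itself:

    # build one base bdev stack: malloc -> error -> passthru
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    # ...repeat for BaseBdev2..BaseBdev4, then assemble raid1 with a superblock (-s)
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
    # inject a write failure into the first error bdev, then check how many
    # base bdevs the raid still sees (expected: 3 for write, 4 for read)
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered'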
00:34:05.135 12:36:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:34:05.135 12:36:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:34:05.135 ************************************ 00:34:05.135 START TEST raid_rebuild_test 00:34:05.135 ************************************ 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false false true 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=221234 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 221234 /var/tmp/spdk-raid.sock 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 221234 ']' 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:34:05.135 
12:36:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:34:05.135 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:34:05.135 12:36:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:34:05.135 [2024-06-07 12:36:28.639994] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:34:05.135 [2024-06-07 12:36:28.640767] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid221234 ] 00:34:05.135 I/O size of 3145728 is greater than zero copy threshold (65536). 00:34:05.135 Zero copy mechanism will not be used. 00:34:05.135 [2024-06-07 12:36:28.776616] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:05.394 [2024-06-07 12:36:28.862098] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:34:05.394 [2024-06-07 12:36:28.940653] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:06.328 12:36:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:34:06.328 12:36:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:34:06.328 12:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:34:06.328 12:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:34:06.328 BaseBdev1_malloc 00:34:06.328 12:36:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:34:06.587 [2024-06-07 12:36:30.217387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:34:06.587 [2024-06-07 12:36:30.218126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:06.587 [2024-06-07 12:36:30.218403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:34:06.587 [2024-06-07 12:36:30.218621] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:06.587 [2024-06-07 12:36:30.221433] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:06.587 [2024-06-07 12:36:30.221731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:34:06.587 BaseBdev1 00:34:06.846 12:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:34:06.846 12:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:34:07.104 BaseBdev2_malloc 00:34:07.104 12:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:34:07.363 [2024-06-07 12:36:30.825898] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:34:07.363 [2024-06-07 12:36:30.826346] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:07.363 [2024-06-07 12:36:30.826565] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:34:07.363 [2024-06-07 12:36:30.826765] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:07.363 [2024-06-07 12:36:30.829160] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:07.363 [2024-06-07 12:36:30.829380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:34:07.363 BaseBdev2 00:34:07.363 12:36:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:34:07.622 spare_malloc 00:34:07.622 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:34:07.881 spare_delay 00:34:07.881 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:34:08.140 [2024-06-07 12:36:31.720433] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:34:08.140 [2024-06-07 12:36:31.721139] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:08.140 [2024-06-07 12:36:31.721381] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:34:08.140 [2024-06-07 12:36:31.721596] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:08.140 [2024-06-07 12:36:31.724009] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:08.140 [2024-06-07 12:36:31.724251] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:34:08.140 spare 00:34:08.140 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:34:08.398 [2024-06-07 12:36:31.924727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:08.398 [2024-06-07 12:36:31.927000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:08.398 [2024-06-07 12:36:31.927206] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:34:08.398 [2024-06-07 12:36:31.927265] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:34:08.398 [2024-06-07 12:36:31.927492] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:34:08.398 [2024-06-07 12:36:31.927951] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:34:08.398 [2024-06-07 12:36:31.928060] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:34:08.398 [2024-06-07 12:36:31.928327] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:08.398 12:36:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:08.656 12:36:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:08.656 "name": "raid_bdev1", 00:34:08.656 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:08.656 "strip_size_kb": 0, 00:34:08.656 "state": "online", 00:34:08.656 "raid_level": "raid1", 00:34:08.656 "superblock": false, 00:34:08.656 "num_base_bdevs": 2, 00:34:08.656 "num_base_bdevs_discovered": 2, 00:34:08.656 "num_base_bdevs_operational": 2, 00:34:08.656 "base_bdevs_list": [ 00:34:08.656 { 00:34:08.656 "name": "BaseBdev1", 00:34:08.656 "uuid": "e46bc42d-fb50-54ab-98f0-e5b50bf661b3", 00:34:08.656 "is_configured": true, 00:34:08.656 "data_offset": 0, 00:34:08.656 "data_size": 65536 00:34:08.656 }, 00:34:08.656 { 00:34:08.656 "name": "BaseBdev2", 00:34:08.656 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:08.656 "is_configured": true, 00:34:08.656 "data_offset": 0, 00:34:08.656 "data_size": 65536 00:34:08.656 } 00:34:08.656 ] 00:34:08.656 }' 00:34:08.656 12:36:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:08.656 12:36:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:34:09.600 12:36:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:34:09.600 12:36:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:34:09.600 [2024-06-07 12:36:33.213045] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:09.600 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:34:09.600 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:09.600 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock 
raid_bdev1 /dev/nbd0 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:34:09.861 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:34:10.119 [2024-06-07 12:36:33.733046] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:34:10.119 /dev/nbd0 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:10.376 1+0 records in 00:34:10.376 1+0 records out 00:34:10.376 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000391282 s, 10.5 MB/s 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:34:10.376 12:36:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom 
of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:34:14.559 65536+0 records in 00:34:14.559 65536+0 records out 00:34:14.559 33554432 bytes (34 MB, 32 MiB) copied, 3.7335 s, 9.0 MB/s 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:34:14.559 [2024-06-07 12:36:37.834454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:34:14.559 12:36:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:34:14.559 [2024-06-07 12:36:38.022280] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:14.559 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:14.559 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:14.559 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:14.559 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:14.560 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:34:14.818 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:14.818 "name": "raid_bdev1", 00:34:14.818 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:14.818 "strip_size_kb": 0, 00:34:14.818 "state": "online", 00:34:14.818 "raid_level": "raid1", 00:34:14.818 "superblock": false, 00:34:14.818 "num_base_bdevs": 2, 00:34:14.818 "num_base_bdevs_discovered": 1, 00:34:14.818 "num_base_bdevs_operational": 1, 00:34:14.818 "base_bdevs_list": [ 00:34:14.818 { 00:34:14.818 "name": null, 00:34:14.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:14.818 "is_configured": false, 00:34:14.818 "data_offset": 0, 00:34:14.818 "data_size": 65536 00:34:14.818 }, 00:34:14.818 { 00:34:14.818 "name": "BaseBdev2", 00:34:14.818 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:14.818 "is_configured": true, 00:34:14.818 "data_offset": 0, 00:34:14.818 "data_size": 65536 00:34:14.818 } 00:34:14.818 ] 00:34:14.818 }' 00:34:14.818 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:14.818 12:36:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:34:15.391 12:36:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:15.649 [2024-06-07 12:36:39.230521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:15.649 [2024-06-07 12:36:39.238284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d06220 00:34:15.649 [2024-06-07 12:36:39.240446] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:15.649 12:36:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:17.022 "name": "raid_bdev1", 00:34:17.022 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:17.022 "strip_size_kb": 0, 00:34:17.022 "state": "online", 00:34:17.022 "raid_level": "raid1", 00:34:17.022 "superblock": false, 00:34:17.022 "num_base_bdevs": 2, 00:34:17.022 "num_base_bdevs_discovered": 2, 00:34:17.022 "num_base_bdevs_operational": 2, 00:34:17.022 "process": { 00:34:17.022 "type": "rebuild", 00:34:17.022 "target": "spare", 00:34:17.022 "progress": { 00:34:17.022 "blocks": 24576, 00:34:17.022 "percent": 37 00:34:17.022 } 00:34:17.022 }, 00:34:17.022 "base_bdevs_list": [ 00:34:17.022 { 00:34:17.022 "name": "spare", 00:34:17.022 "uuid": "a5286060-7e40-5049-882f-adc4a0efc451", 00:34:17.022 "is_configured": true, 00:34:17.022 "data_offset": 0, 00:34:17.022 
"data_size": 65536 00:34:17.022 }, 00:34:17.022 { 00:34:17.022 "name": "BaseBdev2", 00:34:17.022 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:17.022 "is_configured": true, 00:34:17.022 "data_offset": 0, 00:34:17.022 "data_size": 65536 00:34:17.022 } 00:34:17.022 ] 00:34:17.022 }' 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:17.022 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:34:17.280 [2024-06-07 12:36:40.788625] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:17.280 [2024-06-07 12:36:40.853301] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:34:17.280 [2024-06-07 12:36:40.854004] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:17.280 [2024-06-07 12:36:40.854140] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:17.280 [2024-06-07 12:36:40.854182] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:17.280 12:36:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:17.538 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:17.538 "name": "raid_bdev1", 00:34:17.538 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:17.538 "strip_size_kb": 0, 00:34:17.538 "state": "online", 00:34:17.538 "raid_level": "raid1", 00:34:17.538 "superblock": false, 00:34:17.538 "num_base_bdevs": 2, 00:34:17.538 "num_base_bdevs_discovered": 1, 00:34:17.538 "num_base_bdevs_operational": 1, 00:34:17.538 "base_bdevs_list": [ 00:34:17.538 { 00:34:17.538 "name": null, 00:34:17.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:17.538 "is_configured": false, 
00:34:17.538 "data_offset": 0, 00:34:17.538 "data_size": 65536 00:34:17.538 }, 00:34:17.538 { 00:34:17.538 "name": "BaseBdev2", 00:34:17.538 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:17.538 "is_configured": true, 00:34:17.538 "data_offset": 0, 00:34:17.538 "data_size": 65536 00:34:17.538 } 00:34:17.538 ] 00:34:17.538 }' 00:34:17.538 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:17.538 12:36:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:34:18.474 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:18.474 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:18.474 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:18.474 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:18.474 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:18.474 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:18.474 12:36:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:18.732 12:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:18.732 "name": "raid_bdev1", 00:34:18.732 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:18.732 "strip_size_kb": 0, 00:34:18.732 "state": "online", 00:34:18.732 "raid_level": "raid1", 00:34:18.732 "superblock": false, 00:34:18.732 "num_base_bdevs": 2, 00:34:18.732 "num_base_bdevs_discovered": 1, 00:34:18.732 "num_base_bdevs_operational": 1, 00:34:18.732 "base_bdevs_list": [ 00:34:18.732 { 00:34:18.732 "name": null, 00:34:18.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:18.732 "is_configured": false, 00:34:18.732 "data_offset": 0, 00:34:18.732 "data_size": 65536 00:34:18.732 }, 00:34:18.732 { 00:34:18.732 "name": "BaseBdev2", 00:34:18.732 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:18.732 "is_configured": true, 00:34:18.732 "data_offset": 0, 00:34:18.732 "data_size": 65536 00:34:18.732 } 00:34:18.732 ] 00:34:18.732 }' 00:34:18.732 12:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:18.732 12:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:18.732 12:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:18.732 12:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:18.732 12:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:18.991 [2024-06-07 12:36:42.510470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:18.991 [2024-06-07 12:36:42.518053] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d063c0 00:34:18.991 [2024-06-07 12:36:42.520249] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:18.991 12:36:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:34:19.923 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:34:19.923 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:19.923 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:19.923 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:19.923 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:19.923 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:19.923 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:20.181 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:20.181 "name": "raid_bdev1", 00:34:20.181 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:20.181 "strip_size_kb": 0, 00:34:20.181 "state": "online", 00:34:20.181 "raid_level": "raid1", 00:34:20.181 "superblock": false, 00:34:20.181 "num_base_bdevs": 2, 00:34:20.181 "num_base_bdevs_discovered": 2, 00:34:20.181 "num_base_bdevs_operational": 2, 00:34:20.181 "process": { 00:34:20.181 "type": "rebuild", 00:34:20.181 "target": "spare", 00:34:20.181 "progress": { 00:34:20.181 "blocks": 24576, 00:34:20.181 "percent": 37 00:34:20.181 } 00:34:20.181 }, 00:34:20.181 "base_bdevs_list": [ 00:34:20.181 { 00:34:20.181 "name": "spare", 00:34:20.181 "uuid": "a5286060-7e40-5049-882f-adc4a0efc451", 00:34:20.181 "is_configured": true, 00:34:20.181 "data_offset": 0, 00:34:20.181 "data_size": 65536 00:34:20.181 }, 00:34:20.181 { 00:34:20.181 "name": "BaseBdev2", 00:34:20.181 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:20.181 "is_configured": true, 00:34:20.181 "data_offset": 0, 00:34:20.181 "data_size": 65536 00:34:20.181 } 00:34:20.182 ] 00:34:20.182 }' 00:34:20.182 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:20.182 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:20.182 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=810 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:20.440 12:36:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:20.698 12:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:20.698 "name": "raid_bdev1", 00:34:20.698 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:20.698 "strip_size_kb": 0, 00:34:20.698 "state": "online", 00:34:20.698 "raid_level": "raid1", 00:34:20.698 "superblock": false, 00:34:20.698 "num_base_bdevs": 2, 00:34:20.698 "num_base_bdevs_discovered": 2, 00:34:20.698 "num_base_bdevs_operational": 2, 00:34:20.698 "process": { 00:34:20.698 "type": "rebuild", 00:34:20.698 "target": "spare", 00:34:20.698 "progress": { 00:34:20.698 "blocks": 30720, 00:34:20.698 "percent": 46 00:34:20.698 } 00:34:20.698 }, 00:34:20.698 "base_bdevs_list": [ 00:34:20.698 { 00:34:20.698 "name": "spare", 00:34:20.698 "uuid": "a5286060-7e40-5049-882f-adc4a0efc451", 00:34:20.698 "is_configured": true, 00:34:20.698 "data_offset": 0, 00:34:20.698 "data_size": 65536 00:34:20.698 }, 00:34:20.698 { 00:34:20.698 "name": "BaseBdev2", 00:34:20.698 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:20.698 "is_configured": true, 00:34:20.698 "data_offset": 0, 00:34:20.698 "data_size": 65536 00:34:20.698 } 00:34:20.698 ] 00:34:20.698 }' 00:34:20.698 12:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:20.698 12:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:20.698 12:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:20.698 12:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:20.698 12:36:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:21.650 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:21.907 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:21.907 "name": "raid_bdev1", 00:34:21.907 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:21.907 "strip_size_kb": 0, 00:34:21.907 "state": "online", 00:34:21.907 "raid_level": "raid1", 00:34:21.907 "superblock": false, 00:34:21.907 "num_base_bdevs": 2, 00:34:21.907 "num_base_bdevs_discovered": 2, 00:34:21.907 "num_base_bdevs_operational": 2, 00:34:21.907 "process": { 00:34:21.907 "type": "rebuild", 00:34:21.907 "target": "spare", 00:34:21.907 "progress": { 00:34:21.907 "blocks": 59392, 00:34:21.907 "percent": 90 00:34:21.907 } 00:34:21.907 
}, 00:34:21.907 "base_bdevs_list": [ 00:34:21.907 { 00:34:21.907 "name": "spare", 00:34:21.907 "uuid": "a5286060-7e40-5049-882f-adc4a0efc451", 00:34:21.907 "is_configured": true, 00:34:21.907 "data_offset": 0, 00:34:21.907 "data_size": 65536 00:34:21.907 }, 00:34:21.907 { 00:34:21.907 "name": "BaseBdev2", 00:34:21.907 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:21.907 "is_configured": true, 00:34:21.907 "data_offset": 0, 00:34:21.907 "data_size": 65536 00:34:21.907 } 00:34:21.907 ] 00:34:21.907 }' 00:34:21.907 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:21.907 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:21.907 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:22.193 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:22.193 12:36:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:34:22.193 [2024-06-07 12:36:45.743007] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:34:22.193 [2024-06-07 12:36:45.743330] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:34:22.193 [2024-06-07 12:36:45.743979] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:23.128 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:23.387 "name": "raid_bdev1", 00:34:23.387 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:23.387 "strip_size_kb": 0, 00:34:23.387 "state": "online", 00:34:23.387 "raid_level": "raid1", 00:34:23.387 "superblock": false, 00:34:23.387 "num_base_bdevs": 2, 00:34:23.387 "num_base_bdevs_discovered": 2, 00:34:23.387 "num_base_bdevs_operational": 2, 00:34:23.387 "base_bdevs_list": [ 00:34:23.387 { 00:34:23.387 "name": "spare", 00:34:23.387 "uuid": "a5286060-7e40-5049-882f-adc4a0efc451", 00:34:23.387 "is_configured": true, 00:34:23.387 "data_offset": 0, 00:34:23.387 "data_size": 65536 00:34:23.387 }, 00:34:23.387 { 00:34:23.387 "name": "BaseBdev2", 00:34:23.387 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:23.387 "is_configured": true, 00:34:23.387 "data_offset": 0, 00:34:23.387 "data_size": 65536 00:34:23.387 } 00:34:23.387 ] 00:34:23.387 }' 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:23.387 12:36:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:23.646 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:23.646 "name": "raid_bdev1", 00:34:23.646 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:23.646 "strip_size_kb": 0, 00:34:23.646 "state": "online", 00:34:23.646 "raid_level": "raid1", 00:34:23.646 "superblock": false, 00:34:23.646 "num_base_bdevs": 2, 00:34:23.646 "num_base_bdevs_discovered": 2, 00:34:23.646 "num_base_bdevs_operational": 2, 00:34:23.646 "base_bdevs_list": [ 00:34:23.646 { 00:34:23.646 "name": "spare", 00:34:23.646 "uuid": "a5286060-7e40-5049-882f-adc4a0efc451", 00:34:23.646 "is_configured": true, 00:34:23.646 "data_offset": 0, 00:34:23.646 "data_size": 65536 00:34:23.646 }, 00:34:23.646 { 00:34:23.646 "name": "BaseBdev2", 00:34:23.646 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:23.646 "is_configured": true, 00:34:23.646 "data_offset": 0, 00:34:23.646 "data_size": 65536 00:34:23.646 } 00:34:23.646 ] 00:34:23.646 }' 00:34:23.646 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:23.646 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:23.646 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:23.646 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:23.646 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:23.647 
12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:23.647 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:23.905 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:23.905 "name": "raid_bdev1", 00:34:23.905 "uuid": "75b8ee7a-2010-49b9-9452-e3d09b5c32eb", 00:34:23.905 "strip_size_kb": 0, 00:34:23.905 "state": "online", 00:34:23.905 "raid_level": "raid1", 00:34:23.905 "superblock": false, 00:34:23.905 "num_base_bdevs": 2, 00:34:23.905 "num_base_bdevs_discovered": 2, 00:34:23.905 "num_base_bdevs_operational": 2, 00:34:23.905 "base_bdevs_list": [ 00:34:23.905 { 00:34:23.905 "name": "spare", 00:34:23.905 "uuid": "a5286060-7e40-5049-882f-adc4a0efc451", 00:34:23.905 "is_configured": true, 00:34:23.905 "data_offset": 0, 00:34:23.905 "data_size": 65536 00:34:23.905 }, 00:34:23.905 { 00:34:23.905 "name": "BaseBdev2", 00:34:23.905 "uuid": "eb7afaaa-ddc1-5578-b5ec-bcf20af373c7", 00:34:23.905 "is_configured": true, 00:34:23.905 "data_offset": 0, 00:34:23.905 "data_size": 65536 00:34:23.905 } 00:34:23.905 ] 00:34:23.905 }' 00:34:23.905 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:23.905 12:36:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:34:24.471 12:36:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:34:24.730 [2024-06-07 12:36:48.180155] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:34:24.730 [2024-06-07 12:36:48.180391] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:24.730 [2024-06-07 12:36:48.180616] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:24.730 [2024-06-07 12:36:48.180764] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:24.730 [2024-06-07 12:36:48.180846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:34:24.730 12:36:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:24.730 12:36:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
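The teardown traced above (@718-@719) deletes the raid bdev and then asserts that bdev_raid_get_bdevs all returns an empty array, before re-exposing BaseBdev1 and spare over nbd for the byte-for-byte cmp comparison that follows. A compact sketch of that delete-and-verify step (same socket as this run; $rpc is shorthand introduced here for readability):

    rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc bdev_raid_delete raid_bdev1
    # after a clean delete, no raid bdevs of any state may remain
    [[ $($rpc bdev_raid_get_bdevs all | jq length) == 0 ]]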
00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:24.989 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:34:25.248 /dev/nbd0 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:25.248 1+0 records in 00:34:25.248 1+0 records out 00:34:25.248 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000483463 s, 8.5 MB/s 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:25.248 12:36:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:34:25.507 /dev/nbd1 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:25.507 1+0 records in 00:34:25.507 1+0 records out 00:34:25.507 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000575595 s, 7.1 MB/s 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:25.507 12:36:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:34:25.766 12:36:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:34:25.766 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:25.766 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:25.766 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:25.766 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:34:25.766 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:25.766 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:26.026 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 
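The waitfornbd helper traced around each nbd_start_disk call polls /proc/partitions up to 20 times and then probes the device with a single direct 4 KiB read. A rough reconstruction from the @867-@888 trace lines (the sleep between polls is an assumption, since this run succeeded on the first poll; the real helper lives in test/common/autotest_common.sh):

    waitfornbd() {
        local nbd_name=$1 i size
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1   # assumed back-off; not visible in this trace
        done
        # probe with one direct 4 KiB read and insist the device returned data
        dd if=/dev/"$nbd_name" of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest \
            bs=4096 count=1 iflag=direct
        size=$(stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest)
        rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest
        [ "$size" != 0 ]
    }

waitfornbd_exit, traced during nbd_stop_disks, is the mirror image: it loops until the nbd name drops out of /proc/partitions.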
00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 221234 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 221234 ']' 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 221234 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 221234 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 221234' 00:34:26.285 killing process with pid 221234 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 221234 00:34:26.285 Received shutdown signal, test time was about 60.000000 seconds 00:34:26.285 00:34:26.285 Latency(us) 00:34:26.285 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:26.285 =================================================================================================================== 00:34:26.285 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:34:26.285 12:36:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 221234 00:34:26.285 [2024-06-07 12:36:49.772553] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:34:26.285 [2024-06-07 12:36:49.829048] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:34:26.852 00:34:26.852 real 0m21.592s 00:34:26.852 user 0m29.987s 00:34:26.852 sys 0m5.050s 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:34:26.852 ************************************ 00:34:26.852 END TEST raid_rebuild_test 00:34:26.852 ************************************ 00:34:26.852 12:36:50 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:34:26.852 12:36:50 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:34:26.852 12:36:50 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:34:26.852 12:36:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:34:26.852 
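The raid_rebuild_test_sb variant that starts below re-runs the same rebuild flow with superblock=true, so create_arg picks up -s and the raid bdev is created with an on-disk superblock; each 32 MiB base bdev (65536 blocks of 512 B) then reports data_offset 2048 and data_size 63488 rather than 0 and 65536, which is why the dd write-verify shrinks to count=63488. The create call exactly as it appears later in this trace (bdev_raid.sh@611):

    /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1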
************************************ 00:34:26.852 START TEST raid_rebuild_test_sb 00:34:26.852 ************************************ 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:34:26.852 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=221750 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 221750 /var/tmp/spdk-raid.sock 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 221750 ']' 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 
00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:34:26.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:34:26.853 12:36:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:26.853 [2024-06-07 12:36:50.321655] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:34:26.853 [2024-06-07 12:36:50.322126] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid221750 ] 00:34:26.853 I/O size of 3145728 is greater than zero copy threshold (65536). 00:34:26.853 Zero copy mechanism will not be used. 00:34:26.853 [2024-06-07 12:36:50.464119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:27.111 [2024-06-07 12:36:50.582884] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:34:27.111 [2024-06-07 12:36:50.663473] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:27.744 12:36:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:34:27.744 12:36:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:34:27.744 12:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:34:27.744 12:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:34:28.002 BaseBdev1_malloc 00:34:28.002 12:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:34:28.261 [2024-06-07 12:36:51.802983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:34:28.261 [2024-06-07 12:36:51.803335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:28.261 [2024-06-07 12:36:51.803435] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:34:28.261 [2024-06-07 12:36:51.803708] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:28.261 [2024-06-07 12:36:51.806367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:28.261 [2024-06-07 12:36:51.806533] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:34:28.261 BaseBdev1 00:34:28.261 12:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:34:28.261 12:36:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:34:28.518 BaseBdev2_malloc 00:34:28.518 12:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:34:28.777 [2024-06-07 12:36:52.238191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 
00:34:28.777 [2024-06-07 12:36:52.238467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:28.777 [2024-06-07 12:36:52.238553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:34:28.777 [2024-06-07 12:36:52.238807] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:28.777 [2024-06-07 12:36:52.241105] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:28.777 [2024-06-07 12:36:52.241291] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:34:28.777 BaseBdev2 00:34:28.777 12:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:34:29.036 spare_malloc 00:34:29.036 12:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:34:29.295 spare_delay 00:34:29.295 12:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:34:29.560 [2024-06-07 12:36:52.950104] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:34:29.560 [2024-06-07 12:36:52.950425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:29.560 [2024-06-07 12:36:52.950502] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:34:29.560 [2024-06-07 12:36:52.950640] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:29.560 [2024-06-07 12:36:52.952946] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:29.560 [2024-06-07 12:36:52.953114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:34:29.560 spare 00:34:29.560 12:36:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:34:29.560 [2024-06-07 12:36:53.166226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:29.560 [2024-06-07 12:36:53.168327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:29.560 [2024-06-07 12:36:53.168619] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:34:29.560 [2024-06-07 12:36:53.168720] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:34:29.560 [2024-06-07 12:36:53.168887] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:34:29.560 [2024-06-07 12:36:53.169332] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:34:29.560 [2024-06-07 12:36:53.169430] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:34:29.560 [2024-06-07 12:36:53.169616] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:29.560 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:29.818 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:29.818 "name": "raid_bdev1", 00:34:29.818 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:29.818 "strip_size_kb": 0, 00:34:29.818 "state": "online", 00:34:29.818 "raid_level": "raid1", 00:34:29.818 "superblock": true, 00:34:29.818 "num_base_bdevs": 2, 00:34:29.818 "num_base_bdevs_discovered": 2, 00:34:29.818 "num_base_bdevs_operational": 2, 00:34:29.818 "base_bdevs_list": [ 00:34:29.818 { 00:34:29.818 "name": "BaseBdev1", 00:34:29.818 "uuid": "3c98693d-98d9-5b57-9399-50b7deb6a120", 00:34:29.818 "is_configured": true, 00:34:29.818 "data_offset": 2048, 00:34:29.818 "data_size": 63488 00:34:29.818 }, 00:34:29.818 { 00:34:29.818 "name": "BaseBdev2", 00:34:29.818 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:29.818 "is_configured": true, 00:34:29.818 "data_offset": 2048, 00:34:29.818 "data_size": 63488 00:34:29.818 } 00:34:29.818 ] 00:34:29.818 }' 00:34:29.818 12:36:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:29.818 12:36:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:30.810 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:34:30.810 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:34:30.810 [2024-06-07 12:36:54.254454] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:30.810 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:34:30.810 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:30.810 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@624 -- # local write_unit_size 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:34:31.088 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:34:31.346 [2024-06-07 12:36:54.774358] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:34:31.346 /dev/nbd0 00:34:31.346 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:31.346 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:31.346 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:34:31.346 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:34:31.346 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:34:31.346 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:31.347 1+0 records in 00:34:31.347 1+0 records out 00:34:31.347 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000393104 s, 10.4 MB/s 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:34:31.347 12:36:54 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:34:31.347 12:36:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:34:35.534 63488+0 records in 00:34:35.534 63488+0 records out 00:34:35.534 32505856 bytes (33 MB, 31 MiB) copied, 3.81833 s, 8.5 MB/s 00:34:35.534 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:34:35.534 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:35.534 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:35.534 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:34:35.535 [2024-06-07 12:36:58.960699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:34:35.535 12:36:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:34:35.794 [2024-06-07 12:36:59.204385] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:35.794 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:35.795 12:36:59 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:35.795 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:36.054 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:36.054 "name": "raid_bdev1", 00:34:36.054 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:36.054 "strip_size_kb": 0, 00:34:36.054 "state": "online", 00:34:36.054 "raid_level": "raid1", 00:34:36.054 "superblock": true, 00:34:36.054 "num_base_bdevs": 2, 00:34:36.054 "num_base_bdevs_discovered": 1, 00:34:36.054 "num_base_bdevs_operational": 1, 00:34:36.054 "base_bdevs_list": [ 00:34:36.054 { 00:34:36.054 "name": null, 00:34:36.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:36.054 "is_configured": false, 00:34:36.054 "data_offset": 2048, 00:34:36.054 "data_size": 63488 00:34:36.054 }, 00:34:36.054 { 00:34:36.054 "name": "BaseBdev2", 00:34:36.054 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:36.054 "is_configured": true, 00:34:36.054 "data_offset": 2048, 00:34:36.054 "data_size": 63488 00:34:36.054 } 00:34:36.054 ] 00:34:36.054 }' 00:34:36.054 12:36:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:36.054 12:36:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:36.618 12:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:36.884 [2024-06-07 12:37:00.440630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:36.884 [2024-06-07 12:37:00.448223] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000c3e280 00:34:36.884 [2024-06-07 12:37:00.450763] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:36.884 12:37:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:38.256 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:38.256 "name": "raid_bdev1", 00:34:38.257 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:38.257 "strip_size_kb": 0, 00:34:38.257 "state": "online", 00:34:38.257 "raid_level": "raid1", 00:34:38.257 "superblock": true, 00:34:38.257 "num_base_bdevs": 2, 00:34:38.257 "num_base_bdevs_discovered": 2, 00:34:38.257 "num_base_bdevs_operational": 2, 00:34:38.257 
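
The verify_raid_bdev_state helper whose locals are being set above boils down to one RPC call plus jq field checks. A minimal sketch, assuming the /var/tmp/spdk-raid.sock app from this run is still up; here it asserts the degraded state expected after BaseBdev1 is removed:

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    info=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r .state      <<< "$info") == online ]]
    [[ $(jq -r .raid_level <<< "$info") == raid1 ]]
    [[ $(jq -r .num_base_bdevs_discovered <<< "$info") -eq 1 ]]   # one member gone
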
"process": { 00:34:38.257 "type": "rebuild", 00:34:38.257 "target": "spare", 00:34:38.257 "progress": { 00:34:38.257 "blocks": 26624, 00:34:38.257 "percent": 41 00:34:38.257 } 00:34:38.257 }, 00:34:38.257 "base_bdevs_list": [ 00:34:38.257 { 00:34:38.257 "name": "spare", 00:34:38.257 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:38.257 "is_configured": true, 00:34:38.257 "data_offset": 2048, 00:34:38.257 "data_size": 63488 00:34:38.257 }, 00:34:38.257 { 00:34:38.257 "name": "BaseBdev2", 00:34:38.257 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:38.257 "is_configured": true, 00:34:38.257 "data_offset": 2048, 00:34:38.257 "data_size": 63488 00:34:38.257 } 00:34:38.257 ] 00:34:38.257 }' 00:34:38.257 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:38.257 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:38.257 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:38.515 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:38.515 12:37:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:34:38.772 [2024-06-07 12:37:02.173727] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:38.772 [2024-06-07 12:37:02.265538] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:34:38.772 [2024-06-07 12:37:02.265957] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:38.772 [2024-06-07 12:37:02.266111] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:38.772 [2024-06-07 12:37:02.266167] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:38.772 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:38.773 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:38.773 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:38.773 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:38.773 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:39.030 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:39.030 "name": "raid_bdev1", 00:34:39.030 "uuid": 
"32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:39.030 "strip_size_kb": 0, 00:34:39.030 "state": "online", 00:34:39.030 "raid_level": "raid1", 00:34:39.030 "superblock": true, 00:34:39.030 "num_base_bdevs": 2, 00:34:39.030 "num_base_bdevs_discovered": 1, 00:34:39.030 "num_base_bdevs_operational": 1, 00:34:39.030 "base_bdevs_list": [ 00:34:39.030 { 00:34:39.030 "name": null, 00:34:39.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:39.030 "is_configured": false, 00:34:39.030 "data_offset": 2048, 00:34:39.030 "data_size": 63488 00:34:39.030 }, 00:34:39.030 { 00:34:39.030 "name": "BaseBdev2", 00:34:39.030 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:39.030 "is_configured": true, 00:34:39.030 "data_offset": 2048, 00:34:39.030 "data_size": 63488 00:34:39.030 } 00:34:39.030 ] 00:34:39.030 }' 00:34:39.030 12:37:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:39.030 12:37:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:39.596 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:39.596 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:39.596 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:39.596 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:39.596 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:39.596 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:39.596 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:39.854 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:39.854 "name": "raid_bdev1", 00:34:39.854 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:39.854 "strip_size_kb": 0, 00:34:39.854 "state": "online", 00:34:39.854 "raid_level": "raid1", 00:34:39.854 "superblock": true, 00:34:39.854 "num_base_bdevs": 2, 00:34:39.854 "num_base_bdevs_discovered": 1, 00:34:39.854 "num_base_bdevs_operational": 1, 00:34:39.854 "base_bdevs_list": [ 00:34:39.854 { 00:34:39.854 "name": null, 00:34:39.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:39.854 "is_configured": false, 00:34:39.854 "data_offset": 2048, 00:34:39.854 "data_size": 63488 00:34:39.854 }, 00:34:39.854 { 00:34:39.854 "name": "BaseBdev2", 00:34:39.854 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:39.854 "is_configured": true, 00:34:39.854 "data_offset": 2048, 00:34:39.854 "data_size": 63488 00:34:39.854 } 00:34:39.854 ] 00:34:39.854 }' 00:34:39.854 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:39.854 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:39.854 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:40.112 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:40.112 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:40.370 [2024-06-07 12:37:03.807070] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:40.370 [2024-06-07 12:37:03.815297] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000c3e420 00:34:40.370 [2024-06-07 12:37:03.817754] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:40.370 12:37:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:34:41.303 12:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:41.303 12:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:41.303 12:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:41.303 12:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:41.304 12:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:41.304 12:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:41.304 12:37:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:41.561 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:41.561 "name": "raid_bdev1", 00:34:41.561 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:41.561 "strip_size_kb": 0, 00:34:41.561 "state": "online", 00:34:41.561 "raid_level": "raid1", 00:34:41.561 "superblock": true, 00:34:41.561 "num_base_bdevs": 2, 00:34:41.561 "num_base_bdevs_discovered": 2, 00:34:41.562 "num_base_bdevs_operational": 2, 00:34:41.562 "process": { 00:34:41.562 "type": "rebuild", 00:34:41.562 "target": "spare", 00:34:41.562 "progress": { 00:34:41.562 "blocks": 26624, 00:34:41.562 "percent": 41 00:34:41.562 } 00:34:41.562 }, 00:34:41.562 "base_bdevs_list": [ 00:34:41.562 { 00:34:41.562 "name": "spare", 00:34:41.562 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:41.562 "is_configured": true, 00:34:41.562 "data_offset": 2048, 00:34:41.562 "data_size": 63488 00:34:41.562 }, 00:34:41.562 { 00:34:41.562 "name": "BaseBdev2", 00:34:41.562 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:41.562 "is_configured": true, 00:34:41.562 "data_offset": 2048, 00:34:41.562 "data_size": 63488 00:34:41.562 } 00:34:41.562 ] 00:34:41.562 }' 00:34:41.562 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:34:41.819 /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 
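
The "line 665: [: =: unary operator expected" failure captured above is the classic single-bracket pitfall: a variable on the left of the comparison expanded to nothing (its name is not visible in this trace), so test(1) saw '[ = false ]' and had no left operand. The script survives because the failed test simply selects the fallback branch. A self-contained reproduction and the two usual fixes:

    flag=""
    [ $flag = false ]     # reproduces: "[: =: unary operator expected"
    [ "$flag" = false ]   # fix 1: quote, giving a valid (false) string comparison
    [[ $flag = false ]]   # fix 2: [[ ]] never word-splits its operands
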
2 -gt 2 ']' 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=832 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:41.819 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:41.820 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:41.820 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:42.077 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:42.077 "name": "raid_bdev1", 00:34:42.077 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:42.077 "strip_size_kb": 0, 00:34:42.077 "state": "online", 00:34:42.077 "raid_level": "raid1", 00:34:42.077 "superblock": true, 00:34:42.077 "num_base_bdevs": 2, 00:34:42.077 "num_base_bdevs_discovered": 2, 00:34:42.077 "num_base_bdevs_operational": 2, 00:34:42.077 "process": { 00:34:42.077 "type": "rebuild", 00:34:42.077 "target": "spare", 00:34:42.077 "progress": { 00:34:42.077 "blocks": 34816, 00:34:42.077 "percent": 54 00:34:42.077 } 00:34:42.077 }, 00:34:42.077 "base_bdevs_list": [ 00:34:42.077 { 00:34:42.077 "name": "spare", 00:34:42.077 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:42.077 "is_configured": true, 00:34:42.077 "data_offset": 2048, 00:34:42.077 "data_size": 63488 00:34:42.077 }, 00:34:42.077 { 00:34:42.077 "name": "BaseBdev2", 00:34:42.077 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:42.077 "is_configured": true, 00:34:42.077 "data_offset": 2048, 00:34:42.077 "data_size": 63488 00:34:42.077 } 00:34:42.077 ] 00:34:42.077 }' 00:34:42.077 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:42.077 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:42.077 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:42.077 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:42.077 12:37:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- 
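
The wait loop entered above leans on bash's builtin SECONDS counter: it ticks once per wall-clock second from shell start, so '(( SECONDS < timeout ))' gives a deadline without spawning date(1). A sketch of the loop's shape; timeout=832 and the 1 s poll mirror the trace, while the break condition is inferred from the later '[[ none == rebuild ]]' check:

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    timeout=832
    while (( SECONDS < timeout )); do
        type=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
               jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')
        [[ $type == rebuild ]] || break   # .process gone => rebuild finished
        sleep 1
    done
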
bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:43.019 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:43.584 [2024-06-07 12:37:06.942367] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:34:43.584 [2024-06-07 12:37:06.942463] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:34:43.584 [2024-06-07 12:37:06.942604] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:43.584 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:43.584 "name": "raid_bdev1", 00:34:43.584 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:43.584 "strip_size_kb": 0, 00:34:43.584 "state": "online", 00:34:43.584 "raid_level": "raid1", 00:34:43.584 "superblock": true, 00:34:43.584 "num_base_bdevs": 2, 00:34:43.584 "num_base_bdevs_discovered": 2, 00:34:43.584 "num_base_bdevs_operational": 2, 00:34:43.584 "process": { 00:34:43.584 "type": "rebuild", 00:34:43.584 "target": "spare", 00:34:43.584 "progress": { 00:34:43.584 "blocks": 61440, 00:34:43.584 "percent": 96 00:34:43.584 } 00:34:43.584 }, 00:34:43.584 "base_bdevs_list": [ 00:34:43.584 { 00:34:43.584 "name": "spare", 00:34:43.584 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:43.584 "is_configured": true, 00:34:43.584 "data_offset": 2048, 00:34:43.584 "data_size": 63488 00:34:43.584 }, 00:34:43.584 { 00:34:43.584 "name": "BaseBdev2", 00:34:43.584 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:43.584 "is_configured": true, 00:34:43.584 "data_offset": 2048, 00:34:43.584 "data_size": 63488 00:34:43.584 } 00:34:43.584 ] 00:34:43.584 }' 00:34:43.584 12:37:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:43.584 12:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:43.584 12:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:43.584 12:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:43.584 12:37:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:44.517 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:44.775 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:44.775 "name": "raid_bdev1", 00:34:44.775 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:44.775 
"strip_size_kb": 0, 00:34:44.775 "state": "online", 00:34:44.775 "raid_level": "raid1", 00:34:44.775 "superblock": true, 00:34:44.775 "num_base_bdevs": 2, 00:34:44.775 "num_base_bdevs_discovered": 2, 00:34:44.775 "num_base_bdevs_operational": 2, 00:34:44.775 "base_bdevs_list": [ 00:34:44.775 { 00:34:44.775 "name": "spare", 00:34:44.776 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:44.776 "is_configured": true, 00:34:44.776 "data_offset": 2048, 00:34:44.776 "data_size": 63488 00:34:44.776 }, 00:34:44.776 { 00:34:44.776 "name": "BaseBdev2", 00:34:44.776 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:44.776 "is_configured": true, 00:34:44.776 "data_offset": 2048, 00:34:44.776 "data_size": 63488 00:34:44.776 } 00:34:44.776 ] 00:34:44.776 }' 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:44.776 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:45.034 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:45.034 "name": "raid_bdev1", 00:34:45.034 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:45.034 "strip_size_kb": 0, 00:34:45.034 "state": "online", 00:34:45.034 "raid_level": "raid1", 00:34:45.034 "superblock": true, 00:34:45.034 "num_base_bdevs": 2, 00:34:45.034 "num_base_bdevs_discovered": 2, 00:34:45.034 "num_base_bdevs_operational": 2, 00:34:45.034 "base_bdevs_list": [ 00:34:45.034 { 00:34:45.034 "name": "spare", 00:34:45.034 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:45.034 "is_configured": true, 00:34:45.034 "data_offset": 2048, 00:34:45.034 "data_size": 63488 00:34:45.034 }, 00:34:45.034 { 00:34:45.034 "name": "BaseBdev2", 00:34:45.034 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:45.034 "is_configured": true, 00:34:45.034 "data_offset": 2048, 00:34:45.034 "data_size": 63488 00:34:45.034 } 00:34:45.034 ] 00:34:45.034 }' 00:34:45.034 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:45.034 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:45.034 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ none == \n\o\n\e ]] 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:45.293 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:45.551 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:45.551 "name": "raid_bdev1", 00:34:45.551 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:45.551 "strip_size_kb": 0, 00:34:45.551 "state": "online", 00:34:45.551 "raid_level": "raid1", 00:34:45.551 "superblock": true, 00:34:45.551 "num_base_bdevs": 2, 00:34:45.551 "num_base_bdevs_discovered": 2, 00:34:45.551 "num_base_bdevs_operational": 2, 00:34:45.551 "base_bdevs_list": [ 00:34:45.551 { 00:34:45.551 "name": "spare", 00:34:45.551 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:45.551 "is_configured": true, 00:34:45.551 "data_offset": 2048, 00:34:45.551 "data_size": 63488 00:34:45.551 }, 00:34:45.551 { 00:34:45.551 "name": "BaseBdev2", 00:34:45.551 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:45.551 "is_configured": true, 00:34:45.551 "data_offset": 2048, 00:34:45.551 "data_size": 63488 00:34:45.551 } 00:34:45.551 ] 00:34:45.551 }' 00:34:45.551 12:37:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:45.551 12:37:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:46.117 12:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:34:46.374 [2024-06-07 12:37:09.811123] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:34:46.374 [2024-06-07 12:37:09.811405] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:46.374 [2024-06-07 12:37:09.811662] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:46.374 [2024-06-07 12:37:09.811829] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:46.374 [2024-06-07 12:37:09.811926] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:34:46.374 12:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
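
Teardown is verified the same way the array was built: after bdev_raid_delete, the suite asserts that bdev_raid_get_bdevs returns an empty list. A minimal sketch of the @718/@719 pair, same socket as above:

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    $rpc -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
    n=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq length)
    [[ $n == 0 ]]   # no raid bdevs may survive the delete
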
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:46.374 12:37:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:46.632 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:34:46.889 /dev/nbd0 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:46.889 1+0 records in 00:34:46.889 1+0 records out 00:34:46.889 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000533958 s, 7.7 MB/s 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 
4096 '!=' 0 ']' 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:46.889 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:34:47.146 /dev/nbd1 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:47.146 1+0 records in 00:34:47.146 1+0 records out 00:34:47.146 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000445123 s, 9.2 MB/s 00:34:47.146 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:47.404 12:37:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # 
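
The comparison that follows exports the two former members (BaseBdev1 over /dev/nbd0, spare over /dev/nbd1) and diffs them with 'cmp -i 1048576'. The 1 MiB skip is the superblock region: each base bdev reports data_offset 2048 in 512-byte blocks, and 2048 * 512 = 1,048,576 bytes that legitimately differ between members, so cmp compares only the mirrored data that the rebuild must have made identical:

    echo $((2048 * 512))                 # 1048576, the "data_offset" in bytes
    cmp -i 1048576 /dev/nbd0 /dev/nbd1   # exit 0 iff the data regions match
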
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:47.662 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:34:47.920 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:34:47.921 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:34:48.178 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:34:48.436 [2024-06-07 12:37:11.882626] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:34:48.436 [2024-06-07 12:37:11.882957] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:48.436 [2024-06-07 12:37:11.883052] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:34:48.436 [2024-06-07 12:37:11.883154] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:48.436 [2024-06-07 12:37:11.885641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:48.436 [2024-06-07 12:37:11.885842] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:34:48.436 [2024-06-07 12:37:11.886059] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:34:48.436 [2024-06-07 12:37:11.886205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:48.436 [2024-06-07 12:37:11.886503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 
is claimed 00:34:48.436 spare 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:48.436 12:37:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:48.436 [2024-06-07 12:37:11.986683] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009380 00:34:48.436 [2024-06-07 12:37:11.986905] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:34:48.436 [2024-06-07 12:37:11.987109] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000caeca0 00:34:48.436 [2024-06-07 12:37:11.987643] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009380 00:34:48.436 [2024-06-07 12:37:11.987748] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009380 00:34:48.436 [2024-06-07 12:37:11.987924] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:48.693 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:48.693 "name": "raid_bdev1", 00:34:48.693 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:48.693 "strip_size_kb": 0, 00:34:48.693 "state": "online", 00:34:48.693 "raid_level": "raid1", 00:34:48.694 "superblock": true, 00:34:48.694 "num_base_bdevs": 2, 00:34:48.694 "num_base_bdevs_discovered": 2, 00:34:48.694 "num_base_bdevs_operational": 2, 00:34:48.694 "base_bdevs_list": [ 00:34:48.694 { 00:34:48.694 "name": "spare", 00:34:48.694 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:48.694 "is_configured": true, 00:34:48.694 "data_offset": 2048, 00:34:48.694 "data_size": 63488 00:34:48.694 }, 00:34:48.694 { 00:34:48.694 "name": "BaseBdev2", 00:34:48.694 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:48.694 "is_configured": true, 00:34:48.694 "data_offset": 2048, 00:34:48.694 "data_size": 63488 00:34:48.694 } 00:34:48.694 ] 00:34:48.694 }' 00:34:48.694 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:48.694 12:37:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:49.259 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:49.259 12:37:12 
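
Note what just happened: the spare was brought back only by deleting and recreating its passthru wrapper, with no explicit bdev_raid_add_base_bdev. SPDK's examine path spots the raid superblock on the new bdev and reassembles raid_bdev1 from spare and BaseBdev2 on its own, which is why the very next state check already sees two discovered members. A sketch of the trigger, using the names from this trace:

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    $rpc -s $sock bdev_passthru_delete spare
    $rpc -s $sock bdev_passthru_create -b spare_delay -p spare
    # examine then logs "raid superblock found on bdev spare" and the
    # array comes back online with num_base_bdevs_discovered == 2
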
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:49.260 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:49.260 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:49.260 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:49.260 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:49.260 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:49.518 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:49.518 "name": "raid_bdev1", 00:34:49.518 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:49.518 "strip_size_kb": 0, 00:34:49.518 "state": "online", 00:34:49.518 "raid_level": "raid1", 00:34:49.518 "superblock": true, 00:34:49.518 "num_base_bdevs": 2, 00:34:49.518 "num_base_bdevs_discovered": 2, 00:34:49.518 "num_base_bdevs_operational": 2, 00:34:49.518 "base_bdevs_list": [ 00:34:49.518 { 00:34:49.518 "name": "spare", 00:34:49.518 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:49.518 "is_configured": true, 00:34:49.518 "data_offset": 2048, 00:34:49.518 "data_size": 63488 00:34:49.518 }, 00:34:49.518 { 00:34:49.518 "name": "BaseBdev2", 00:34:49.518 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:49.518 "is_configured": true, 00:34:49.518 "data_offset": 2048, 00:34:49.518 "data_size": 63488 00:34:49.518 } 00:34:49.518 ] 00:34:49.518 }' 00:34:49.518 12:37:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:49.518 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:49.518 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:49.518 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:49.518 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:49.518 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:34:49.776 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:34:49.776 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:34:50.034 [2024-06-07 12:37:13.539057] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:50.034 12:37:13 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:50.034 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:50.292 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:50.292 "name": "raid_bdev1", 00:34:50.292 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:50.292 "strip_size_kb": 0, 00:34:50.292 "state": "online", 00:34:50.292 "raid_level": "raid1", 00:34:50.292 "superblock": true, 00:34:50.292 "num_base_bdevs": 2, 00:34:50.292 "num_base_bdevs_discovered": 1, 00:34:50.292 "num_base_bdevs_operational": 1, 00:34:50.292 "base_bdevs_list": [ 00:34:50.292 { 00:34:50.292 "name": null, 00:34:50.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:50.292 "is_configured": false, 00:34:50.292 "data_offset": 2048, 00:34:50.292 "data_size": 63488 00:34:50.292 }, 00:34:50.292 { 00:34:50.292 "name": "BaseBdev2", 00:34:50.292 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:50.292 "is_configured": true, 00:34:50.292 "data_offset": 2048, 00:34:50.292 "data_size": 63488 00:34:50.292 } 00:34:50.292 ] 00:34:50.292 }' 00:34:50.292 12:37:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:50.292 12:37:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:50.858 12:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:51.116 [2024-06-07 12:37:14.711214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:51.116 [2024-06-07 12:37:14.711699] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:34:51.116 [2024-06-07 12:37:14.711827] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
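
The seq_number comparison logged above is the decision point for the re-add: the spare's on-disk superblock is stamped with sequence 4 while the live array is at 5, so the bdev is recognized as a former, now-stale member and scheduled for a rebuild rather than trusted as current. A hypothetical probe for that transition (busy-wait, no timeout, purely illustrative):

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    until [[ $($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
               jq -r '.[] | select(.name == "raid_bdev1") | .process.target // "none"') == spare ]]; do
        sleep 0.1   # a rebuild onto the stale spare should start momentarily
    done
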
00:34:51.116 [2024-06-07 12:37:14.711944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:51.116 [2024-06-07 12:37:14.719425] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000caee40 00:34:51.116 [2024-06-07 12:37:14.721700] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:51.116 12:37:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:34:52.490 12:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:52.490 12:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:52.490 12:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:52.490 12:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:52.490 12:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:52.490 12:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:52.490 12:37:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:52.490 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:52.490 "name": "raid_bdev1", 00:34:52.490 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:52.490 "strip_size_kb": 0, 00:34:52.490 "state": "online", 00:34:52.490 "raid_level": "raid1", 00:34:52.490 "superblock": true, 00:34:52.490 "num_base_bdevs": 2, 00:34:52.490 "num_base_bdevs_discovered": 2, 00:34:52.490 "num_base_bdevs_operational": 2, 00:34:52.490 "process": { 00:34:52.490 "type": "rebuild", 00:34:52.490 "target": "spare", 00:34:52.490 "progress": { 00:34:52.490 "blocks": 24576, 00:34:52.490 "percent": 38 00:34:52.490 } 00:34:52.490 }, 00:34:52.490 "base_bdevs_list": [ 00:34:52.490 { 00:34:52.490 "name": "spare", 00:34:52.490 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:52.490 "is_configured": true, 00:34:52.490 "data_offset": 2048, 00:34:52.490 "data_size": 63488 00:34:52.490 }, 00:34:52.490 { 00:34:52.490 "name": "BaseBdev2", 00:34:52.490 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:52.490 "is_configured": true, 00:34:52.490 "data_offset": 2048, 00:34:52.490 "data_size": 63488 00:34:52.490 } 00:34:52.490 ] 00:34:52.490 }' 00:34:52.490 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:52.490 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:52.490 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:52.748 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:52.748 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:34:53.007 [2024-06-07 12:37:16.427576] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:53.007 [2024-06-07 12:37:16.434296] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:34:53.007 [2024-06-07 12:37:16.434626] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:53.007 
[2024-06-07 12:37:16.434799] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:53.007 [2024-06-07 12:37:16.434858] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:53.007 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:53.265 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:53.265 "name": "raid_bdev1", 00:34:53.265 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:53.265 "strip_size_kb": 0, 00:34:53.265 "state": "online", 00:34:53.265 "raid_level": "raid1", 00:34:53.265 "superblock": true, 00:34:53.265 "num_base_bdevs": 2, 00:34:53.265 "num_base_bdevs_discovered": 1, 00:34:53.265 "num_base_bdevs_operational": 1, 00:34:53.265 "base_bdevs_list": [ 00:34:53.265 { 00:34:53.265 "name": null, 00:34:53.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:53.265 "is_configured": false, 00:34:53.265 "data_offset": 2048, 00:34:53.265 "data_size": 63488 00:34:53.265 }, 00:34:53.265 { 00:34:53.265 "name": "BaseBdev2", 00:34:53.265 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:53.265 "is_configured": true, 00:34:53.265 "data_offset": 2048, 00:34:53.265 "data_size": 63488 00:34:53.265 } 00:34:53.265 ] 00:34:53.265 }' 00:34:53.265 12:37:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:53.265 12:37:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:54.200 12:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:34:54.200 [2024-06-07 12:37:17.807714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:34:54.200 [2024-06-07 12:37:17.808105] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:54.200 [2024-06-07 12:37:17.808188] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009980 00:34:54.200 [2024-06-07 12:37:17.808334] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:54.200 [2024-06-07 12:37:17.808778] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:54.200 [2024-06-07 12:37:17.808983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:34:54.200 [2024-06-07 12:37:17.809200] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:34:54.200 [2024-06-07 12:37:17.809328] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:34:54.200 [2024-06-07 12:37:17.809411] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:34:54.200 [2024-06-07 12:37:17.809513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:54.200 [2024-06-07 12:37:17.817417] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000caf180 00:34:54.200 spare 00:34:54.200 [2024-06-07 12:37:17.819715] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:54.200 12:37:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:34:55.624 12:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:55.624 12:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:55.624 12:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:55.624 12:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:55.624 12:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:55.624 12:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:55.624 12:37:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:55.624 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:55.624 "name": "raid_bdev1", 00:34:55.624 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:55.624 "strip_size_kb": 0, 00:34:55.624 "state": "online", 00:34:55.624 "raid_level": "raid1", 00:34:55.624 "superblock": true, 00:34:55.624 "num_base_bdevs": 2, 00:34:55.624 "num_base_bdevs_discovered": 2, 00:34:55.624 "num_base_bdevs_operational": 2, 00:34:55.624 "process": { 00:34:55.624 "type": "rebuild", 00:34:55.624 "target": "spare", 00:34:55.624 "progress": { 00:34:55.624 "blocks": 24576, 00:34:55.624 "percent": 38 00:34:55.624 } 00:34:55.624 }, 00:34:55.624 "base_bdevs_list": [ 00:34:55.624 { 00:34:55.624 "name": "spare", 00:34:55.624 "uuid": "01b91b5d-4027-51e8-868c-19e3c74bcda3", 00:34:55.624 "is_configured": true, 00:34:55.624 "data_offset": 2048, 00:34:55.624 "data_size": 63488 00:34:55.624 }, 00:34:55.624 { 00:34:55.624 "name": "BaseBdev2", 00:34:55.624 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:55.624 "is_configured": true, 00:34:55.624 "data_offset": 2048, 00:34:55.624 "data_size": 63488 00:34:55.624 } 00:34:55.624 ] 00:34:55.624 }' 00:34:55.624 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:55.624 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:55.624 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:55.624 
12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:55.624 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:34:55.882 [2024-06-07 12:37:19.466002] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:56.140 [2024-06-07 12:37:19.531863] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:34:56.140 [2024-06-07 12:37:19.532192] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:56.140 [2024-06-07 12:37:19.532273] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:56.140 [2024-06-07 12:37:19.532374] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:56.140 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:56.397 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:56.397 "name": "raid_bdev1", 00:34:56.397 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:56.397 "strip_size_kb": 0, 00:34:56.397 "state": "online", 00:34:56.397 "raid_level": "raid1", 00:34:56.397 "superblock": true, 00:34:56.397 "num_base_bdevs": 2, 00:34:56.397 "num_base_bdevs_discovered": 1, 00:34:56.397 "num_base_bdevs_operational": 1, 00:34:56.397 "base_bdevs_list": [ 00:34:56.397 { 00:34:56.397 "name": null, 00:34:56.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:56.398 "is_configured": false, 00:34:56.398 "data_offset": 2048, 00:34:56.398 "data_size": 63488 00:34:56.398 }, 00:34:56.398 { 00:34:56.398 "name": "BaseBdev2", 00:34:56.398 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:56.398 "is_configured": true, 00:34:56.398 "data_offset": 2048, 00:34:56.398 "data_size": 63488 00:34:56.398 } 00:34:56.398 ] 00:34:56.398 }' 00:34:56.398 12:37:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:56.398 12:37:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:34:56.964 12:37:20 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:56.964 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:56.964 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:56.964 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:56.964 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:56.964 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:56.964 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:57.221 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:57.221 "name": "raid_bdev1", 00:34:57.221 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:57.221 "strip_size_kb": 0, 00:34:57.221 "state": "online", 00:34:57.221 "raid_level": "raid1", 00:34:57.221 "superblock": true, 00:34:57.221 "num_base_bdevs": 2, 00:34:57.221 "num_base_bdevs_discovered": 1, 00:34:57.221 "num_base_bdevs_operational": 1, 00:34:57.221 "base_bdevs_list": [ 00:34:57.221 { 00:34:57.221 "name": null, 00:34:57.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:57.221 "is_configured": false, 00:34:57.221 "data_offset": 2048, 00:34:57.221 "data_size": 63488 00:34:57.221 }, 00:34:57.221 { 00:34:57.221 "name": "BaseBdev2", 00:34:57.221 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:57.221 "is_configured": true, 00:34:57.221 "data_offset": 2048, 00:34:57.221 "data_size": 63488 00:34:57.221 } 00:34:57.221 ] 00:34:57.221 }' 00:34:57.221 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:57.221 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:57.221 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:57.478 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:57.478 12:37:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:34:57.735 12:37:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:34:57.993 [2024-06-07 12:37:21.548887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:34:57.993 [2024-06-07 12:37:21.549354] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:57.993 [2024-06-07 12:37:21.549568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009f80 00:34:57.993 [2024-06-07 12:37:21.549702] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:57.993 [2024-06-07 12:37:21.550253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:57.993 [2024-06-07 12:37:21.550432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:34:57.993 [2024-06-07 12:37:21.550645] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:34:57.993 [2024-06-07 12:37:21.550757] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:34:57.993 [2024-06-07 12:37:21.550857] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:34:57.993 BaseBdev1 00:34:57.993 12:37:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:58.949 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:59.206 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:59.464 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:59.464 "name": "raid_bdev1", 00:34:59.464 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:34:59.464 "strip_size_kb": 0, 00:34:59.464 "state": "online", 00:34:59.464 "raid_level": "raid1", 00:34:59.464 "superblock": true, 00:34:59.464 "num_base_bdevs": 2, 00:34:59.464 "num_base_bdevs_discovered": 1, 00:34:59.464 "num_base_bdevs_operational": 1, 00:34:59.464 "base_bdevs_list": [ 00:34:59.464 { 00:34:59.464 "name": null, 00:34:59.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:59.464 "is_configured": false, 00:34:59.464 "data_offset": 2048, 00:34:59.464 "data_size": 63488 00:34:59.464 }, 00:34:59.464 { 00:34:59.464 "name": "BaseBdev2", 00:34:59.464 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:34:59.464 "is_configured": true, 00:34:59.464 "data_offset": 2048, 00:34:59.464 "data_size": 63488 00:34:59.464 } 00:34:59.464 ] 00:34:59.464 }' 00:34:59.464 12:37:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:59.464 12:37:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:00.029 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:00.029 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:00.029 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:00.029 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:00.029 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:00.287 12:37:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:00.287 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:00.545 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:00.545 "name": "raid_bdev1", 00:35:00.545 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:35:00.545 "strip_size_kb": 0, 00:35:00.545 "state": "online", 00:35:00.545 "raid_level": "raid1", 00:35:00.545 "superblock": true, 00:35:00.545 "num_base_bdevs": 2, 00:35:00.545 "num_base_bdevs_discovered": 1, 00:35:00.545 "num_base_bdevs_operational": 1, 00:35:00.545 "base_bdevs_list": [ 00:35:00.545 { 00:35:00.545 "name": null, 00:35:00.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:00.545 "is_configured": false, 00:35:00.545 "data_offset": 2048, 00:35:00.545 "data_size": 63488 00:35:00.545 }, 00:35:00.545 { 00:35:00.545 "name": "BaseBdev2", 00:35:00.545 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:35:00.545 "is_configured": true, 00:35:00.545 "data_offset": 2048, 00:35:00.545 "data_size": 63488 00:35:00.545 } 00:35:00.545 ] 00:35:00.545 }' 00:35:00.545 12:37:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:00.545 12:37:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:00.545 12:37:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:00.545 12:37:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:00.545 12:37:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:35:00.545 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:35:00.545 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:35:00.545 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:35:00.546 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:35:00.804 [2024-06-07 12:37:24.377326] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:00.804 [2024-06-07 12:37:24.377815] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:35:00.804 [2024-06-07 12:37:24.377941] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:35:00.804 request: 00:35:00.804 { 00:35:00.804 "raid_bdev": "raid_bdev1", 00:35:00.804 "base_bdev": "BaseBdev1", 00:35:00.804 "method": "bdev_raid_add_base_bdev", 00:35:00.804 "req_id": 1 00:35:00.804 } 00:35:00.804 Got JSON-RPC error response 00:35:00.804 response: 00:35:00.804 { 00:35:00.804 "code": -22, 00:35:00.804 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:35:00.804 } 00:35:00.804 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:35:00.804 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:35:00.804 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:35:00.804 12:37:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:35:00.804 12:37:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:02.219 "name": "raid_bdev1", 00:35:02.219 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:35:02.219 "strip_size_kb": 0, 00:35:02.219 "state": "online", 00:35:02.219 "raid_level": "raid1", 00:35:02.219 "superblock": true, 00:35:02.219 "num_base_bdevs": 2, 00:35:02.219 "num_base_bdevs_discovered": 1, 00:35:02.219 "num_base_bdevs_operational": 1, 00:35:02.219 "base_bdevs_list": [ 00:35:02.219 { 00:35:02.219 "name": null, 00:35:02.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:02.219 "is_configured": false, 00:35:02.219 "data_offset": 2048, 00:35:02.219 "data_size": 63488 00:35:02.219 }, 00:35:02.219 { 00:35:02.219 "name": "BaseBdev2", 00:35:02.219 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 
00:35:02.219 "is_configured": true, 00:35:02.219 "data_offset": 2048, 00:35:02.219 "data_size": 63488 00:35:02.219 } 00:35:02.219 ] 00:35:02.219 }' 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:02.219 12:37:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:02.784 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:02.784 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:02.784 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:02.784 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:02.784 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:02.784 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:02.784 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:03.042 "name": "raid_bdev1", 00:35:03.042 "uuid": "32e1bc9b-6e16-4078-afc4-ad38361d134e", 00:35:03.042 "strip_size_kb": 0, 00:35:03.042 "state": "online", 00:35:03.042 "raid_level": "raid1", 00:35:03.042 "superblock": true, 00:35:03.042 "num_base_bdevs": 2, 00:35:03.042 "num_base_bdevs_discovered": 1, 00:35:03.042 "num_base_bdevs_operational": 1, 00:35:03.042 "base_bdevs_list": [ 00:35:03.042 { 00:35:03.042 "name": null, 00:35:03.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:03.042 "is_configured": false, 00:35:03.042 "data_offset": 2048, 00:35:03.042 "data_size": 63488 00:35:03.042 }, 00:35:03.042 { 00:35:03.042 "name": "BaseBdev2", 00:35:03.042 "uuid": "f9baac08-91ce-5b87-84d7-4c86cc5caabc", 00:35:03.042 "is_configured": true, 00:35:03.042 "data_offset": 2048, 00:35:03.042 "data_size": 63488 00:35:03.042 } 00:35:03.042 ] 00:35:03.042 }' 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 221750 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 221750 ']' 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 221750 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:03.042 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 221750 00:35:03.299 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:03.299 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:03.299 12:37:26 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 221750' 00:35:03.299 killing process with pid 221750 00:35:03.299 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 221750 00:35:03.299 Received shutdown signal, test time was about 60.000000 seconds 00:35:03.299 00:35:03.299 Latency(us) 00:35:03.299 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:03.299 =================================================================================================================== 00:35:03.299 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:35:03.299 12:37:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 221750 00:35:03.299 [2024-06-07 12:37:26.700150] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:03.299 [2024-06-07 12:37:26.700307] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:03.299 [2024-06-07 12:37:26.700356] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:03.300 [2024-06-07 12:37:26.700367] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009380 name raid_bdev1, state offline 00:35:03.300 [2024-06-07 12:37:26.760641] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:35:03.557 12:37:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:35:03.557 00:35:03.557 real 0m36.878s 00:35:03.557 user 0m55.107s 00:35:03.557 sys 0m6.808s 00:35:03.557 12:37:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:35:03.557 12:37:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:35:03.557 ************************************ 00:35:03.557 END TEST raid_rebuild_test_sb 00:35:03.557 ************************************ 00:35:03.557 12:37:27 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:35:03.557 12:37:27 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:35:03.557 12:37:27 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:35:03.557 12:37:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:03.814 ************************************ 00:35:03.814 START TEST raid_rebuild_test_io 00:35:03.814 ************************************ 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false true true 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 
-- # (( i <= num_base_bdevs )) 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=222667 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 222667 /var/tmp/spdk-raid.sock 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 222667 ']' 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:03.814 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:03.814 12:37:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:35:03.814 [2024-06-07 12:37:27.255019] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:35:03.814 [2024-06-07 12:37:27.255518] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid222667 ] 00:35:03.814 I/O size of 3145728 is greater than zero copy threshold (65536). 00:35:03.814 Zero copy mechanism will not be used. 
00:35:03.814 [2024-06-07 12:37:27.398318] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:04.072 [2024-06-07 12:37:27.492538] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:04.072 [2024-06-07 12:37:27.579456] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:04.637 12:37:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:04.637 12:37:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:35:04.637 12:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:35:04.637 12:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:35:04.895 BaseBdev1_malloc 00:35:04.895 12:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:35:05.153 [2024-06-07 12:37:28.774293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:35:05.153 [2024-06-07 12:37:28.774676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:05.153 [2024-06-07 12:37:28.774768] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:35:05.153 [2024-06-07 12:37:28.774974] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:05.153 [2024-06-07 12:37:28.777597] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:05.153 [2024-06-07 12:37:28.777829] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:35:05.153 BaseBdev1 00:35:05.410 12:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:35:05.410 12:37:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:35:05.410 BaseBdev2_malloc 00:35:05.410 12:37:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:35:05.668 [2024-06-07 12:37:29.310134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:35:05.668 [2024-06-07 12:37:29.310540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:05.668 [2024-06-07 12:37:29.310632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:35:05.668 [2024-06-07 12:37:29.310800] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:05.668 [2024-06-07 12:37:29.313256] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:05.668 [2024-06-07 12:37:29.313456] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:35:05.968 BaseBdev2 00:35:05.968 12:37:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:35:05.968 spare_malloc 00:35:05.968 12:37:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b 
spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:35:06.227 spare_delay 00:35:06.485 12:37:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:35:06.744 [2024-06-07 12:37:30.153216] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:35:06.744 [2024-06-07 12:37:30.153606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:06.744 [2024-06-07 12:37:30.153775] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:35:06.744 [2024-06-07 12:37:30.153920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:06.744 [2024-06-07 12:37:30.156404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:06.744 [2024-06-07 12:37:30.156605] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:35:06.744 spare 00:35:06.744 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:35:07.002 [2024-06-07 12:37:30.393442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:07.002 [2024-06-07 12:37:30.395810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:07.002 [2024-06-07 12:37:30.396069] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:35:07.002 [2024-06-07 12:37:30.396174] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:35:07.002 [2024-06-07 12:37:30.396390] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:35:07.002 [2024-06-07 12:37:30.396837] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:35:07.002 [2024-06-07 12:37:30.396947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:35:07.002 [2024-06-07 12:37:30.397272] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:07.002 "name": "raid_bdev1", 00:35:07.002 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:07.002 "strip_size_kb": 0, 00:35:07.002 "state": "online", 00:35:07.002 "raid_level": "raid1", 00:35:07.002 "superblock": false, 00:35:07.002 "num_base_bdevs": 2, 00:35:07.002 "num_base_bdevs_discovered": 2, 00:35:07.002 "num_base_bdevs_operational": 2, 00:35:07.002 "base_bdevs_list": [ 00:35:07.002 { 00:35:07.002 "name": "BaseBdev1", 00:35:07.002 "uuid": "83cffdaf-4f01-53f0-8fc3-81e5fd01fca9", 00:35:07.002 "is_configured": true, 00:35:07.002 "data_offset": 0, 00:35:07.002 "data_size": 65536 00:35:07.002 }, 00:35:07.002 { 00:35:07.002 "name": "BaseBdev2", 00:35:07.002 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:07.002 "is_configured": true, 00:35:07.002 "data_offset": 0, 00:35:07.002 "data_size": 65536 00:35:07.002 } 00:35:07.002 ] 00:35:07.002 }' 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:07.002 12:37:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:35:07.934 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:35:07.934 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:35:07.934 [2024-06-07 12:37:31.485776] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:07.934 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:35:07.934 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:07.934 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:35:08.192 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:35:08.192 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:35:08.192 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:35:08.192 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:35:08.192 [2024-06-07 12:37:31.825138] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000026d0 00:35:08.192 I/O size of 3145728 is greater than zero copy threshold (65536). 00:35:08.192 Zero copy mechanism will not be used. 00:35:08.192 Running I/O for 60 seconds... 
00:35:08.450 [2024-06-07 12:37:31.956335] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:35:08.450 [2024-06-07 12:37:31.960524] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d0000026d0 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:08.450 12:37:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:08.708 12:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:08.708 "name": "raid_bdev1", 00:35:08.708 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:08.708 "strip_size_kb": 0, 00:35:08.708 "state": "online", 00:35:08.708 "raid_level": "raid1", 00:35:08.708 "superblock": false, 00:35:08.708 "num_base_bdevs": 2, 00:35:08.708 "num_base_bdevs_discovered": 1, 00:35:08.708 "num_base_bdevs_operational": 1, 00:35:08.708 "base_bdevs_list": [ 00:35:08.708 { 00:35:08.708 "name": null, 00:35:08.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:08.708 "is_configured": false, 00:35:08.708 "data_offset": 0, 00:35:08.708 "data_size": 65536 00:35:08.708 }, 00:35:08.708 { 00:35:08.708 "name": "BaseBdev2", 00:35:08.708 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:08.708 "is_configured": true, 00:35:08.708 "data_offset": 0, 00:35:08.708 "data_size": 65536 00:35:08.708 } 00:35:08.708 ] 00:35:08.708 }' 00:35:08.708 12:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:08.708 12:37:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:35:09.273 12:37:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:35:09.531 [2024-06-07 12:37:33.162414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:35:09.788 12:37:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:35:09.788 [2024-06-07 12:37:33.222185] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:35:09.788 [2024-06-07 12:37:33.225168] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:35:09.788 [2024-06-07 12:37:33.341430] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:09.788 [2024-06-07 12:37:33.342634] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:10.046 [2024-06-07 12:37:33.564316] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:10.046 [2024-06-07 12:37:33.565114] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:10.675 [2024-06-07 12:37:34.015303] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:35:10.675 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:10.675 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:10.675 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:10.675 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:10.675 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:10.675 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:10.675 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:10.934 [2024-06-07 12:37:34.331357] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:35:10.934 [2024-06-07 12:37:34.332328] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:35:10.934 [2024-06-07 12:37:34.448891] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:35:10.934 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:10.934 "name": "raid_bdev1", 00:35:10.934 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:10.934 "strip_size_kb": 0, 00:35:10.934 "state": "online", 00:35:10.934 "raid_level": "raid1", 00:35:10.934 "superblock": false, 00:35:10.934 "num_base_bdevs": 2, 00:35:10.934 "num_base_bdevs_discovered": 2, 00:35:10.934 "num_base_bdevs_operational": 2, 00:35:10.934 "process": { 00:35:10.934 "type": "rebuild", 00:35:10.934 "target": "spare", 00:35:10.934 "progress": { 00:35:10.934 "blocks": 16384, 00:35:10.934 "percent": 25 00:35:10.934 } 00:35:10.934 }, 00:35:10.934 "base_bdevs_list": [ 00:35:10.934 { 00:35:10.934 "name": "spare", 00:35:10.934 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:10.934 "is_configured": true, 00:35:10.934 "data_offset": 0, 00:35:10.934 "data_size": 65536 00:35:10.934 }, 00:35:10.934 { 00:35:10.934 "name": "BaseBdev2", 00:35:10.934 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:10.934 "is_configured": true, 00:35:10.934 "data_offset": 0, 00:35:10.934 "data_size": 65536 00:35:10.934 } 00:35:10.934 ] 00:35:10.934 }' 00:35:10.934 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:10.935 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:10.935 12:37:34 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:11.193 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:11.193 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:35:11.193 [2024-06-07 12:37:34.669853] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:35:11.193 [2024-06-07 12:37:34.670735] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:35:11.193 [2024-06-07 12:37:34.824083] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:11.451 [2024-06-07 12:37:34.873196] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:35:11.451 [2024-06-07 12:37:34.887908] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:35:11.451 [2024-06-07 12:37:34.892218] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:11.451 [2024-06-07 12:37:34.892524] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:11.451 [2024-06-07 12:37:34.892576] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:35:11.451 [2024-06-07 12:37:34.915049] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d0000026d0 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:11.451 12:37:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:11.710 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:11.710 "name": "raid_bdev1", 00:35:11.710 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:11.710 "strip_size_kb": 0, 00:35:11.710 "state": "online", 00:35:11.710 "raid_level": "raid1", 00:35:11.710 "superblock": false, 00:35:11.710 "num_base_bdevs": 2, 00:35:11.710 "num_base_bdevs_discovered": 1, 00:35:11.710 "num_base_bdevs_operational": 1, 00:35:11.710 "base_bdevs_list": [ 
00:35:11.710 { 00:35:11.710 "name": null, 00:35:11.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:11.710 "is_configured": false, 00:35:11.710 "data_offset": 0, 00:35:11.710 "data_size": 65536 00:35:11.710 }, 00:35:11.710 { 00:35:11.710 "name": "BaseBdev2", 00:35:11.710 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:11.710 "is_configured": true, 00:35:11.710 "data_offset": 0, 00:35:11.710 "data_size": 65536 00:35:11.710 } 00:35:11.710 ] 00:35:11.710 }' 00:35:11.710 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:11.710 12:37:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:35:12.277 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:12.277 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:12.277 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:12.277 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:12.277 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:12.277 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:12.277 12:37:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:12.535 12:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:12.535 "name": "raid_bdev1", 00:35:12.535 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:12.535 "strip_size_kb": 0, 00:35:12.535 "state": "online", 00:35:12.535 "raid_level": "raid1", 00:35:12.535 "superblock": false, 00:35:12.535 "num_base_bdevs": 2, 00:35:12.535 "num_base_bdevs_discovered": 1, 00:35:12.535 "num_base_bdevs_operational": 1, 00:35:12.535 "base_bdevs_list": [ 00:35:12.535 { 00:35:12.535 "name": null, 00:35:12.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:12.535 "is_configured": false, 00:35:12.535 "data_offset": 0, 00:35:12.535 "data_size": 65536 00:35:12.535 }, 00:35:12.535 { 00:35:12.535 "name": "BaseBdev2", 00:35:12.535 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:12.535 "is_configured": true, 00:35:12.535 "data_offset": 0, 00:35:12.535 "data_size": 65536 00:35:12.535 } 00:35:12.535 ] 00:35:12.535 }' 00:35:12.535 12:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:12.793 12:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:12.793 12:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:12.794 12:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:12.794 12:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:35:13.051 [2024-06-07 12:37:36.461063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:35:13.051 [2024-06-07 12:37:36.499392] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002940 00:35:13.051 12:37:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:35:13.052 [2024-06-07 12:37:36.501785] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:35:13.052 [2024-06-07 12:37:36.620061] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:13.052 [2024-06-07 12:37:36.620949] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:13.310 [2024-06-07 12:37:36.844787] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:13.310 [2024-06-07 12:37:36.845443] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:13.567 [2024-06-07 12:37:37.162458] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:35:13.567 [2024-06-07 12:37:37.163442] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:35:13.825 [2024-06-07 12:37:37.376286] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:35:14.085 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:14.085 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:14.085 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:14.085 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:14.085 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:14.085 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:14.085 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:14.085 [2024-06-07 12:37:37.603735] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:35:14.086 [2024-06-07 12:37:37.711690] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:14.345 "name": "raid_bdev1", 00:35:14.345 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:14.345 "strip_size_kb": 0, 00:35:14.345 "state": "online", 00:35:14.345 "raid_level": "raid1", 00:35:14.345 "superblock": false, 00:35:14.345 "num_base_bdevs": 2, 00:35:14.345 "num_base_bdevs_discovered": 2, 00:35:14.345 "num_base_bdevs_operational": 2, 00:35:14.345 "process": { 00:35:14.345 "type": "rebuild", 00:35:14.345 "target": "spare", 00:35:14.345 "progress": { 00:35:14.345 "blocks": 16384, 00:35:14.345 "percent": 25 00:35:14.345 } 00:35:14.345 }, 00:35:14.345 "base_bdevs_list": [ 00:35:14.345 { 00:35:14.345 "name": "spare", 00:35:14.345 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:14.345 "is_configured": true, 00:35:14.345 "data_offset": 0, 00:35:14.345 "data_size": 65536 00:35:14.345 }, 00:35:14.345 { 00:35:14.345 "name": "BaseBdev2", 00:35:14.345 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:14.345 "is_configured": true, 00:35:14.345 "data_offset": 0, 00:35:14.345 "data_size": 65536 00:35:14.345 } 
00:35:14.345 ] 00:35:14.345 }' 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=864 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:14.345 12:37:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:14.602 [2024-06-07 12:37:38.036550] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:35:14.603 12:37:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:14.603 "name": "raid_bdev1", 00:35:14.603 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:14.603 "strip_size_kb": 0, 00:35:14.603 "state": "online", 00:35:14.603 "raid_level": "raid1", 00:35:14.603 "superblock": false, 00:35:14.603 "num_base_bdevs": 2, 00:35:14.603 "num_base_bdevs_discovered": 2, 00:35:14.603 "num_base_bdevs_operational": 2, 00:35:14.603 "process": { 00:35:14.603 "type": "rebuild", 00:35:14.603 "target": "spare", 00:35:14.603 "progress": { 00:35:14.603 "blocks": 20480, 00:35:14.603 "percent": 31 00:35:14.603 } 00:35:14.603 }, 00:35:14.603 "base_bdevs_list": [ 00:35:14.603 { 00:35:14.603 "name": "spare", 00:35:14.603 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:14.603 "is_configured": true, 00:35:14.603 "data_offset": 0, 00:35:14.603 "data_size": 65536 00:35:14.603 }, 00:35:14.603 { 00:35:14.603 "name": "BaseBdev2", 00:35:14.603 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:14.603 "is_configured": true, 00:35:14.603 "data_offset": 0, 00:35:14.603 "data_size": 65536 00:35:14.603 } 00:35:14.603 ] 00:35:14.603 }' 00:35:14.603 12:37:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:14.860 [2024-06-07 12:37:38.252342] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:35:14.860 12:37:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:14.860 12:37:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:14.860 12:37:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:14.860 12:37:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:35:15.425 [2024-06-07 12:37:38.892016] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:35:15.425 [2024-06-07 12:37:38.995963] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:35:15.682 [2024-06-07 12:37:39.310218] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:35:15.682 [2024-06-07 12:37:39.311133] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:15.940 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:15.940 [2024-06-07 12:37:39.534752] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:35:16.198 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:16.198 "name": "raid_bdev1", 00:35:16.198 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:16.198 "strip_size_kb": 0, 00:35:16.198 "state": "online", 00:35:16.198 "raid_level": "raid1", 00:35:16.198 "superblock": false, 00:35:16.198 "num_base_bdevs": 2, 00:35:16.198 "num_base_bdevs_discovered": 2, 00:35:16.198 "num_base_bdevs_operational": 2, 00:35:16.198 "process": { 00:35:16.198 "type": "rebuild", 00:35:16.198 "target": "spare", 00:35:16.198 "progress": { 00:35:16.198 "blocks": 40960, 00:35:16.198 "percent": 62 00:35:16.198 } 00:35:16.198 }, 00:35:16.198 "base_bdevs_list": [ 00:35:16.198 { 00:35:16.198 "name": "spare", 00:35:16.198 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:16.198 "is_configured": true, 00:35:16.198 "data_offset": 0, 00:35:16.198 "data_size": 65536 00:35:16.198 }, 00:35:16.198 { 00:35:16.198 "name": "BaseBdev2", 00:35:16.198 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:16.198 "is_configured": true, 00:35:16.198 "data_offset": 0, 00:35:16.198 "data_size": 65536 00:35:16.198 } 00:35:16.198 ] 00:35:16.198 }' 00:35:16.198 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:35:16.198 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:16.198 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:16.198 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:16.198 12:37:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:35:16.456 [2024-06-07 12:37:39.865144] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:35:17.390 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:17.390 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:17.390 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:17.390 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:17.391 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:17.391 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:17.391 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:17.391 12:37:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:17.391 [2024-06-07 12:37:41.032035] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:35:17.649 12:37:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:17.649 "name": "raid_bdev1", 00:35:17.649 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:17.649 "strip_size_kb": 0, 00:35:17.649 "state": "online", 00:35:17.649 "raid_level": "raid1", 00:35:17.649 "superblock": false, 00:35:17.649 "num_base_bdevs": 2, 00:35:17.649 "num_base_bdevs_discovered": 2, 00:35:17.649 "num_base_bdevs_operational": 2, 00:35:17.649 "process": { 00:35:17.649 "type": "rebuild", 00:35:17.649 "target": "spare", 00:35:17.649 "progress": { 00:35:17.649 "blocks": 63488, 00:35:17.649 "percent": 96 00:35:17.649 } 00:35:17.649 }, 00:35:17.649 "base_bdevs_list": [ 00:35:17.649 { 00:35:17.649 "name": "spare", 00:35:17.649 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:17.649 "is_configured": true, 00:35:17.649 "data_offset": 0, 00:35:17.649 "data_size": 65536 00:35:17.649 }, 00:35:17.649 { 00:35:17.649 "name": "BaseBdev2", 00:35:17.649 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:17.649 "is_configured": true, 00:35:17.649 "data_offset": 0, 00:35:17.649 "data_size": 65536 00:35:17.649 } 00:35:17.649 ] 00:35:17.649 }' 00:35:17.649 12:37:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:17.649 12:37:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:17.649 12:37:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:17.649 [2024-06-07 12:37:41.132042] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:35:17.649 12:37:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:17.649 [2024-06-07 12:37:41.140043] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:17.649 12:37:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:18.585 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:18.843 "name": "raid_bdev1", 00:35:18.843 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:18.843 "strip_size_kb": 0, 00:35:18.843 "state": "online", 00:35:18.843 "raid_level": "raid1", 00:35:18.843 "superblock": false, 00:35:18.843 "num_base_bdevs": 2, 00:35:18.843 "num_base_bdevs_discovered": 2, 00:35:18.843 "num_base_bdevs_operational": 2, 00:35:18.843 "base_bdevs_list": [ 00:35:18.843 { 00:35:18.843 "name": "spare", 00:35:18.843 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:18.843 "is_configured": true, 00:35:18.843 "data_offset": 0, 00:35:18.843 "data_size": 65536 00:35:18.843 }, 00:35:18.843 { 00:35:18.843 "name": "BaseBdev2", 00:35:18.843 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:18.843 "is_configured": true, 00:35:18.843 "data_offset": 0, 00:35:18.843 "data_size": 65536 00:35:18.843 } 00:35:18.843 ] 00:35:18.843 }' 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:18.843 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:19.410 "name": "raid_bdev1", 00:35:19.410 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:19.410 "strip_size_kb": 0, 00:35:19.410 "state": "online", 00:35:19.410 "raid_level": "raid1", 00:35:19.410 "superblock": false, 00:35:19.410 "num_base_bdevs": 2, 00:35:19.410 "num_base_bdevs_discovered": 2, 00:35:19.410 "num_base_bdevs_operational": 2, 00:35:19.410 "base_bdevs_list": [ 00:35:19.410 { 00:35:19.410 "name": "spare", 00:35:19.410 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:19.410 "is_configured": true, 00:35:19.410 "data_offset": 0, 00:35:19.410 "data_size": 65536 00:35:19.410 }, 00:35:19.410 { 00:35:19.410 "name": "BaseBdev2", 00:35:19.410 "uuid": "846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:19.410 "is_configured": true, 00:35:19.410 "data_offset": 0, 00:35:19.410 "data_size": 65536 00:35:19.410 } 00:35:19.410 ] 00:35:19.410 }' 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:19.410 12:37:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:19.667 12:37:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:19.667 "name": "raid_bdev1", 00:35:19.667 "uuid": "cb7ab76a-e403-41b7-ba13-3ce7866b2a53", 00:35:19.667 "strip_size_kb": 0, 00:35:19.667 "state": "online", 00:35:19.667 "raid_level": "raid1", 00:35:19.667 "superblock": false, 00:35:19.667 "num_base_bdevs": 2, 00:35:19.667 "num_base_bdevs_discovered": 2, 00:35:19.667 "num_base_bdevs_operational": 2, 00:35:19.667 "base_bdevs_list": [ 00:35:19.667 { 00:35:19.667 "name": "spare", 00:35:19.667 "uuid": "2ca44b17-a519-5e7a-9cf5-ea18583776f4", 00:35:19.667 "is_configured": true, 00:35:19.667 "data_offset": 0, 00:35:19.667 "data_size": 65536 00:35:19.667 }, 00:35:19.667 { 00:35:19.667 "name": "BaseBdev2", 00:35:19.667 "uuid": 
"846c97de-fdaf-5a41-aa26-b9e226abca2a", 00:35:19.667 "is_configured": true, 00:35:19.667 "data_offset": 0, 00:35:19.667 "data_size": 65536 00:35:19.667 } 00:35:19.667 ] 00:35:19.667 }' 00:35:19.667 12:37:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:19.667 12:37:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:35:20.232 12:37:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:35:20.798 [2024-06-07 12:37:44.153751] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:35:20.798 [2024-06-07 12:37:44.154081] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:35:20.798 00:35:20.798 Latency(us) 00:35:20.798 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:20.798 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:35:20.798 raid_bdev1 : 12.41 129.22 387.65 0.00 0.00 11373.61 333.53 111348.78 00:35:20.798 =================================================================================================================== 00:35:20.798 Total : 129.22 387.65 0.00 0.00 11373.61 333.53 111348.78 00:35:20.798 [2024-06-07 12:37:44.237910] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:20.798 [2024-06-07 12:37:44.238124] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:20.798 [2024-06-07 12:37:44.238272] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:20.798 0 00:35:20.798 [2024-06-07 12:37:44.238422] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:35:20.798 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:20.798 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:21.057 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
nbd_start_disk spare /dev/nbd0 00:35:21.315 /dev/nbd0 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:21.315 1+0 records in 00:35:21.315 1+0 records out 00:35:21.315 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000511693 s, 8.0 MB/s 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:21.315 12:37:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
nbd_start_disk BaseBdev2 /dev/nbd1 00:35:21.574 /dev/nbd1 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:21.574 1+0 records in 00:35:21.574 1+0 records out 00:35:21.574 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000686131 s, 6.0 MB/s 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:21.574 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:35:21.836 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 
-- # (( i = 1 )) 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:22.094 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 222667 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 222667 ']' 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 222667 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 222667 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 222667' 00:35:22.353 killing process with pid 222667 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 222667 00:35:22.353 Received shutdown signal, test time was about 14.043035 seconds 00:35:22.353 00:35:22.353 Latency(us) 00:35:22.353 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:22.353 
=================================================================================================================== 00:35:22.353 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:22.353 12:37:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 222667 00:35:22.353 [2024-06-07 12:37:45.871290] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:22.353 [2024-06-07 12:37:45.920904] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:35:22.919 00:35:22.919 real 0m19.097s 00:35:22.919 user 0m29.209s 00:35:22.919 sys 0m2.964s 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:35:22.919 ************************************ 00:35:22.919 END TEST raid_rebuild_test_io 00:35:22.919 ************************************ 00:35:22.919 12:37:46 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:35:22.919 12:37:46 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:35:22.919 12:37:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:35:22.919 12:37:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:22.919 ************************************ 00:35:22.919 START TEST raid_rebuild_test_sb_io 00:35:22.919 ************************************ 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true true true 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:35:22.919 12:37:46 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=223136 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:35:22.919 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 223136 /var/tmp/spdk-raid.sock 00:35:22.920 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 223136 ']' 00:35:22.920 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:22.920 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:35:22.920 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:22.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:35:22.920 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:22.920 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:22.920 [2024-06-07 12:37:46.429371] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:35:22.920 [2024-06-07 12:37:46.429961] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid223136 ] 00:35:22.920 I/O size of 3145728 is greater than zero copy threshold (65536). 00:35:22.920 Zero copy mechanism will not be used. 
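The trace above launches bdevperf against a private RPC socket with background I/O enabled (randrw at a 50% mix, queue depth 2, 3M I/O size) and -z, which keeps it idle until a perform_tests RPC arrives. A minimal sketch of that launch-and-wait pattern, assuming the repo paths recorded in this run; the polling loop is an illustrative stand-in for the harness's waitforlisten helper, not its actual implementation:

# Start bdevperf as an RPC-driven daemon; flags copied from the trace above.
SPDK=/home/vagrant/spdk_repo/spdk
RPC_SOCK=/var/tmp/spdk-raid.sock
"$SPDK"/build/examples/bdevperf -r "$RPC_SOCK" -T raid_bdev1 \
    -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
# Poll until the UNIX-domain socket answers RPCs (simplified waitforlisten).
until "$SPDK"/scripts/rpc.py -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; do
    sleep 0.2
done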
00:35:23.178 [2024-06-07 12:37:46.578479] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:23.178 [2024-06-07 12:37:46.677204] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:23.178 [2024-06-07 12:37:46.764380] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:23.436 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:23.436 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:35:23.436 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:35:23.436 12:37:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:35:23.693 BaseBdev1_malloc 00:35:23.693 12:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:35:23.952 [2024-06-07 12:37:47.469685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:35:23.952 [2024-06-07 12:37:47.470085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:23.952 [2024-06-07 12:37:47.470183] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:35:23.952 [2024-06-07 12:37:47.470487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:23.952 [2024-06-07 12:37:47.473098] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:23.952 [2024-06-07 12:37:47.473340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:35:23.952 BaseBdev1 00:35:23.952 12:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:35:23.952 12:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:35:24.210 BaseBdev2_malloc 00:35:24.210 12:37:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:35:24.468 [2024-06-07 12:37:48.086377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:35:24.468 [2024-06-07 12:37:48.086776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:24.468 [2024-06-07 12:37:48.086929] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:35:24.468 [2024-06-07 12:37:48.087066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:24.468 [2024-06-07 12:37:48.089572] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:24.468 [2024-06-07 12:37:48.089781] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:35:24.468 BaseBdev2 00:35:24.726 12:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:35:24.984 spare_malloc 00:35:24.984 12:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:35:25.242 spare_delay 00:35:25.242 12:37:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:35:25.501 [2024-06-07 12:37:49.035268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:35:25.501 [2024-06-07 12:37:49.035658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:25.501 [2024-06-07 12:37:49.035750] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:35:25.501 [2024-06-07 12:37:49.036034] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:25.501 [2024-06-07 12:37:49.038508] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:25.501 [2024-06-07 12:37:49.038739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:35:25.501 spare 00:35:25.501 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:35:25.817 [2024-06-07 12:37:49.395384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:25.817 [2024-06-07 12:37:49.397920] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:25.817 [2024-06-07 12:37:49.398311] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:35:25.817 [2024-06-07 12:37:49.398434] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:35:25.817 [2024-06-07 12:37:49.398636] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:35:25.817 [2024-06-07 12:37:49.399104] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:35:25.817 [2024-06-07 12:37:49.399221] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:35:25.817 [2024-06-07 12:37:49.399538] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:25.817 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:26.384 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:26.384 "name": "raid_bdev1", 00:35:26.384 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:26.384 "strip_size_kb": 0, 00:35:26.384 "state": "online", 00:35:26.384 "raid_level": "raid1", 00:35:26.384 "superblock": true, 00:35:26.384 "num_base_bdevs": 2, 00:35:26.384 "num_base_bdevs_discovered": 2, 00:35:26.384 "num_base_bdevs_operational": 2, 00:35:26.384 "base_bdevs_list": [ 00:35:26.384 { 00:35:26.384 "name": "BaseBdev1", 00:35:26.384 "uuid": "abe0c01a-2a14-5170-83f4-e475ae2938ff", 00:35:26.384 "is_configured": true, 00:35:26.384 "data_offset": 2048, 00:35:26.384 "data_size": 63488 00:35:26.384 }, 00:35:26.384 { 00:35:26.384 "name": "BaseBdev2", 00:35:26.384 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:26.384 "is_configured": true, 00:35:26.384 "data_offset": 2048, 00:35:26.384 "data_size": 63488 00:35:26.384 } 00:35:26.384 ] 00:35:26.384 }' 00:35:26.384 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:26.384 12:37:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:26.950 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:35:26.950 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:35:26.950 [2024-06-07 12:37:50.575926] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:35:27.209 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:35:27.209 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:27.209 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:35:27.468 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:35:27.468 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:35:27.468 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:35:27.468 12:37:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:35:27.468 [2024-06-07 12:37:51.067299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000026d0 00:35:27.468 I/O size of 3145728 is greater than zero copy threshold (65536). 00:35:27.468 Zero copy mechanism will not be used. 00:35:27.468 Running I/O for 60 seconds... 
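With the -s (superblock) variant, each 65536-block base bdev gives up its first 2048 blocks to RAID metadata, which is why the @615/@618 probes above report num_blocks 63488 and data_offset 2048. A sketch of those two probes under the same socket path (the shell variable names are illustrative):

# Array size comes from the bdev object; data_offset from the raid view.
RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
raid_bdev_size=$($RPC bdev_get_bdevs -b raid_bdev1 | jq -r '.[].num_blocks')
data_offset=$($RPC bdev_raid_get_bdevs all | jq -r '.[].base_bdevs_list[0].data_offset')
echo "size=$raid_bdev_size data_offset=$data_offset"   # 63488 / 2048 in this run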
00:35:27.728 [2024-06-07 12:37:51.162135] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:35:27.728 [2024-06-07 12:37:51.170649] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d0000026d0 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:27.728 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:27.986 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:27.986 "name": "raid_bdev1", 00:35:27.986 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:27.986 "strip_size_kb": 0, 00:35:27.986 "state": "online", 00:35:27.986 "raid_level": "raid1", 00:35:27.986 "superblock": true, 00:35:27.986 "num_base_bdevs": 2, 00:35:27.986 "num_base_bdevs_discovered": 1, 00:35:27.986 "num_base_bdevs_operational": 1, 00:35:27.986 "base_bdevs_list": [ 00:35:27.986 { 00:35:27.986 "name": null, 00:35:27.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:27.986 "is_configured": false, 00:35:27.986 "data_offset": 2048, 00:35:27.986 "data_size": 63488 00:35:27.986 }, 00:35:27.986 { 00:35:27.986 "name": "BaseBdev2", 00:35:27.986 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:27.986 "is_configured": true, 00:35:27.986 "data_offset": 2048, 00:35:27.986 "data_size": 63488 00:35:27.986 } 00:35:27.986 ] 00:35:27.986 }' 00:35:27.986 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:27.986 12:37:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:28.918 12:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:35:28.918 [2024-06-07 12:37:52.556997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:35:29.177 [2024-06-07 12:37:52.597136] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:35:29.177 [2024-06-07 12:37:52.599877] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:35:29.177 12:37:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:35:29.177 
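After BaseBdev1 is hot-removed the array stays online but degraded (one null slot, num_base_bdevs_discovered 1), and re-adding a device immediately starts the rebuild logged above. A sketch of that degraded-state check and re-add, reduced from the fuller assertions verify_raid_bdev_state performs:

# Assert raid_bdev1 survived the hot-remove in a degraded-but-online state.
RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[ "$(jq -r '.state' <<< "$info")" = online ] || exit 1
[ "$(jq -r '.num_base_bdevs_discovered' <<< "$info")" -eq 1 ] || exit 1
# Re-attach a device; the raid module claims it and rebuilds onto it.
$RPC bdev_raid_add_base_bdev raid_bdev1 spare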
[2024-06-07 12:37:52.718887] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:29.177 [2024-06-07 12:37:52.719843] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:29.435 [2024-06-07 12:37:52.928069] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:29.435 [2024-06-07 12:37:52.928756] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:29.696 [2024-06-07 12:37:53.153673] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:35:29.955 [2024-06-07 12:37:53.379583] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:35:30.214 [2024-06-07 12:37:53.604999] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:35:30.214 [2024-06-07 12:37:53.606040] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:35:30.214 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:30.214 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:30.214 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:30.214 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:30.214 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:30.214 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:30.214 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:30.214 [2024-06-07 12:37:53.815750] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:35:30.214 [2024-06-07 12:37:53.816436] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:35:30.472 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:30.472 "name": "raid_bdev1", 00:35:30.472 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:30.472 "strip_size_kb": 0, 00:35:30.472 "state": "online", 00:35:30.472 "raid_level": "raid1", 00:35:30.472 "superblock": true, 00:35:30.472 "num_base_bdevs": 2, 00:35:30.472 "num_base_bdevs_discovered": 2, 00:35:30.472 "num_base_bdevs_operational": 2, 00:35:30.472 "process": { 00:35:30.472 "type": "rebuild", 00:35:30.472 "target": "spare", 00:35:30.472 "progress": { 00:35:30.472 "blocks": 16384, 00:35:30.472 "percent": 25 00:35:30.472 } 00:35:30.472 }, 00:35:30.472 "base_bdevs_list": [ 00:35:30.472 { 00:35:30.472 "name": "spare", 00:35:30.472 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:30.472 "is_configured": true, 00:35:30.472 "data_offset": 2048, 00:35:30.472 "data_size": 63488 00:35:30.472 }, 00:35:30.472 { 00:35:30.472 "name": "BaseBdev2", 00:35:30.472 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:30.472 "is_configured": 
true, 00:35:30.472 "data_offset": 2048, 00:35:30.472 "data_size": 63488 00:35:30.472 } 00:35:30.472 ] 00:35:30.472 }' 00:35:30.472 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:30.472 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:30.472 12:37:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:30.472 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:30.472 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:35:30.730 [2024-06-07 12:37:54.256589] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:35:30.730 [2024-06-07 12:37:54.305608] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:30.988 [2024-06-07 12:37:54.389632] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:35:30.988 [2024-06-07 12:37:54.397508] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:30.988 [2024-06-07 12:37:54.397843] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:30.988 [2024-06-07 12:37:54.397896] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:35:30.988 [2024-06-07 12:37:54.416743] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d0000026d0 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:30.988 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:31.246 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:31.246 "name": "raid_bdev1", 00:35:31.246 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:31.246 "strip_size_kb": 0, 00:35:31.246 "state": "online", 00:35:31.246 "raid_level": "raid1", 00:35:31.246 "superblock": true, 00:35:31.246 "num_base_bdevs": 2, 00:35:31.246 
"num_base_bdevs_discovered": 1, 00:35:31.246 "num_base_bdevs_operational": 1, 00:35:31.246 "base_bdevs_list": [ 00:35:31.246 { 00:35:31.246 "name": null, 00:35:31.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:31.246 "is_configured": false, 00:35:31.246 "data_offset": 2048, 00:35:31.246 "data_size": 63488 00:35:31.246 }, 00:35:31.246 { 00:35:31.246 "name": "BaseBdev2", 00:35:31.246 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:31.246 "is_configured": true, 00:35:31.246 "data_offset": 2048, 00:35:31.246 "data_size": 63488 00:35:31.246 } 00:35:31.246 ] 00:35:31.246 }' 00:35:31.246 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:31.246 12:37:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:31.812 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:31.812 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:31.812 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:31.812 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:31.812 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:31.812 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:31.812 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:32.378 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:32.378 "name": "raid_bdev1", 00:35:32.378 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:32.378 "strip_size_kb": 0, 00:35:32.378 "state": "online", 00:35:32.378 "raid_level": "raid1", 00:35:32.378 "superblock": true, 00:35:32.378 "num_base_bdevs": 2, 00:35:32.378 "num_base_bdevs_discovered": 1, 00:35:32.378 "num_base_bdevs_operational": 1, 00:35:32.378 "base_bdevs_list": [ 00:35:32.378 { 00:35:32.378 "name": null, 00:35:32.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:32.378 "is_configured": false, 00:35:32.378 "data_offset": 2048, 00:35:32.378 "data_size": 63488 00:35:32.378 }, 00:35:32.378 { 00:35:32.378 "name": "BaseBdev2", 00:35:32.378 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:32.378 "is_configured": true, 00:35:32.378 "data_offset": 2048, 00:35:32.378 "data_size": 63488 00:35:32.378 } 00:35:32.378 ] 00:35:32.378 }' 00:35:32.378 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:32.378 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:32.378 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:32.378 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:32.379 12:37:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:35:32.636 [2024-06-07 12:37:56.059541] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:35:32.636 [2024-06-07 12:37:56.111215] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002940 
00:35:32.636 [2024-06-07 12:37:56.113738] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:35:32.636 12:37:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:35:32.636 [2024-06-07 12:37:56.232763] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:32.636 [2024-06-07 12:37:56.233740] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:35:32.893 [2024-06-07 12:37:56.458359] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:32.893 [2024-06-07 12:37:56.459068] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:35:33.151 [2024-06-07 12:37:56.692709] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:35:33.409 [2024-06-07 12:37:56.905845] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:35:33.409 [2024-06-07 12:37:56.906497] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:35:33.667 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:33.667 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:33.667 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:33.667 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:33.667 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:33.667 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:33.667 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:33.667 [2024-06-07 12:37:57.226563] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:33.926 "name": "raid_bdev1", 00:35:33.926 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:33.926 "strip_size_kb": 0, 00:35:33.926 "state": "online", 00:35:33.926 "raid_level": "raid1", 00:35:33.926 "superblock": true, 00:35:33.926 "num_base_bdevs": 2, 00:35:33.926 "num_base_bdevs_discovered": 2, 00:35:33.926 "num_base_bdevs_operational": 2, 00:35:33.926 "process": { 00:35:33.926 "type": "rebuild", 00:35:33.926 "target": "spare", 00:35:33.926 "progress": { 00:35:33.926 "blocks": 14336, 00:35:33.926 "percent": 22 00:35:33.926 } 00:35:33.926 }, 00:35:33.926 "base_bdevs_list": [ 00:35:33.926 { 00:35:33.926 "name": "spare", 00:35:33.926 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:33.926 "is_configured": true, 00:35:33.926 "data_offset": 2048, 00:35:33.926 "data_size": 63488 00:35:33.926 }, 00:35:33.926 { 00:35:33.926 "name": "BaseBdev2", 00:35:33.926 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:33.926 "is_configured": true, 00:35:33.926 "data_offset": 2048, 00:35:33.926 "data_size": 63488 00:35:33.926 
} 00:35:33.926 ] 00:35:33.926 }' 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:33.926 [2024-06-07 12:37:57.443140] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:35:33.926 [2024-06-07 12:37:57.443842] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:35:33.926 /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=884 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:33.926 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:34.192 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:34.192 "name": "raid_bdev1", 00:35:34.192 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:34.192 "strip_size_kb": 0, 00:35:34.192 "state": "online", 00:35:34.192 "raid_level": "raid1", 00:35:34.192 "superblock": true, 00:35:34.192 "num_base_bdevs": 2, 00:35:34.192 "num_base_bdevs_discovered": 2, 00:35:34.192 "num_base_bdevs_operational": 2, 00:35:34.192 "process": { 00:35:34.192 "type": "rebuild", 00:35:34.192 "target": "spare", 00:35:34.192 "progress": { 00:35:34.192 "blocks": 18432, 00:35:34.192 "percent": 29 00:35:34.192 } 00:35:34.192 }, 00:35:34.192 "base_bdevs_list": [ 00:35:34.192 { 00:35:34.192 "name": "spare", 00:35:34.192 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:34.192 "is_configured": true, 00:35:34.192 "data_offset": 2048, 00:35:34.192 "data_size": 63488 00:35:34.192 }, 00:35:34.192 { 00:35:34.192 "name": "BaseBdev2", 
00:35:34.192 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:34.192 "is_configured": true, 00:35:34.192 "data_offset": 2048, 00:35:34.192 "data_size": 63488 00:35:34.192 } 00:35:34.192 ] 00:35:34.192 }' 00:35:34.192 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:34.452 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:34.452 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:34.452 [2024-06-07 12:37:57.881998] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:35:34.452 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:34.452 12:37:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:35:34.709 [2024-06-07 12:37:58.188589] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:35:34.967 [2024-06-07 12:37:58.402549] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:35:35.225 [2024-06-07 12:37:58.836603] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:35.484 12:37:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:35.742 12:37:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:35.742 "name": "raid_bdev1", 00:35:35.742 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:35.742 "strip_size_kb": 0, 00:35:35.742 "state": "online", 00:35:35.742 "raid_level": "raid1", 00:35:35.742 "superblock": true, 00:35:35.742 "num_base_bdevs": 2, 00:35:35.742 "num_base_bdevs_discovered": 2, 00:35:35.742 "num_base_bdevs_operational": 2, 00:35:35.742 "process": { 00:35:35.742 "type": "rebuild", 00:35:35.742 "target": "spare", 00:35:35.742 "progress": { 00:35:35.742 "blocks": 38912, 00:35:35.742 "percent": 61 00:35:35.742 } 00:35:35.742 }, 00:35:35.742 "base_bdevs_list": [ 00:35:35.742 { 00:35:35.742 "name": "spare", 00:35:35.742 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:35.742 "is_configured": true, 00:35:35.742 "data_offset": 2048, 00:35:35.742 "data_size": 63488 00:35:35.742 }, 00:35:35.742 { 00:35:35.742 "name": "BaseBdev2", 00:35:35.742 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:35.742 "is_configured": true, 00:35:35.742 "data_offset": 2048, 00:35:35.742 "data_size": 
63488 00:35:35.742 } 00:35:35.742 ] 00:35:35.742 }' 00:35:35.742 12:37:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:35.742 [2024-06-07 12:37:59.162814] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:35:35.742 12:37:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:35.742 12:37:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:35.742 12:37:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:35.742 12:37:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:35:36.309 [2024-06-07 12:37:59.806925] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:35:36.566 [2024-06-07 12:38:00.020895] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:36.824 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:36.824 [2024-06-07 12:38:00.339817] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:35:37.082 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:37.082 "name": "raid_bdev1", 00:35:37.082 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:37.082 "strip_size_kb": 0, 00:35:37.082 "state": "online", 00:35:37.082 "raid_level": "raid1", 00:35:37.082 "superblock": true, 00:35:37.082 "num_base_bdevs": 2, 00:35:37.082 "num_base_bdevs_discovered": 2, 00:35:37.082 "num_base_bdevs_operational": 2, 00:35:37.082 "process": { 00:35:37.082 "type": "rebuild", 00:35:37.082 "target": "spare", 00:35:37.082 "progress": { 00:35:37.082 "blocks": 59392, 00:35:37.082 "percent": 93 00:35:37.082 } 00:35:37.082 }, 00:35:37.082 "base_bdevs_list": [ 00:35:37.082 { 00:35:37.082 "name": "spare", 00:35:37.082 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:37.082 "is_configured": true, 00:35:37.082 "data_offset": 2048, 00:35:37.082 "data_size": 63488 00:35:37.082 }, 00:35:37.082 { 00:35:37.082 "name": "BaseBdev2", 00:35:37.082 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:37.082 "is_configured": true, 00:35:37.082 "data_offset": 2048, 00:35:37.082 "data_size": 63488 00:35:37.082 } 00:35:37.082 ] 00:35:37.082 }' 00:35:37.082 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:35:37.082 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:37.082 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:37.082 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:37.082 12:38:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:35:37.082 [2024-06-07 12:38:00.664882] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:35:37.341 [2024-06-07 12:38:00.764832] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:35:37.341 [2024-06-07 12:38:00.767734] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:38.275 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:38.534 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:38.534 "name": "raid_bdev1", 00:35:38.534 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:38.534 "strip_size_kb": 0, 00:35:38.534 "state": "online", 00:35:38.534 "raid_level": "raid1", 00:35:38.534 "superblock": true, 00:35:38.534 "num_base_bdevs": 2, 00:35:38.534 "num_base_bdevs_discovered": 2, 00:35:38.534 "num_base_bdevs_operational": 2, 00:35:38.534 "base_bdevs_list": [ 00:35:38.534 { 00:35:38.534 "name": "spare", 00:35:38.534 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:38.534 "is_configured": true, 00:35:38.534 "data_offset": 2048, 00:35:38.534 "data_size": 63488 00:35:38.534 }, 00:35:38.534 { 00:35:38.534 "name": "BaseBdev2", 00:35:38.534 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:38.534 "is_configured": true, 00:35:38.534 "data_offset": 2048, 00:35:38.534 "data_size": 63488 00:35:38.534 } 00:35:38.534 ] 00:35:38.534 }' 00:35:38.534 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:38.534 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:35:38.534 12:38:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:38.534 12:38:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:38.534 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:38.792 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:38.792 "name": "raid_bdev1", 00:35:38.792 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:38.792 "strip_size_kb": 0, 00:35:38.792 "state": "online", 00:35:38.792 "raid_level": "raid1", 00:35:38.792 "superblock": true, 00:35:38.793 "num_base_bdevs": 2, 00:35:38.793 "num_base_bdevs_discovered": 2, 00:35:38.793 "num_base_bdevs_operational": 2, 00:35:38.793 "base_bdevs_list": [ 00:35:38.793 { 00:35:38.793 "name": "spare", 00:35:38.793 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:38.793 "is_configured": true, 00:35:38.793 "data_offset": 2048, 00:35:38.793 "data_size": 63488 00:35:38.793 }, 00:35:38.793 { 00:35:38.793 "name": "BaseBdev2", 00:35:38.793 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:38.793 "is_configured": true, 00:35:38.793 "data_offset": 2048, 00:35:38.793 "data_size": 63488 00:35:38.793 } 00:35:38.793 ] 00:35:38.793 }' 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:38.793 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:39.050 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:39.050 "name": "raid_bdev1", 00:35:39.050 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:39.050 "strip_size_kb": 0, 00:35:39.050 "state": "online", 00:35:39.050 "raid_level": "raid1", 00:35:39.050 "superblock": true, 00:35:39.050 "num_base_bdevs": 2, 00:35:39.050 "num_base_bdevs_discovered": 2, 00:35:39.050 "num_base_bdevs_operational": 2, 00:35:39.050 "base_bdevs_list": [ 00:35:39.050 { 00:35:39.050 "name": "spare", 00:35:39.050 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:39.050 "is_configured": true, 00:35:39.050 "data_offset": 2048, 00:35:39.050 "data_size": 63488 00:35:39.050 }, 00:35:39.050 { 00:35:39.050 "name": "BaseBdev2", 00:35:39.050 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:39.050 "is_configured": true, 00:35:39.050 "data_offset": 2048, 00:35:39.050 "data_size": 63488 00:35:39.050 } 00:35:39.050 ] 00:35:39.050 }' 00:35:39.050 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:39.050 12:38:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:39.615 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:35:39.872 [2024-06-07 12:38:03.512700] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:35:39.872 [2024-06-07 12:38:03.512763] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:35:40.130 00:35:40.130 Latency(us) 00:35:40.130 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:40.130 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:35:40.130 raid_bdev1 : 12.46 133.28 399.85 0.00 0.00 10742.19 401.80 111848.11 00:35:40.130 =================================================================================================================== 00:35:40.130 Total : 133.28 399.85 0.00 0.00 10742.19 401.80 111848.11 00:35:40.130 [2024-06-07 12:38:03.536884] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:40.130 [2024-06-07 12:38:03.536969] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:40.130 [2024-06-07 12:38:03.537058] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:40.130 0 00:35:40.130 [2024-06-07 12:38:03.537070] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:35:40.130 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:40.130 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:40.390 12:38:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:35:40.649 /dev/nbd0 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:40.649 1+0 records in 00:35:40.649 1+0 records out 00:35:40.649 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000614151 s, 6.7 MB/s 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:35:40.649 
12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:40.649 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:35:40.907 /dev/nbd1 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:40.907 1+0 records in 00:35:40.907 1+0 records out 00:35:40.907 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295397 s, 13.9 MB/s 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:35:40.907 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:35:41.164 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:35:41.164 12:38:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:35:41.164 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:35:41.164 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:41.164 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:35:41.164 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:41.164 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:41.421 12:38:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:35:41.679 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:35:41.936 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@745 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:35:42.193 [2024-06-07 12:38:05.771975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:35:42.193 [2024-06-07 12:38:05.772419] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:42.194 [2024-06-07 12:38:05.772514] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:35:42.194 [2024-06-07 12:38:05.772852] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:42.194 [2024-06-07 12:38:05.775314] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:42.194 [2024-06-07 12:38:05.775536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:35:42.194 [2024-06-07 12:38:05.775818] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:35:42.194 [2024-06-07 12:38:05.776013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:35:42.194 [2024-06-07 12:38:05.776311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:35:42.194 spare 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:42.194 12:38:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:42.452 [2024-06-07 12:38:05.876605] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:35:42.452 [2024-06-07 12:38:05.876908] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:35:42.452 [2024-06-07 12:38:05.877216] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000027a60 00:35:42.452 [2024-06-07 12:38:05.877870] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:35:42.452 [2024-06-07 12:38:05.877990] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:35:42.452 [2024-06-07 12:38:05.878203] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:42.710 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:42.710 "name": "raid_bdev1", 00:35:42.710 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:42.710 "strip_size_kb": 0, 00:35:42.710 "state": "online", 00:35:42.710 "raid_level": "raid1", 00:35:42.710 "superblock": true, 00:35:42.710 "num_base_bdevs": 2, 00:35:42.710 "num_base_bdevs_discovered": 2, 00:35:42.710 "num_base_bdevs_operational": 2, 00:35:42.710 "base_bdevs_list": [ 00:35:42.710 { 00:35:42.710 "name": "spare", 00:35:42.710 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:42.710 "is_configured": true, 00:35:42.710 "data_offset": 2048, 00:35:42.710 "data_size": 63488 00:35:42.710 }, 00:35:42.710 { 00:35:42.710 "name": "BaseBdev2", 00:35:42.710 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:42.710 "is_configured": true, 00:35:42.710 "data_offset": 2048, 00:35:42.710 "data_size": 63488 00:35:42.710 } 00:35:42.710 ] 00:35:42.710 }' 00:35:42.710 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:42.710 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:43.277 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:43.277 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:43.277 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:43.277 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:43.277 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:43.277 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:43.277 12:38:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:43.536 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:43.536 "name": "raid_bdev1", 00:35:43.536 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:43.536 "strip_size_kb": 0, 00:35:43.536 "state": "online", 00:35:43.536 "raid_level": "raid1", 00:35:43.536 "superblock": true, 00:35:43.536 "num_base_bdevs": 2, 00:35:43.536 "num_base_bdevs_discovered": 2, 00:35:43.536 "num_base_bdevs_operational": 2, 00:35:43.536 "base_bdevs_list": [ 00:35:43.536 { 00:35:43.536 "name": "spare", 00:35:43.536 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:43.536 "is_configured": true, 00:35:43.536 "data_offset": 2048, 00:35:43.536 "data_size": 63488 00:35:43.536 }, 00:35:43.536 { 00:35:43.536 "name": "BaseBdev2", 00:35:43.536 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:43.536 "is_configured": true, 00:35:43.536 "data_offset": 2048, 00:35:43.536 "data_size": 63488 00:35:43.536 } 00:35:43.536 ] 00:35:43.536 }' 00:35:43.536 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:43.536 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:43.536 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:43.795 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:43.795 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:43.795 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:35:44.053 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:35:44.053 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:35:44.355 [2024-06-07 12:38:07.728894] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:44.355 12:38:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:44.654 12:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:44.654 "name": "raid_bdev1", 00:35:44.654 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:44.654 "strip_size_kb": 0, 00:35:44.654 "state": "online", 00:35:44.654 "raid_level": "raid1", 00:35:44.654 "superblock": true, 00:35:44.654 "num_base_bdevs": 2, 00:35:44.654 "num_base_bdevs_discovered": 1, 00:35:44.654 "num_base_bdevs_operational": 1, 00:35:44.654 "base_bdevs_list": [ 00:35:44.654 { 00:35:44.654 "name": null, 00:35:44.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:44.654 "is_configured": false, 00:35:44.654 "data_offset": 2048, 00:35:44.654 "data_size": 63488 00:35:44.654 }, 00:35:44.654 { 00:35:44.654 "name": "BaseBdev2", 00:35:44.654 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:44.654 "is_configured": true, 00:35:44.654 "data_offset": 2048, 00:35:44.654 "data_size": 63488 00:35:44.654 } 00:35:44.654 ] 00:35:44.654 }' 00:35:44.654 12:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:44.654 12:38:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:45.221 12:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:35:45.480 [2024-06-07 12:38:08.897208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is 
claimed 00:35:45.480 [2024-06-07 12:38:08.897776] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:35:45.480 [2024-06-07 12:38:08.897902] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:35:45.480 [2024-06-07 12:38:08.898086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:35:45.480 [2024-06-07 12:38:08.906895] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000027c00 00:35:45.480 [2024-06-07 12:38:08.909461] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:35:45.480 12:38:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:35:46.415 12:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:46.415 12:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:46.415 12:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:46.415 12:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:46.415 12:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:46.415 12:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:46.415 12:38:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:46.674 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:46.674 "name": "raid_bdev1", 00:35:46.674 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:46.674 "strip_size_kb": 0, 00:35:46.674 "state": "online", 00:35:46.674 "raid_level": "raid1", 00:35:46.674 "superblock": true, 00:35:46.674 "num_base_bdevs": 2, 00:35:46.674 "num_base_bdevs_discovered": 2, 00:35:46.674 "num_base_bdevs_operational": 2, 00:35:46.674 "process": { 00:35:46.674 "type": "rebuild", 00:35:46.674 "target": "spare", 00:35:46.674 "progress": { 00:35:46.674 "blocks": 24576, 00:35:46.674 "percent": 38 00:35:46.674 } 00:35:46.674 }, 00:35:46.674 "base_bdevs_list": [ 00:35:46.674 { 00:35:46.674 "name": "spare", 00:35:46.674 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:46.674 "is_configured": true, 00:35:46.674 "data_offset": 2048, 00:35:46.674 "data_size": 63488 00:35:46.674 }, 00:35:46.674 { 00:35:46.674 "name": "BaseBdev2", 00:35:46.674 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:46.674 "is_configured": true, 00:35:46.674 "data_offset": 2048, 00:35:46.674 "data_size": 63488 00:35:46.674 } 00:35:46.674 ] 00:35:46.674 }' 00:35:46.674 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:46.674 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:46.674 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:46.674 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:46.674 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:35:47.242 [2024-06-07 
12:38:10.583807] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:47.242 [2024-06-07 12:38:10.622367] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:35:47.242 [2024-06-07 12:38:10.622753] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:47.242 [2024-06-07 12:38:10.622813] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:47.242 [2024-06-07 12:38:10.622900] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:47.242 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:47.501 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:47.501 "name": "raid_bdev1", 00:35:47.501 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:47.501 "strip_size_kb": 0, 00:35:47.501 "state": "online", 00:35:47.501 "raid_level": "raid1", 00:35:47.501 "superblock": true, 00:35:47.501 "num_base_bdevs": 2, 00:35:47.501 "num_base_bdevs_discovered": 1, 00:35:47.501 "num_base_bdevs_operational": 1, 00:35:47.501 "base_bdevs_list": [ 00:35:47.501 { 00:35:47.501 "name": null, 00:35:47.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:47.501 "is_configured": false, 00:35:47.501 "data_offset": 2048, 00:35:47.501 "data_size": 63488 00:35:47.501 }, 00:35:47.501 { 00:35:47.501 "name": "BaseBdev2", 00:35:47.501 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:47.501 "is_configured": true, 00:35:47.501 "data_offset": 2048, 00:35:47.501 "data_size": 63488 00:35:47.501 } 00:35:47.501 ] 00:35:47.501 }' 00:35:47.501 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:47.501 12:38:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:48.068 12:38:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:35:48.327 [2024-06-07 12:38:11.839906] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
00:35:48.327 [2024-06-07 12:38:11.840353] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:48.327 [2024-06-07 12:38:11.840564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009c80 00:35:48.327 [2024-06-07 12:38:11.840695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:48.327 [2024-06-07 12:38:11.841178] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:48.327 [2024-06-07 12:38:11.841360] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:35:48.327 [2024-06-07 12:38:11.841586] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:35:48.327 [2024-06-07 12:38:11.841708] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:35:48.327 [2024-06-07 12:38:11.841803] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:35:48.327 [2024-06-07 12:38:11.841908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:35:48.327 [2024-06-07 12:38:11.850775] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000027f40 00:35:48.327 spare 00:35:48.327 [2024-06-07 12:38:11.853375] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:35:48.327 12:38:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:35:49.263 12:38:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:35:49.263 12:38:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:49.263 12:38:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:35:49.263 12:38:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:35:49.263 12:38:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:49.263 12:38:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:49.263 12:38:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:49.829 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:49.829 "name": "raid_bdev1", 00:35:49.829 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:49.829 "strip_size_kb": 0, 00:35:49.829 "state": "online", 00:35:49.829 "raid_level": "raid1", 00:35:49.829 "superblock": true, 00:35:49.829 "num_base_bdevs": 2, 00:35:49.829 "num_base_bdevs_discovered": 2, 00:35:49.829 "num_base_bdevs_operational": 2, 00:35:49.829 "process": { 00:35:49.829 "type": "rebuild", 00:35:49.829 "target": "spare", 00:35:49.829 "progress": { 00:35:49.829 "blocks": 26624, 00:35:49.829 "percent": 41 00:35:49.829 } 00:35:49.829 }, 00:35:49.829 "base_bdevs_list": [ 00:35:49.829 { 00:35:49.829 "name": "spare", 00:35:49.829 "uuid": "b466b501-8b26-54f0-9971-6557d41c8b3e", 00:35:49.829 "is_configured": true, 00:35:49.829 "data_offset": 2048, 00:35:49.829 "data_size": 63488 00:35:49.829 }, 00:35:49.829 { 00:35:49.829 "name": "BaseBdev2", 00:35:49.829 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:49.829 "is_configured": true, 00:35:49.829 "data_offset": 2048, 
00:35:49.829 "data_size": 63488 00:35:49.829 } 00:35:49.829 ] 00:35:49.829 }' 00:35:49.829 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:49.829 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:35:49.829 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:49.829 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:35:49.829 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:35:50.087 [2024-06-07 12:38:13.616090] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:50.087 [2024-06-07 12:38:13.665625] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:35:50.087 [2024-06-07 12:38:13.665963] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:35:50.087 [2024-06-07 12:38:13.666089] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:35:50.087 [2024-06-07 12:38:13.666133] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:50.087 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:50.653 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:50.653 "name": "raid_bdev1", 00:35:50.653 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:50.653 "strip_size_kb": 0, 00:35:50.653 "state": "online", 00:35:50.653 "raid_level": "raid1", 00:35:50.653 "superblock": true, 00:35:50.653 "num_base_bdevs": 2, 00:35:50.653 "num_base_bdevs_discovered": 1, 00:35:50.653 "num_base_bdevs_operational": 1, 00:35:50.653 "base_bdevs_list": [ 00:35:50.653 { 00:35:50.653 "name": null, 00:35:50.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:50.653 "is_configured": false, 00:35:50.653 "data_offset": 2048, 00:35:50.653 "data_size": 63488 00:35:50.653 }, 00:35:50.653 { 00:35:50.653 "name": 
"BaseBdev2", 00:35:50.653 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:50.653 "is_configured": true, 00:35:50.653 "data_offset": 2048, 00:35:50.653 "data_size": 63488 00:35:50.653 } 00:35:50.653 ] 00:35:50.653 }' 00:35:50.653 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:50.653 12:38:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:51.217 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:51.217 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:51.217 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:51.217 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:51.217 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:51.217 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:51.217 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:51.475 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:51.475 "name": "raid_bdev1", 00:35:51.475 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:51.475 "strip_size_kb": 0, 00:35:51.475 "state": "online", 00:35:51.475 "raid_level": "raid1", 00:35:51.475 "superblock": true, 00:35:51.475 "num_base_bdevs": 2, 00:35:51.475 "num_base_bdevs_discovered": 1, 00:35:51.475 "num_base_bdevs_operational": 1, 00:35:51.476 "base_bdevs_list": [ 00:35:51.476 { 00:35:51.476 "name": null, 00:35:51.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:51.476 "is_configured": false, 00:35:51.476 "data_offset": 2048, 00:35:51.476 "data_size": 63488 00:35:51.476 }, 00:35:51.476 { 00:35:51.476 "name": "BaseBdev2", 00:35:51.476 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:51.476 "is_configured": true, 00:35:51.476 "data_offset": 2048, 00:35:51.476 "data_size": 63488 00:35:51.476 } 00:35:51.476 ] 00:35:51.476 }' 00:35:51.476 12:38:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:51.476 12:38:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:51.476 12:38:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:51.476 12:38:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:51.476 12:38:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:35:51.733 12:38:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:35:52.298 [2024-06-07 12:38:15.703516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:35:52.298 [2024-06-07 12:38:15.703937] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:52.298 [2024-06-07 12:38:15.704170] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a280 00:35:52.298 [2024-06-07 12:38:15.704365] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:52.298 [2024-06-07 12:38:15.704926] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:52.298 [2024-06-07 12:38:15.705106] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:35:52.298 [2024-06-07 12:38:15.705370] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:35:52.298 [2024-06-07 12:38:15.705498] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:35:52.298 [2024-06-07 12:38:15.705637] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:35:52.298 BaseBdev1 00:35:52.298 12:38:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:53.231 12:38:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:53.489 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:53.489 "name": "raid_bdev1", 00:35:53.489 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:53.489 "strip_size_kb": 0, 00:35:53.489 "state": "online", 00:35:53.489 "raid_level": "raid1", 00:35:53.489 "superblock": true, 00:35:53.489 "num_base_bdevs": 2, 00:35:53.489 "num_base_bdevs_discovered": 1, 00:35:53.489 "num_base_bdevs_operational": 1, 00:35:53.489 "base_bdevs_list": [ 00:35:53.489 { 00:35:53.489 "name": null, 00:35:53.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:53.489 "is_configured": false, 00:35:53.489 "data_offset": 2048, 00:35:53.489 "data_size": 63488 00:35:53.489 }, 00:35:53.489 { 00:35:53.489 "name": "BaseBdev2", 00:35:53.489 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:53.489 "is_configured": true, 00:35:53.489 "data_offset": 2048, 00:35:53.489 "data_size": 63488 00:35:53.489 } 00:35:53.489 ] 00:35:53.489 }' 00:35:53.489 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:53.489 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:54.421 12:38:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:54.421 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:54.421 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:54.421 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:54.421 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:54.421 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:54.421 12:38:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:54.421 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:54.421 "name": "raid_bdev1", 00:35:54.421 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:54.421 "strip_size_kb": 0, 00:35:54.421 "state": "online", 00:35:54.421 "raid_level": "raid1", 00:35:54.421 "superblock": true, 00:35:54.421 "num_base_bdevs": 2, 00:35:54.421 "num_base_bdevs_discovered": 1, 00:35:54.421 "num_base_bdevs_operational": 1, 00:35:54.421 "base_bdevs_list": [ 00:35:54.421 { 00:35:54.421 "name": null, 00:35:54.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:54.421 "is_configured": false, 00:35:54.421 "data_offset": 2048, 00:35:54.421 "data_size": 63488 00:35:54.421 }, 00:35:54.421 { 00:35:54.421 "name": "BaseBdev2", 00:35:54.421 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:54.421 "is_configured": true, 00:35:54.421 "data_offset": 2048, 00:35:54.421 "data_size": 63488 00:35:54.421 } 00:35:54.421 ] 00:35:54.421 }' 00:35:54.421 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:54.679 12:38:18 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:35:54.679 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:35:54.955 [2024-06-07 12:38:18.356344] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:35:54.955 [2024-06-07 12:38:18.356791] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:35:54.955 [2024-06-07 12:38:18.356915] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:35:54.955 request: 00:35:54.955 { 00:35:54.955 "raid_bdev": "raid_bdev1", 00:35:54.955 "base_bdev": "BaseBdev1", 00:35:54.955 "method": "bdev_raid_add_base_bdev", 00:35:54.955 "req_id": 1 00:35:54.955 } 00:35:54.955 Got JSON-RPC error response 00:35:54.955 response: 00:35:54.955 { 00:35:54.955 "code": -22, 00:35:54.955 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:35:54.955 } 00:35:54.955 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:35:54.955 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:35:54.955 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:35:54.955 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:35:54.955 12:38:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:55.887 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:56.169 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:35:56.169 "name": 
"raid_bdev1", 00:35:56.169 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:56.169 "strip_size_kb": 0, 00:35:56.169 "state": "online", 00:35:56.169 "raid_level": "raid1", 00:35:56.169 "superblock": true, 00:35:56.169 "num_base_bdevs": 2, 00:35:56.169 "num_base_bdevs_discovered": 1, 00:35:56.169 "num_base_bdevs_operational": 1, 00:35:56.169 "base_bdevs_list": [ 00:35:56.169 { 00:35:56.169 "name": null, 00:35:56.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:56.169 "is_configured": false, 00:35:56.169 "data_offset": 2048, 00:35:56.169 "data_size": 63488 00:35:56.169 }, 00:35:56.169 { 00:35:56.169 "name": "BaseBdev2", 00:35:56.169 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:56.169 "is_configured": true, 00:35:56.169 "data_offset": 2048, 00:35:56.169 "data_size": 63488 00:35:56.169 } 00:35:56.169 ] 00:35:56.169 }' 00:35:56.169 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:35:56.169 12:38:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:35:57.102 "name": "raid_bdev1", 00:35:57.102 "uuid": "55005bba-5e0f-4a1b-94bd-6bc2d3759837", 00:35:57.102 "strip_size_kb": 0, 00:35:57.102 "state": "online", 00:35:57.102 "raid_level": "raid1", 00:35:57.102 "superblock": true, 00:35:57.102 "num_base_bdevs": 2, 00:35:57.102 "num_base_bdevs_discovered": 1, 00:35:57.102 "num_base_bdevs_operational": 1, 00:35:57.102 "base_bdevs_list": [ 00:35:57.102 { 00:35:57.102 "name": null, 00:35:57.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:35:57.102 "is_configured": false, 00:35:57.102 "data_offset": 2048, 00:35:57.102 "data_size": 63488 00:35:57.102 }, 00:35:57.102 { 00:35:57.102 "name": "BaseBdev2", 00:35:57.102 "uuid": "7442bca3-03c7-5b97-981f-a7decc3e6fd0", 00:35:57.102 "is_configured": true, 00:35:57.102 "data_offset": 2048, 00:35:57.102 "data_size": 63488 00:35:57.102 } 00:35:57.102 ] 00:35:57.102 }' 00:35:57.102 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 223136 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 
-- # '[' -z 223136 ']' 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 223136 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 223136 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 223136' 00:35:57.360 killing process with pid 223136 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 223136 00:35:57.360 Received shutdown signal, test time was about 29.778482 seconds 00:35:57.360 00:35:57.360 Latency(us) 00:35:57.360 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:57.360 =================================================================================================================== 00:35:57.360 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:57.360 12:38:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 223136 00:35:57.360 [2024-06-07 12:38:20.849254] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:35:57.361 [2024-06-07 12:38:20.849525] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:35:57.361 [2024-06-07 12:38:20.849673] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:35:57.361 [2024-06-07 12:38:20.849752] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:35:57.361 [2024-06-07 12:38:20.901681] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:35:57.926 00:35:57.926 real 0m34.919s 00:35:57.926 user 0m55.813s 00:35:57.926 sys 0m4.882s 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:35:57.926 ************************************ 00:35:57.926 END TEST raid_rebuild_test_sb_io 00:35:57.926 ************************************ 00:35:57.926 12:38:21 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:35:57.926 12:38:21 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:35:57.926 12:38:21 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:35:57.926 12:38:21 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:35:57.926 12:38:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:35:57.926 ************************************ 00:35:57.926 START TEST raid_rebuild_test 00:35:57.926 ************************************ 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false false true 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # echo BaseBdev3 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # echo BaseBdev4 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:35:57.926 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=224016 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 224016 /var/tmp/spdk-raid.sock 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 224016 ']' 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 
00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:35:57.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:35:57.927 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:35:57.927 [2024-06-07 12:38:21.419723] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:35:57.927 [2024-06-07 12:38:21.420322] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid224016 ] 00:35:57.927 I/O size of 3145728 is greater than zero copy threshold (65536). 00:35:57.927 Zero copy mechanism will not be used. 00:35:57.927 [2024-06-07 12:38:21.566361] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:58.184 [2024-06-07 12:38:21.671150] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:35:58.184 [2024-06-07 12:38:21.753760] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:35:58.184 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:35:58.184 12:38:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:35:58.184 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:35:58.184 12:38:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:35:58.748 BaseBdev1_malloc 00:35:58.748 12:38:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:35:58.748 [2024-06-07 12:38:22.349007] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:35:58.748 [2024-06-07 12:38:22.349487] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:58.748 [2024-06-07 12:38:22.349685] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:35:58.748 [2024-06-07 12:38:22.349887] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:58.748 [2024-06-07 12:38:22.352632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:58.748 [2024-06-07 12:38:22.352846] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:35:58.748 BaseBdev1 00:35:58.748 12:38:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:35:58.748 12:38:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:35:59.313 BaseBdev2_malloc 00:35:59.313 12:38:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:35:59.572 [2024-06-07 12:38:22.973504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:35:59.572 [2024-06-07 
12:38:22.973893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:59.572 [2024-06-07 12:38:22.974000] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:35:59.572 [2024-06-07 12:38:22.974175] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:59.572 [2024-06-07 12:38:22.976853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:59.572 [2024-06-07 12:38:22.977082] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:35:59.572 BaseBdev2 00:35:59.572 12:38:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:35:59.572 12:38:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:35:59.858 BaseBdev3_malloc 00:35:59.858 12:38:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:36:00.117 [2024-06-07 12:38:23.501411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:36:00.117 [2024-06-07 12:38:23.501782] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:00.117 [2024-06-07 12:38:23.501933] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:36:00.117 [2024-06-07 12:38:23.502086] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:00.117 [2024-06-07 12:38:23.504394] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:00.117 [2024-06-07 12:38:23.504580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:36:00.117 BaseBdev3 00:36:00.117 12:38:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:00.117 12:38:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:36:00.117 BaseBdev4_malloc 00:36:00.376 12:38:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:36:00.376 [2024-06-07 12:38:23.985304] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:36:00.376 [2024-06-07 12:38:23.985715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:00.376 [2024-06-07 12:38:23.985798] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007e80 00:36:00.376 [2024-06-07 12:38:23.985957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:00.376 [2024-06-07 12:38:23.988305] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:00.376 [2024-06-07 12:38:23.988544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:36:00.376 BaseBdev4 00:36:00.376 12:38:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:36:00.943 spare_malloc 00:36:00.943 12:38:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:36:01.201 spare_delay 00:36:01.201 12:38:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:36:01.201 [2024-06-07 12:38:24.837147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:36:01.201 [2024-06-07 12:38:24.837530] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:01.201 [2024-06-07 12:38:24.837613] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:36:01.202 [2024-06-07 12:38:24.837777] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:01.202 [2024-06-07 12:38:24.840177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:01.202 [2024-06-07 12:38:24.840413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:36:01.202 spare 00:36:01.460 12:38:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:36:01.460 [2024-06-07 12:38:25.065327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:36:01.460 [2024-06-07 12:38:25.067677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:36:01.460 [2024-06-07 12:38:25.067883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:36:01.460 [2024-06-07 12:38:25.067952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:36:01.460 [2024-06-07 12:38:25.068178] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:36:01.460 [2024-06-07 12:38:25.068286] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:36:01.460 [2024-06-07 12:38:25.068547] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:36:01.460 [2024-06-07 12:38:25.068977] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:36:01.460 [2024-06-07 12:38:25.069089] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:36:01.460 [2024-06-07 12:38:25.069405] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:01.460 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:01.718 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:01.718 "name": "raid_bdev1", 00:36:01.718 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:01.718 "strip_size_kb": 0, 00:36:01.718 "state": "online", 00:36:01.718 "raid_level": "raid1", 00:36:01.718 "superblock": false, 00:36:01.718 "num_base_bdevs": 4, 00:36:01.718 "num_base_bdevs_discovered": 4, 00:36:01.718 "num_base_bdevs_operational": 4, 00:36:01.718 "base_bdevs_list": [ 00:36:01.718 { 00:36:01.718 "name": "BaseBdev1", 00:36:01.718 "uuid": "e7ea5c53-7715-595e-b235-2dccc52115f6", 00:36:01.718 "is_configured": true, 00:36:01.718 "data_offset": 0, 00:36:01.718 "data_size": 65536 00:36:01.718 }, 00:36:01.718 { 00:36:01.718 "name": "BaseBdev2", 00:36:01.718 "uuid": "5e3b6262-fea6-5af4-b440-2d5860a98e37", 00:36:01.718 "is_configured": true, 00:36:01.718 "data_offset": 0, 00:36:01.718 "data_size": 65536 00:36:01.718 }, 00:36:01.718 { 00:36:01.718 "name": "BaseBdev3", 00:36:01.718 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:01.718 "is_configured": true, 00:36:01.718 "data_offset": 0, 00:36:01.718 "data_size": 65536 00:36:01.718 }, 00:36:01.718 { 00:36:01.718 "name": "BaseBdev4", 00:36:01.718 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:01.718 "is_configured": true, 00:36:01.719 "data_offset": 0, 00:36:01.719 "data_size": 65536 00:36:01.719 } 00:36:01.719 ] 00:36:01.719 }' 00:36:01.719 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:01.719 12:38:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:02.652 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:36:02.652 12:38:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:02.652 [2024-06-07 12:38:26.249795] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:02.652 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:36:02.652 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:36:02.652 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:02.910 12:38:26 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:36:02.910 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:36:03.233 [2024-06-07 12:38:26.821688] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002940 00:36:03.233 /dev/nbd0 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:03.491 1+0 records in 00:36:03.491 1+0 records out 00:36:03.491 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0001903 s, 21.5 MB/s 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:36:03.491 12:38:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:36:07.753 65536+0 records in 00:36:07.753 65536+0 records out 00:36:07.753 33554432 bytes (34 MB, 32 MiB) copied, 
4.14281 s, 8.1 MB/s 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:36:07.753 [2024-06-07 12:38:31.349459] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:36:07.753 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:36:08.010 [2024-06-07 12:38:31.577254] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:08.011 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:08.268 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:08.268 "name": "raid_bdev1", 00:36:08.268 "uuid": 
"e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:08.268 "strip_size_kb": 0, 00:36:08.268 "state": "online", 00:36:08.268 "raid_level": "raid1", 00:36:08.268 "superblock": false, 00:36:08.268 "num_base_bdevs": 4, 00:36:08.268 "num_base_bdevs_discovered": 3, 00:36:08.268 "num_base_bdevs_operational": 3, 00:36:08.268 "base_bdevs_list": [ 00:36:08.268 { 00:36:08.268 "name": null, 00:36:08.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:08.268 "is_configured": false, 00:36:08.268 "data_offset": 0, 00:36:08.268 "data_size": 65536 00:36:08.268 }, 00:36:08.268 { 00:36:08.268 "name": "BaseBdev2", 00:36:08.268 "uuid": "5e3b6262-fea6-5af4-b440-2d5860a98e37", 00:36:08.268 "is_configured": true, 00:36:08.268 "data_offset": 0, 00:36:08.268 "data_size": 65536 00:36:08.268 }, 00:36:08.268 { 00:36:08.268 "name": "BaseBdev3", 00:36:08.268 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:08.268 "is_configured": true, 00:36:08.268 "data_offset": 0, 00:36:08.268 "data_size": 65536 00:36:08.268 }, 00:36:08.268 { 00:36:08.268 "name": "BaseBdev4", 00:36:08.268 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:08.268 "is_configured": true, 00:36:08.268 "data_offset": 0, 00:36:08.268 "data_size": 65536 00:36:08.268 } 00:36:08.268 ] 00:36:08.268 }' 00:36:08.268 12:38:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:08.268 12:38:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:08.831 12:38:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:36:09.088 [2024-06-07 12:38:32.693383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:09.088 [2024-06-07 12:38:32.699753] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d06560 00:36:09.088 [2024-06-07 12:38:32.701966] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:36:09.088 12:38:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:36:10.455 12:38:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:10.456 12:38:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:10.456 12:38:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:10.456 12:38:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:10.456 12:38:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:10.456 12:38:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:10.456 12:38:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:10.456 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:10.456 "name": "raid_bdev1", 00:36:10.456 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:10.456 "strip_size_kb": 0, 00:36:10.456 "state": "online", 00:36:10.456 "raid_level": "raid1", 00:36:10.456 "superblock": false, 00:36:10.456 "num_base_bdevs": 4, 00:36:10.456 "num_base_bdevs_discovered": 4, 00:36:10.456 "num_base_bdevs_operational": 4, 00:36:10.456 "process": { 00:36:10.456 "type": "rebuild", 00:36:10.456 "target": "spare", 00:36:10.456 "progress": { 00:36:10.456 "blocks": 
24576, 00:36:10.456 "percent": 37 00:36:10.456 } 00:36:10.456 }, 00:36:10.456 "base_bdevs_list": [ 00:36:10.456 { 00:36:10.456 "name": "spare", 00:36:10.456 "uuid": "33c7e267-3a32-521b-999c-b70db84df076", 00:36:10.456 "is_configured": true, 00:36:10.456 "data_offset": 0, 00:36:10.456 "data_size": 65536 00:36:10.456 }, 00:36:10.456 { 00:36:10.456 "name": "BaseBdev2", 00:36:10.456 "uuid": "5e3b6262-fea6-5af4-b440-2d5860a98e37", 00:36:10.456 "is_configured": true, 00:36:10.456 "data_offset": 0, 00:36:10.456 "data_size": 65536 00:36:10.456 }, 00:36:10.456 { 00:36:10.456 "name": "BaseBdev3", 00:36:10.456 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:10.456 "is_configured": true, 00:36:10.456 "data_offset": 0, 00:36:10.456 "data_size": 65536 00:36:10.456 }, 00:36:10.456 { 00:36:10.456 "name": "BaseBdev4", 00:36:10.456 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:10.456 "is_configured": true, 00:36:10.456 "data_offset": 0, 00:36:10.456 "data_size": 65536 00:36:10.456 } 00:36:10.456 ] 00:36:10.456 }' 00:36:10.456 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:10.456 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:10.456 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:10.456 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:10.456 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:36:10.712 [2024-06-07 12:38:34.332200] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:10.991 [2024-06-07 12:38:34.415938] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:36:10.991 [2024-06-07 12:38:34.416104] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:10.991 [2024-06-07 12:38:34.416124] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:10.991 [2024-06-07 12:38:34.416134] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:10.991 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 
-- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:11.249 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:11.249 "name": "raid_bdev1", 00:36:11.249 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:11.249 "strip_size_kb": 0, 00:36:11.249 "state": "online", 00:36:11.249 "raid_level": "raid1", 00:36:11.249 "superblock": false, 00:36:11.249 "num_base_bdevs": 4, 00:36:11.249 "num_base_bdevs_discovered": 3, 00:36:11.249 "num_base_bdevs_operational": 3, 00:36:11.249 "base_bdevs_list": [ 00:36:11.249 { 00:36:11.249 "name": null, 00:36:11.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:11.249 "is_configured": false, 00:36:11.249 "data_offset": 0, 00:36:11.249 "data_size": 65536 00:36:11.249 }, 00:36:11.249 { 00:36:11.249 "name": "BaseBdev2", 00:36:11.249 "uuid": "5e3b6262-fea6-5af4-b440-2d5860a98e37", 00:36:11.249 "is_configured": true, 00:36:11.249 "data_offset": 0, 00:36:11.249 "data_size": 65536 00:36:11.249 }, 00:36:11.249 { 00:36:11.249 "name": "BaseBdev3", 00:36:11.249 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:11.249 "is_configured": true, 00:36:11.249 "data_offset": 0, 00:36:11.249 "data_size": 65536 00:36:11.249 }, 00:36:11.249 { 00:36:11.249 "name": "BaseBdev4", 00:36:11.249 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:11.249 "is_configured": true, 00:36:11.249 "data_offset": 0, 00:36:11.249 "data_size": 65536 00:36:11.249 } 00:36:11.249 ] 00:36:11.249 }' 00:36:11.249 12:38:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:11.249 12:38:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:11.813 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:11.813 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:11.813 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:11.813 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:11.813 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:11.813 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:11.813 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:12.377 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:12.377 "name": "raid_bdev1", 00:36:12.377 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:12.377 "strip_size_kb": 0, 00:36:12.377 "state": "online", 00:36:12.377 "raid_level": "raid1", 00:36:12.377 "superblock": false, 00:36:12.377 "num_base_bdevs": 4, 00:36:12.377 "num_base_bdevs_discovered": 3, 00:36:12.377 "num_base_bdevs_operational": 3, 00:36:12.377 "base_bdevs_list": [ 00:36:12.377 { 00:36:12.377 "name": null, 00:36:12.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:12.377 "is_configured": false, 00:36:12.377 "data_offset": 0, 00:36:12.377 "data_size": 65536 00:36:12.377 }, 00:36:12.377 { 00:36:12.377 "name": "BaseBdev2", 00:36:12.377 "uuid": "5e3b6262-fea6-5af4-b440-2d5860a98e37", 00:36:12.377 "is_configured": true, 00:36:12.377 "data_offset": 0, 00:36:12.377 "data_size": 65536 00:36:12.377 }, 00:36:12.377 { 00:36:12.377 "name": "BaseBdev3", 00:36:12.377 "uuid": 
"62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:12.377 "is_configured": true, 00:36:12.377 "data_offset": 0, 00:36:12.377 "data_size": 65536 00:36:12.377 }, 00:36:12.377 { 00:36:12.377 "name": "BaseBdev4", 00:36:12.377 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:12.377 "is_configured": true, 00:36:12.377 "data_offset": 0, 00:36:12.377 "data_size": 65536 00:36:12.377 } 00:36:12.377 ] 00:36:12.377 }' 00:36:12.377 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:12.377 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:12.377 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:12.377 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:12.377 12:38:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:36:12.633 [2024-06-07 12:38:36.035907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:12.633 [2024-06-07 12:38:36.042166] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000d06700 00:36:12.633 [2024-06-07 12:38:36.044374] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:36:12.633 12:38:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:36:13.565 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:13.565 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:13.565 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:13.565 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:13.565 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:13.565 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:13.565 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:13.822 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:13.822 "name": "raid_bdev1", 00:36:13.822 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:13.822 "strip_size_kb": 0, 00:36:13.822 "state": "online", 00:36:13.822 "raid_level": "raid1", 00:36:13.822 "superblock": false, 00:36:13.822 "num_base_bdevs": 4, 00:36:13.822 "num_base_bdevs_discovered": 4, 00:36:13.822 "num_base_bdevs_operational": 4, 00:36:13.822 "process": { 00:36:13.822 "type": "rebuild", 00:36:13.822 "target": "spare", 00:36:13.822 "progress": { 00:36:13.822 "blocks": 24576, 00:36:13.822 "percent": 37 00:36:13.822 } 00:36:13.822 }, 00:36:13.822 "base_bdevs_list": [ 00:36:13.822 { 00:36:13.822 "name": "spare", 00:36:13.822 "uuid": "33c7e267-3a32-521b-999c-b70db84df076", 00:36:13.822 "is_configured": true, 00:36:13.822 "data_offset": 0, 00:36:13.822 "data_size": 65536 00:36:13.822 }, 00:36:13.822 { 00:36:13.822 "name": "BaseBdev2", 00:36:13.822 "uuid": "5e3b6262-fea6-5af4-b440-2d5860a98e37", 00:36:13.822 "is_configured": true, 00:36:13.822 "data_offset": 0, 00:36:13.822 "data_size": 65536 00:36:13.822 }, 00:36:13.822 { 00:36:13.822 "name": "BaseBdev3", 
00:36:13.822 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:13.822 "is_configured": true, 00:36:13.822 "data_offset": 0, 00:36:13.822 "data_size": 65536 00:36:13.822 }, 00:36:13.822 { 00:36:13.822 "name": "BaseBdev4", 00:36:13.822 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:13.823 "is_configured": true, 00:36:13.823 "data_offset": 0, 00:36:13.823 "data_size": 65536 00:36:13.823 } 00:36:13.823 ] 00:36:13.823 }' 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:36:13.823 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:36:14.081 [2024-06-07 12:38:37.630637] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:36:14.081 [2024-06-07 12:38:37.656637] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000d06700 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:14.081 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:14.369 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:14.369 "name": "raid_bdev1", 00:36:14.369 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:14.369 "strip_size_kb": 0, 00:36:14.369 "state": "online", 00:36:14.369 "raid_level": "raid1", 00:36:14.369 "superblock": false, 00:36:14.369 "num_base_bdevs": 4, 00:36:14.369 "num_base_bdevs_discovered": 3, 00:36:14.369 "num_base_bdevs_operational": 3, 00:36:14.369 "process": { 00:36:14.369 "type": "rebuild", 00:36:14.369 "target": "spare", 00:36:14.369 "progress": { 00:36:14.369 "blocks": 36864, 00:36:14.369 "percent": 56 00:36:14.369 } 00:36:14.369 }, 00:36:14.369 "base_bdevs_list": [ 00:36:14.369 { 00:36:14.369 "name": "spare", 00:36:14.369 "uuid": 
"33c7e267-3a32-521b-999c-b70db84df076", 00:36:14.369 "is_configured": true, 00:36:14.369 "data_offset": 0, 00:36:14.369 "data_size": 65536 00:36:14.369 }, 00:36:14.369 { 00:36:14.369 "name": null, 00:36:14.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:14.369 "is_configured": false, 00:36:14.369 "data_offset": 0, 00:36:14.369 "data_size": 65536 00:36:14.369 }, 00:36:14.369 { 00:36:14.369 "name": "BaseBdev3", 00:36:14.369 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:14.369 "is_configured": true, 00:36:14.369 "data_offset": 0, 00:36:14.369 "data_size": 65536 00:36:14.369 }, 00:36:14.369 { 00:36:14.369 "name": "BaseBdev4", 00:36:14.369 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:14.369 "is_configured": true, 00:36:14.369 "data_offset": 0, 00:36:14.369 "data_size": 65536 00:36:14.369 } 00:36:14.369 ] 00:36:14.369 }' 00:36:14.369 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:14.369 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:14.369 12:38:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=925 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:14.639 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:14.897 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:14.897 "name": "raid_bdev1", 00:36:14.897 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:14.897 "strip_size_kb": 0, 00:36:14.897 "state": "online", 00:36:14.897 "raid_level": "raid1", 00:36:14.897 "superblock": false, 00:36:14.897 "num_base_bdevs": 4, 00:36:14.897 "num_base_bdevs_discovered": 3, 00:36:14.897 "num_base_bdevs_operational": 3, 00:36:14.897 "process": { 00:36:14.897 "type": "rebuild", 00:36:14.897 "target": "spare", 00:36:14.897 "progress": { 00:36:14.897 "blocks": 45056, 00:36:14.897 "percent": 68 00:36:14.897 } 00:36:14.897 }, 00:36:14.897 "base_bdevs_list": [ 00:36:14.897 { 00:36:14.897 "name": "spare", 00:36:14.897 "uuid": "33c7e267-3a32-521b-999c-b70db84df076", 00:36:14.897 "is_configured": true, 00:36:14.897 "data_offset": 0, 00:36:14.897 "data_size": 65536 00:36:14.897 }, 00:36:14.897 { 00:36:14.897 "name": null, 00:36:14.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:14.897 "is_configured": false, 00:36:14.897 "data_offset": 0, 00:36:14.897 "data_size": 65536 00:36:14.897 }, 00:36:14.897 { 00:36:14.897 "name": "BaseBdev3", 00:36:14.897 "uuid": 
"62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:14.897 "is_configured": true, 00:36:14.897 "data_offset": 0, 00:36:14.897 "data_size": 65536 00:36:14.897 }, 00:36:14.897 { 00:36:14.897 "name": "BaseBdev4", 00:36:14.897 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:14.897 "is_configured": true, 00:36:14.897 "data_offset": 0, 00:36:14.897 "data_size": 65536 00:36:14.897 } 00:36:14.897 ] 00:36:14.897 }' 00:36:14.897 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:14.897 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:14.897 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:14.897 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:14.897 12:38:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:36:15.830 [2024-06-07 12:38:39.268877] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:36:15.830 [2024-06-07 12:38:39.268982] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:36:15.830 [2024-06-07 12:38:39.269086] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:15.830 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:16.089 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:16.089 "name": "raid_bdev1", 00:36:16.089 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:16.089 "strip_size_kb": 0, 00:36:16.089 "state": "online", 00:36:16.089 "raid_level": "raid1", 00:36:16.089 "superblock": false, 00:36:16.089 "num_base_bdevs": 4, 00:36:16.089 "num_base_bdevs_discovered": 3, 00:36:16.089 "num_base_bdevs_operational": 3, 00:36:16.089 "base_bdevs_list": [ 00:36:16.089 { 00:36:16.089 "name": "spare", 00:36:16.089 "uuid": "33c7e267-3a32-521b-999c-b70db84df076", 00:36:16.089 "is_configured": true, 00:36:16.089 "data_offset": 0, 00:36:16.089 "data_size": 65536 00:36:16.089 }, 00:36:16.089 { 00:36:16.089 "name": null, 00:36:16.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:16.089 "is_configured": false, 00:36:16.089 "data_offset": 0, 00:36:16.089 "data_size": 65536 00:36:16.089 }, 00:36:16.089 { 00:36:16.089 "name": "BaseBdev3", 00:36:16.089 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:16.089 "is_configured": true, 00:36:16.089 "data_offset": 0, 00:36:16.089 "data_size": 65536 00:36:16.089 }, 00:36:16.089 { 00:36:16.089 "name": "BaseBdev4", 00:36:16.089 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:16.089 
"is_configured": true, 00:36:16.089 "data_offset": 0, 00:36:16.089 "data_size": 65536 00:36:16.089 } 00:36:16.089 ] 00:36:16.089 }' 00:36:16.089 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:16.347 12:38:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:16.605 "name": "raid_bdev1", 00:36:16.605 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:16.605 "strip_size_kb": 0, 00:36:16.605 "state": "online", 00:36:16.605 "raid_level": "raid1", 00:36:16.605 "superblock": false, 00:36:16.605 "num_base_bdevs": 4, 00:36:16.605 "num_base_bdevs_discovered": 3, 00:36:16.605 "num_base_bdevs_operational": 3, 00:36:16.605 "base_bdevs_list": [ 00:36:16.605 { 00:36:16.605 "name": "spare", 00:36:16.605 "uuid": "33c7e267-3a32-521b-999c-b70db84df076", 00:36:16.605 "is_configured": true, 00:36:16.605 "data_offset": 0, 00:36:16.605 "data_size": 65536 00:36:16.605 }, 00:36:16.605 { 00:36:16.605 "name": null, 00:36:16.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:16.605 "is_configured": false, 00:36:16.605 "data_offset": 0, 00:36:16.605 "data_size": 65536 00:36:16.605 }, 00:36:16.605 { 00:36:16.605 "name": "BaseBdev3", 00:36:16.605 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:16.605 "is_configured": true, 00:36:16.605 "data_offset": 0, 00:36:16.605 "data_size": 65536 00:36:16.605 }, 00:36:16.605 { 00:36:16.605 "name": "BaseBdev4", 00:36:16.605 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:16.605 "is_configured": true, 00:36:16.605 "data_offset": 0, 00:36:16.605 "data_size": 65536 00:36:16.605 } 00:36:16.605 ] 00:36:16.605 }' 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:16.605 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:16.864 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:16.864 "name": "raid_bdev1", 00:36:16.864 "uuid": "e348b614-c6b8-43ba-978a-f33e919e1fd1", 00:36:16.864 "strip_size_kb": 0, 00:36:16.864 "state": "online", 00:36:16.864 "raid_level": "raid1", 00:36:16.864 "superblock": false, 00:36:16.864 "num_base_bdevs": 4, 00:36:16.864 "num_base_bdevs_discovered": 3, 00:36:16.864 "num_base_bdevs_operational": 3, 00:36:16.864 "base_bdevs_list": [ 00:36:16.864 { 00:36:16.864 "name": "spare", 00:36:16.864 "uuid": "33c7e267-3a32-521b-999c-b70db84df076", 00:36:16.864 "is_configured": true, 00:36:16.864 "data_offset": 0, 00:36:16.864 "data_size": 65536 00:36:16.864 }, 00:36:16.864 { 00:36:16.864 "name": null, 00:36:16.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:16.864 "is_configured": false, 00:36:16.864 "data_offset": 0, 00:36:16.864 "data_size": 65536 00:36:16.864 }, 00:36:16.864 { 00:36:16.864 "name": "BaseBdev3", 00:36:16.864 "uuid": "62f59c9a-d518-5217-949e-901e3fd1d5f8", 00:36:16.864 "is_configured": true, 00:36:16.864 "data_offset": 0, 00:36:16.864 "data_size": 65536 00:36:16.864 }, 00:36:16.864 { 00:36:16.864 "name": "BaseBdev4", 00:36:16.864 "uuid": "f6351bd9-5352-50c5-aac1-5da15f5bea3a", 00:36:16.864 "is_configured": true, 00:36:16.864 "data_offset": 0, 00:36:16.864 "data_size": 65536 00:36:16.864 } 00:36:16.864 ] 00:36:16.864 }' 00:36:16.864 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:16.864 12:38:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:17.430 12:38:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:36:17.688 [2024-06-07 12:38:41.232666] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:17.688 [2024-06-07 12:38:41.232954] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:17.688 [2024-06-07 12:38:41.233130] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:17.688 [2024-06-07 12:38:41.233317] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:17.688 [2024-06-07 12:38:41.233407] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:36:17.688 12:38:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:17.688 12:38:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:36:17.946 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:36:18.204 /dev/nbd0 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:36:18.204 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:18.204 1+0 records in 00:36:18.204 1+0 records out 00:36:18.205 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000467667 s, 8.8 MB/s 00:36:18.205 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:18.205 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:36:18.205 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:18.205 12:38:41 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:36:18.205 12:38:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:36:18.205 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:18.205 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:36:18.205 12:38:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:36:18.463 /dev/nbd1 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:18.463 1+0 records in 00:36:18.463 1+0 records out 00:36:18.463 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000496774 s, 8.2 MB/s 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:36:18.463 12:38:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:36:18.721 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:36:18.978 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 224016 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 224016 ']' 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 224016 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 224016 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 224016' 00:36:19.237 killing process with pid 224016 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 224016 00:36:19.237 Received shutdown signal, test time was about 60.000000 seconds 00:36:19.237 00:36:19.237 Latency(us) 00:36:19.237 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:19.237 =================================================================================================================== 00:36:19.237 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 
0.00 00:36:19.237 12:38:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 224016 00:36:19.237 [2024-06-07 12:38:42.670611] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:36:19.237 [2024-06-07 12:38:42.771043] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:36:19.805 00:36:19.805 real 0m21.785s 00:36:19.805 user 0m31.124s 00:36:19.805 sys 0m5.126s 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:36:19.805 ************************************ 00:36:19.805 END TEST raid_rebuild_test 00:36:19.805 ************************************ 00:36:19.805 12:38:43 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:36:19.805 12:38:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:36:19.805 12:38:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:36:19.805 12:38:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:36:19.805 ************************************ 00:36:19.805 START TEST raid_rebuild_test_sb 00:36:19.805 ************************************ 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true false true 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # echo BaseBdev3 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # echo BaseBdev4 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 
'BaseBdev3' 'BaseBdev4') 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=224527 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 224527 /var/tmp/spdk-raid.sock 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 224527 ']' 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:36:19.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:36:19.805 12:38:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:19.805 [2024-06-07 12:38:43.266917] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:36:19.805 [2024-06-07 12:38:43.267437] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid224527 ] 00:36:19.805 I/O size of 3145728 is greater than zero copy threshold (65536). 00:36:19.805 Zero copy mechanism will not be used. 
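At this point bdevperf has been launched idle (-z) against /var/tmp/spdk-raid.sock and the harness is waiting for its RPC socket; every bdev in the rest of this test is then created over that socket with scripts/rpc.py. A minimal sketch of that setup phase follows, using only the bdevperf flags and RPC calls that appear in this trace (bdev_malloc_create, bdev_passthru_create, bdev_raid_create -s, bdev_raid_get_bdevs); the SPDK_DIR variable and the rpc() helper are scaffolding added here for readability, not part of the test scripts:

    #!/usr/bin/env bash
    # Sketch of the raid_rebuild_test_sb setup phase, driven over the bdevperf RPC socket.
    # Assumption: SPDK_DIR points at a built SPDK repo; paths mirror the trace above.
    SPDK_DIR=${SPDK_DIR:-/home/vagrant/spdk_repo/spdk}
    sock=/var/tmp/spdk-raid.sock
    rpc() { "$SPDK_DIR/scripts/rpc.py" -s "$sock" "$@"; }

    # Start bdevperf idle (-z) so the base bdevs can be created over RPC first.
    "$SPDK_DIR/build/examples/bdevperf" -r "$sock" -T raid_bdev1 -t 60 \
        -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # The harness blocks in waitforlisten (test/common/autotest_common.sh);
    # polling any RPC until the socket answers is equivalent.
    until rpc rpc_get_methods >/dev/null 2>&1; do sleep 0.2; done

    # Four 32 MiB malloc bdevs (512 B blocks), each wrapped in a passthru bdev.
    for i in 1 2 3 4; do
        rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
        rpc bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done

    # raid1 with an on-disk superblock (-s), as in bdev_raid.sh@611 below.
    rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

    # State is then polled the same way verify_raid_bdev_state does:
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The -s flag is the only setup difference from the plain raid_rebuild_test that just finished: with a superblock, data_offset in the JSON dumps below moves from 0 to 2048 and data_size shrinks from 65536 to 63488 blocks, since the first 2048 blocks of each base bdev are reserved for the superblock.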
00:36:19.805 [2024-06-07 12:38:43.410183] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:20.063 [2024-06-07 12:38:43.503377] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:36:20.063 [2024-06-07 12:38:43.588854] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:20.996 12:38:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:36:20.996 12:38:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:36:20.996 12:38:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:20.996 12:38:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:36:20.996 BaseBdev1_malloc 00:36:20.996 12:38:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:36:21.253 [2024-06-07 12:38:44.836558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:36:21.253 [2024-06-07 12:38:44.836857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:21.253 [2024-06-07 12:38:44.836956] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:36:21.253 [2024-06-07 12:38:44.838796] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:21.253 [2024-06-07 12:38:44.841719] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:21.253 [2024-06-07 12:38:44.841943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:36:21.253 BaseBdev1 00:36:21.253 12:38:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:21.253 12:38:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:36:21.820 BaseBdev2_malloc 00:36:21.820 12:38:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:36:22.078 [2024-06-07 12:38:45.466893] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:36:22.078 [2024-06-07 12:38:45.467264] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:22.078 [2024-06-07 12:38:45.467362] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:36:22.078 [2024-06-07 12:38:45.467616] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:22.078 [2024-06-07 12:38:45.469904] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:22.078 [2024-06-07 12:38:45.470082] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:36:22.078 BaseBdev2 00:36:22.078 12:38:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:22.078 12:38:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:36:22.337 BaseBdev3_malloc 00:36:22.337 12:38:45 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:36:22.594 [2024-06-07 12:38:46.074829] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:36:22.594 [2024-06-07 12:38:46.075219] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:22.594 [2024-06-07 12:38:46.075328] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:36:22.594 [2024-06-07 12:38:46.075514] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:22.594 [2024-06-07 12:38:46.077829] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:22.594 [2024-06-07 12:38:46.078028] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:36:22.594 BaseBdev3 00:36:22.594 12:38:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:22.594 12:38:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:36:22.852 BaseBdev4_malloc 00:36:22.852 12:38:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:36:23.110 [2024-06-07 12:38:46.570749] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:36:23.110 [2024-06-07 12:38:46.571114] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:23.110 [2024-06-07 12:38:46.571325] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007e80 00:36:23.110 [2024-06-07 12:38:46.571503] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:23.110 [2024-06-07 12:38:46.574181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:23.110 [2024-06-07 12:38:46.574405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:36:23.110 BaseBdev4 00:36:23.110 12:38:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:36:23.368 spare_malloc 00:36:23.368 12:38:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:36:23.627 spare_delay 00:36:23.627 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:36:23.887 [2024-06-07 12:38:47.334982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:36:23.887 [2024-06-07 12:38:47.335738] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:23.887 [2024-06-07 12:38:47.335976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:36:23.887 [2024-06-07 12:38:47.336220] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:23.887 [2024-06-07 12:38:47.338648] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:23.887 [2024-06-07 
12:38:47.338882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:36:23.887 spare 00:36:23.887 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:36:24.147 [2024-06-07 12:38:47.563406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:36:24.147 [2024-06-07 12:38:47.565726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:36:24.147 [2024-06-07 12:38:47.565934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:36:24.147 [2024-06-07 12:38:47.566056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:36:24.147 [2024-06-07 12:38:47.566290] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:36:24.147 [2024-06-07 12:38:47.566336] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:24.147 [2024-06-07 12:38:47.566599] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:36:24.147 [2024-06-07 12:38:47.567051] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:36:24.147 [2024-06-07 12:38:47.567151] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:36:24.147 [2024-06-07 12:38:47.567425] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:24.147 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:24.423 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:24.423 "name": "raid_bdev1", 00:36:24.423 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:24.423 "strip_size_kb": 0, 00:36:24.423 "state": "online", 00:36:24.423 "raid_level": "raid1", 00:36:24.423 "superblock": true, 00:36:24.423 "num_base_bdevs": 4, 00:36:24.423 "num_base_bdevs_discovered": 4, 00:36:24.423 "num_base_bdevs_operational": 4, 00:36:24.423 "base_bdevs_list": [ 00:36:24.423 { 
00:36:24.423 "name": "BaseBdev1", 00:36:24.423 "uuid": "6908b86f-f587-5f73-9723-756e4d83d456", 00:36:24.423 "is_configured": true, 00:36:24.423 "data_offset": 2048, 00:36:24.423 "data_size": 63488 00:36:24.423 }, 00:36:24.423 { 00:36:24.423 "name": "BaseBdev2", 00:36:24.423 "uuid": "6704e151-f850-5382-bbfe-bcefe5e94b0a", 00:36:24.423 "is_configured": true, 00:36:24.423 "data_offset": 2048, 00:36:24.423 "data_size": 63488 00:36:24.423 }, 00:36:24.423 { 00:36:24.423 "name": "BaseBdev3", 00:36:24.423 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:24.423 "is_configured": true, 00:36:24.423 "data_offset": 2048, 00:36:24.423 "data_size": 63488 00:36:24.423 }, 00:36:24.423 { 00:36:24.423 "name": "BaseBdev4", 00:36:24.423 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:24.423 "is_configured": true, 00:36:24.423 "data_offset": 2048, 00:36:24.423 "data_size": 63488 00:36:24.423 } 00:36:24.423 ] 00:36:24.423 }' 00:36:24.424 12:38:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:24.424 12:38:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:25.017 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:36:25.017 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:36:25.275 [2024-06-07 12:38:48.715831] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:36:25.275 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:36:25.275 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:25.275 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:36:25.534 12:38:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:36:25.793 [2024-06-07 12:38:49.255743] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002940 00:36:25.793 /dev/nbd0 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:25.793 1+0 records in 00:36:25.793 1+0 records out 00:36:25.793 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000477189 s, 8.6 MB/s 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:36:25.793 12:38:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:36:30.000 63488+0 records in 00:36:30.000 63488+0 records out 00:36:30.000 32505856 bytes (33 MB, 31 MiB) copied, 3.6764 s, 8.8 MB/s 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:36:30.000 [2024-06-07 12:38:53.287196] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:36:30.000 [2024-06-07 12:38:53.590872] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:30.000 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:30.566 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:30.566 "name": "raid_bdev1", 00:36:30.566 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:30.566 "strip_size_kb": 0, 00:36:30.566 "state": "online", 00:36:30.566 "raid_level": "raid1", 00:36:30.566 "superblock": true, 00:36:30.566 "num_base_bdevs": 4, 00:36:30.566 "num_base_bdevs_discovered": 3, 00:36:30.566 "num_base_bdevs_operational": 3, 00:36:30.566 "base_bdevs_list": [ 00:36:30.566 { 00:36:30.566 "name": null, 00:36:30.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:30.566 "is_configured": false, 00:36:30.566 "data_offset": 2048, 00:36:30.566 "data_size": 63488 00:36:30.566 }, 00:36:30.566 { 00:36:30.566 "name": "BaseBdev2", 00:36:30.566 "uuid": "6704e151-f850-5382-bbfe-bcefe5e94b0a", 00:36:30.566 "is_configured": true, 00:36:30.566 "data_offset": 2048, 
00:36:30.566 "data_size": 63488 00:36:30.566 }, 00:36:30.566 { 00:36:30.566 "name": "BaseBdev3", 00:36:30.566 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:30.566 "is_configured": true, 00:36:30.566 "data_offset": 2048, 00:36:30.566 "data_size": 63488 00:36:30.566 }, 00:36:30.566 { 00:36:30.566 "name": "BaseBdev4", 00:36:30.566 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:30.566 "is_configured": true, 00:36:30.566 "data_offset": 2048, 00:36:30.566 "data_size": 63488 00:36:30.566 } 00:36:30.566 ] 00:36:30.566 }' 00:36:30.566 12:38:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:30.566 12:38:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:31.131 12:38:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:36:31.389 [2024-06-07 12:38:54.830997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:31.389 [2024-06-07 12:38:54.837573] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000c3e5c0 00:36:31.389 [2024-06-07 12:38:54.840247] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:36:31.389 12:38:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:36:32.365 12:38:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:32.365 12:38:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:32.365 12:38:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:32.365 12:38:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:32.365 12:38:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:32.365 12:38:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:32.365 12:38:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:32.623 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:32.623 "name": "raid_bdev1", 00:36:32.623 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:32.623 "strip_size_kb": 0, 00:36:32.623 "state": "online", 00:36:32.623 "raid_level": "raid1", 00:36:32.623 "superblock": true, 00:36:32.623 "num_base_bdevs": 4, 00:36:32.623 "num_base_bdevs_discovered": 4, 00:36:32.623 "num_base_bdevs_operational": 4, 00:36:32.623 "process": { 00:36:32.623 "type": "rebuild", 00:36:32.623 "target": "spare", 00:36:32.623 "progress": { 00:36:32.623 "blocks": 24576, 00:36:32.623 "percent": 38 00:36:32.623 } 00:36:32.623 }, 00:36:32.623 "base_bdevs_list": [ 00:36:32.623 { 00:36:32.623 "name": "spare", 00:36:32.623 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:32.623 "is_configured": true, 00:36:32.623 "data_offset": 2048, 00:36:32.623 "data_size": 63488 00:36:32.623 }, 00:36:32.623 { 00:36:32.623 "name": "BaseBdev2", 00:36:32.623 "uuid": "6704e151-f850-5382-bbfe-bcefe5e94b0a", 00:36:32.623 "is_configured": true, 00:36:32.623 "data_offset": 2048, 00:36:32.623 "data_size": 63488 00:36:32.623 }, 00:36:32.623 { 00:36:32.623 "name": "BaseBdev3", 00:36:32.623 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:32.623 
"is_configured": true, 00:36:32.623 "data_offset": 2048, 00:36:32.623 "data_size": 63488 00:36:32.623 }, 00:36:32.623 { 00:36:32.623 "name": "BaseBdev4", 00:36:32.623 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:32.623 "is_configured": true, 00:36:32.623 "data_offset": 2048, 00:36:32.623 "data_size": 63488 00:36:32.623 } 00:36:32.623 ] 00:36:32.623 }' 00:36:32.624 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:32.624 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:32.624 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:32.624 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:32.624 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:36:32.902 [2024-06-07 12:38:56.474769] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:33.160 [2024-06-07 12:38:56.554382] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:36:33.160 [2024-06-07 12:38:56.555293] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:33.160 [2024-06-07 12:38:56.555459] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:33.160 [2024-06-07 12:38:56.555581] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:33.160 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:33.417 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:33.417 "name": "raid_bdev1", 00:36:33.417 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:33.417 "strip_size_kb": 0, 00:36:33.417 "state": "online", 00:36:33.417 "raid_level": "raid1", 00:36:33.417 "superblock": true, 00:36:33.417 "num_base_bdevs": 4, 00:36:33.417 "num_base_bdevs_discovered": 3, 00:36:33.417 "num_base_bdevs_operational": 3, 00:36:33.417 "base_bdevs_list": [ 00:36:33.417 { 
00:36:33.417 "name": null, 00:36:33.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:33.417 "is_configured": false, 00:36:33.417 "data_offset": 2048, 00:36:33.417 "data_size": 63488 00:36:33.417 }, 00:36:33.417 { 00:36:33.417 "name": "BaseBdev2", 00:36:33.417 "uuid": "6704e151-f850-5382-bbfe-bcefe5e94b0a", 00:36:33.417 "is_configured": true, 00:36:33.417 "data_offset": 2048, 00:36:33.417 "data_size": 63488 00:36:33.417 }, 00:36:33.417 { 00:36:33.417 "name": "BaseBdev3", 00:36:33.417 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:33.417 "is_configured": true, 00:36:33.417 "data_offset": 2048, 00:36:33.417 "data_size": 63488 00:36:33.417 }, 00:36:33.417 { 00:36:33.417 "name": "BaseBdev4", 00:36:33.417 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:33.417 "is_configured": true, 00:36:33.417 "data_offset": 2048, 00:36:33.417 "data_size": 63488 00:36:33.417 } 00:36:33.417 ] 00:36:33.417 }' 00:36:33.417 12:38:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:33.417 12:38:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:34.002 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:34.002 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:34.002 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:34.002 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:34.002 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:34.002 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:34.002 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:34.261 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:34.261 "name": "raid_bdev1", 00:36:34.261 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:34.261 "strip_size_kb": 0, 00:36:34.261 "state": "online", 00:36:34.261 "raid_level": "raid1", 00:36:34.261 "superblock": true, 00:36:34.261 "num_base_bdevs": 4, 00:36:34.261 "num_base_bdevs_discovered": 3, 00:36:34.261 "num_base_bdevs_operational": 3, 00:36:34.261 "base_bdevs_list": [ 00:36:34.261 { 00:36:34.261 "name": null, 00:36:34.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:34.261 "is_configured": false, 00:36:34.261 "data_offset": 2048, 00:36:34.261 "data_size": 63488 00:36:34.261 }, 00:36:34.261 { 00:36:34.261 "name": "BaseBdev2", 00:36:34.261 "uuid": "6704e151-f850-5382-bbfe-bcefe5e94b0a", 00:36:34.261 "is_configured": true, 00:36:34.261 "data_offset": 2048, 00:36:34.261 "data_size": 63488 00:36:34.261 }, 00:36:34.261 { 00:36:34.261 "name": "BaseBdev3", 00:36:34.261 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:34.261 "is_configured": true, 00:36:34.261 "data_offset": 2048, 00:36:34.261 "data_size": 63488 00:36:34.261 }, 00:36:34.261 { 00:36:34.261 "name": "BaseBdev4", 00:36:34.261 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:34.261 "is_configured": true, 00:36:34.261 "data_offset": 2048, 00:36:34.261 "data_size": 63488 00:36:34.261 } 00:36:34.261 ] 00:36:34.261 }' 00:36:34.261 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:34.261 12:38:57 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:34.261 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:34.261 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:34.261 12:38:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:36:34.518 [2024-06-07 12:38:58.049514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:34.518 [2024-06-07 12:38:58.055945] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000c3e760 00:36:34.518 [2024-06-07 12:38:58.058310] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:36:34.518 12:38:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:36:35.452 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:35.452 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:35.452 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:35.452 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:35.452 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:35.452 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:35.452 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:36.018 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:36.018 "name": "raid_bdev1", 00:36:36.018 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:36.018 "strip_size_kb": 0, 00:36:36.018 "state": "online", 00:36:36.018 "raid_level": "raid1", 00:36:36.018 "superblock": true, 00:36:36.018 "num_base_bdevs": 4, 00:36:36.018 "num_base_bdevs_discovered": 4, 00:36:36.018 "num_base_bdevs_operational": 4, 00:36:36.018 "process": { 00:36:36.018 "type": "rebuild", 00:36:36.018 "target": "spare", 00:36:36.018 "progress": { 00:36:36.018 "blocks": 24576, 00:36:36.018 "percent": 38 00:36:36.018 } 00:36:36.019 }, 00:36:36.019 "base_bdevs_list": [ 00:36:36.019 { 00:36:36.019 "name": "spare", 00:36:36.019 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:36.019 "is_configured": true, 00:36:36.019 "data_offset": 2048, 00:36:36.019 "data_size": 63488 00:36:36.019 }, 00:36:36.019 { 00:36:36.019 "name": "BaseBdev2", 00:36:36.019 "uuid": "6704e151-f850-5382-bbfe-bcefe5e94b0a", 00:36:36.019 "is_configured": true, 00:36:36.019 "data_offset": 2048, 00:36:36.019 "data_size": 63488 00:36:36.019 }, 00:36:36.019 { 00:36:36.019 "name": "BaseBdev3", 00:36:36.019 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:36.019 "is_configured": true, 00:36:36.019 "data_offset": 2048, 00:36:36.019 "data_size": 63488 00:36:36.019 }, 00:36:36.019 { 00:36:36.019 "name": "BaseBdev4", 00:36:36.019 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:36.019 "is_configured": true, 00:36:36.019 "data_offset": 2048, 00:36:36.019 "data_size": 63488 00:36:36.019 } 00:36:36.019 ] 00:36:36.019 }' 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:36:36.019 /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:36:36.019 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:36:36.277 [2024-06-07 12:38:59.731854] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:36:36.277 [2024-06-07 12:38:59.871143] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000c3e760 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:36.277 12:38:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:36.536 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:36.536 "name": "raid_bdev1", 00:36:36.536 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:36.536 "strip_size_kb": 0, 00:36:36.536 "state": "online", 00:36:36.536 "raid_level": "raid1", 00:36:36.536 "superblock": true, 00:36:36.536 "num_base_bdevs": 4, 00:36:36.536 "num_base_bdevs_discovered": 3, 00:36:36.536 "num_base_bdevs_operational": 3, 00:36:36.536 "process": { 00:36:36.536 "type": "rebuild", 00:36:36.536 "target": "spare", 00:36:36.536 "progress": { 00:36:36.536 "blocks": 38912, 00:36:36.536 "percent": 61 00:36:36.536 } 00:36:36.536 }, 00:36:36.536 "base_bdevs_list": [ 00:36:36.536 { 00:36:36.536 "name": "spare", 00:36:36.536 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:36.536 "is_configured": true, 00:36:36.536 "data_offset": 2048, 00:36:36.536 "data_size": 63488 00:36:36.536 }, 00:36:36.536 { 00:36:36.536 "name": null, 00:36:36.536 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:36:36.536 "is_configured": false, 00:36:36.536 "data_offset": 2048, 00:36:36.536 "data_size": 63488 00:36:36.536 }, 00:36:36.536 { 00:36:36.536 "name": "BaseBdev3", 00:36:36.536 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:36.536 "is_configured": true, 00:36:36.536 "data_offset": 2048, 00:36:36.536 "data_size": 63488 00:36:36.536 }, 00:36:36.536 { 00:36:36.536 "name": "BaseBdev4", 00:36:36.536 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:36.536 "is_configured": true, 00:36:36.536 "data_offset": 2048, 00:36:36.536 "data_size": 63488 00:36:36.536 } 00:36:36.536 ] 00:36:36.536 }' 00:36:36.536 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=947 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:36.793 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:37.051 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:37.051 "name": "raid_bdev1", 00:36:37.051 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:37.051 "strip_size_kb": 0, 00:36:37.051 "state": "online", 00:36:37.051 "raid_level": "raid1", 00:36:37.051 "superblock": true, 00:36:37.051 "num_base_bdevs": 4, 00:36:37.051 "num_base_bdevs_discovered": 3, 00:36:37.051 "num_base_bdevs_operational": 3, 00:36:37.051 "process": { 00:36:37.051 "type": "rebuild", 00:36:37.051 "target": "spare", 00:36:37.051 "progress": { 00:36:37.051 "blocks": 47104, 00:36:37.051 "percent": 74 00:36:37.051 } 00:36:37.051 }, 00:36:37.051 "base_bdevs_list": [ 00:36:37.051 { 00:36:37.051 "name": "spare", 00:36:37.051 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:37.051 "is_configured": true, 00:36:37.051 "data_offset": 2048, 00:36:37.051 "data_size": 63488 00:36:37.051 }, 00:36:37.051 { 00:36:37.051 "name": null, 00:36:37.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:37.051 "is_configured": false, 00:36:37.051 "data_offset": 2048, 00:36:37.051 "data_size": 63488 00:36:37.051 }, 00:36:37.051 { 00:36:37.051 "name": "BaseBdev3", 00:36:37.051 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:37.051 "is_configured": true, 00:36:37.051 "data_offset": 2048, 00:36:37.051 "data_size": 63488 00:36:37.051 }, 
00:36:37.051 { 00:36:37.051 "name": "BaseBdev4", 00:36:37.051 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:37.051 "is_configured": true, 00:36:37.051 "data_offset": 2048, 00:36:37.051 "data_size": 63488 00:36:37.051 } 00:36:37.051 ] 00:36:37.051 }' 00:36:37.051 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:37.051 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:37.051 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:37.051 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:37.051 12:39:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:36:38.010 [2024-06-07 12:39:01.281843] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:36:38.010 [2024-06-07 12:39:01.281965] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:36:38.010 [2024-06-07 12:39:01.282666] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:38.010 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:38.268 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:38.268 "name": "raid_bdev1", 00:36:38.268 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:38.268 "strip_size_kb": 0, 00:36:38.268 "state": "online", 00:36:38.268 "raid_level": "raid1", 00:36:38.268 "superblock": true, 00:36:38.268 "num_base_bdevs": 4, 00:36:38.268 "num_base_bdevs_discovered": 3, 00:36:38.268 "num_base_bdevs_operational": 3, 00:36:38.268 "base_bdevs_list": [ 00:36:38.268 { 00:36:38.268 "name": "spare", 00:36:38.268 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:38.268 "is_configured": true, 00:36:38.268 "data_offset": 2048, 00:36:38.268 "data_size": 63488 00:36:38.268 }, 00:36:38.268 { 00:36:38.268 "name": null, 00:36:38.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:38.268 "is_configured": false, 00:36:38.268 "data_offset": 2048, 00:36:38.268 "data_size": 63488 00:36:38.268 }, 00:36:38.268 { 00:36:38.268 "name": "BaseBdev3", 00:36:38.268 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:38.268 "is_configured": true, 00:36:38.268 "data_offset": 2048, 00:36:38.268 "data_size": 63488 00:36:38.268 }, 00:36:38.268 { 00:36:38.268 "name": "BaseBdev4", 00:36:38.268 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:38.268 "is_configured": true, 00:36:38.268 "data_offset": 2048, 00:36:38.268 "data_size": 63488 00:36:38.268 } 
00:36:38.268 ] 00:36:38.268 }' 00:36:38.268 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:38.526 12:39:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:38.784 "name": "raid_bdev1", 00:36:38.784 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:38.784 "strip_size_kb": 0, 00:36:38.784 "state": "online", 00:36:38.784 "raid_level": "raid1", 00:36:38.784 "superblock": true, 00:36:38.784 "num_base_bdevs": 4, 00:36:38.784 "num_base_bdevs_discovered": 3, 00:36:38.784 "num_base_bdevs_operational": 3, 00:36:38.784 "base_bdevs_list": [ 00:36:38.784 { 00:36:38.784 "name": "spare", 00:36:38.784 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:38.784 "is_configured": true, 00:36:38.784 "data_offset": 2048, 00:36:38.784 "data_size": 63488 00:36:38.784 }, 00:36:38.784 { 00:36:38.784 "name": null, 00:36:38.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:38.784 "is_configured": false, 00:36:38.784 "data_offset": 2048, 00:36:38.784 "data_size": 63488 00:36:38.784 }, 00:36:38.784 { 00:36:38.784 "name": "BaseBdev3", 00:36:38.784 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:38.784 "is_configured": true, 00:36:38.784 "data_offset": 2048, 00:36:38.784 "data_size": 63488 00:36:38.784 }, 00:36:38.784 { 00:36:38.784 "name": "BaseBdev4", 00:36:38.784 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:38.784 "is_configured": true, 00:36:38.784 "data_offset": 2048, 00:36:38.784 "data_size": 63488 00:36:38.784 } 00:36:38.784 ] 00:36:38.784 }' 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
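A note on the "[: =: unary operator expected" message logged earlier in this trace: it comes from bdev_raid.sh line 665 evaluating '[' = false ']', i.e. an empty variable expanded unquoted inside [ ... ], so [ sees only "= false" and complains; the run continues because the failed test simply behaves as false and the script falls through to the next branch. A minimal reproduction with the usual fixes (the variable name here is illustrative, not the script's):

  flag=                                 # empty, as in the trace
  [ $flag = false ] && echo degraded    # expands to: [ = false ]  -> "[: =: unary operator expected"
  [ "$flag" = false ] && echo degraded  # quoted: [ "" = false ]   -> cleanly false, no error
  [[ $flag = false ]] && echo degraded  # [[ ]] does not word-split, so no quoting needed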
00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:38.784 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:39.041 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:39.041 "name": "raid_bdev1", 00:36:39.041 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:39.041 "strip_size_kb": 0, 00:36:39.041 "state": "online", 00:36:39.041 "raid_level": "raid1", 00:36:39.041 "superblock": true, 00:36:39.041 "num_base_bdevs": 4, 00:36:39.041 "num_base_bdevs_discovered": 3, 00:36:39.041 "num_base_bdevs_operational": 3, 00:36:39.041 "base_bdevs_list": [ 00:36:39.041 { 00:36:39.041 "name": "spare", 00:36:39.041 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:39.041 "is_configured": true, 00:36:39.041 "data_offset": 2048, 00:36:39.041 "data_size": 63488 00:36:39.041 }, 00:36:39.041 { 00:36:39.041 "name": null, 00:36:39.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:39.041 "is_configured": false, 00:36:39.041 "data_offset": 2048, 00:36:39.041 "data_size": 63488 00:36:39.041 }, 00:36:39.041 { 00:36:39.041 "name": "BaseBdev3", 00:36:39.041 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:39.041 "is_configured": true, 00:36:39.041 "data_offset": 2048, 00:36:39.041 "data_size": 63488 00:36:39.041 }, 00:36:39.041 { 00:36:39.041 "name": "BaseBdev4", 00:36:39.041 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:39.041 "is_configured": true, 00:36:39.041 "data_offset": 2048, 00:36:39.041 "data_size": 63488 00:36:39.041 } 00:36:39.041 ] 00:36:39.041 }' 00:36:39.042 12:39:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:39.042 12:39:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:39.606 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:36:39.865 [2024-06-07 12:39:03.350254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:36:39.865 [2024-06-07 12:39:03.350321] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:36:39.865 [2024-06-07 12:39:03.350440] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:39.865 [2024-06-07 12:39:03.350539] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:39.865 [2024-06-07 12:39:03.350552] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:36:39.865 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:39.865 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:36:40.123 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:36:40.381 /dev/nbd0 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:40.381 1+0 records in 00:36:40.381 1+0 records out 00:36:40.381 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000184406 s, 22.2 MB/s 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:36:40.381 12:39:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:36:40.649 /dev/nbd1 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:40.649 1+0 records in 00:36:40.649 1+0 records out 00:36:40.649 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293302 s, 14.0 MB/s 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 
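Two details in the nbd sequence around this point are easy to miss. First, cmp -i 1048576 /dev/nbd0 /dev/nbd1 skips the first 1 MiB of both devices before comparing, which matches the data_offset of 2048 blocks x 512 bytes reported for every base bdev, so only the data region behind the superblock is checked for equality. Second, the waitfornbd/waitfornbd_exit helpers poll /proc/partitions for up to 20 iterations; a condensed sketch of the exit variant, reconstructed from the xtrace (the sleep between probes is an assumption, the trace only records the grep and the break):

  waitfornbd_exit() {                      # block until the nbd device disappears
      local nbd_name=$1
      for ((i = 1; i <= 20; i++)); do
          if grep -q -w "$nbd_name" /proc/partitions; then
              sleep 0.1                    # assumed back-off between probes
          else
              break                        # device gone from /proc/partitions
          fi
      done
      return 0
  }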
00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:40.649 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:40.907 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:36:41.472 12:39:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:36:41.729 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:36:41.987 [2024-06-07 12:39:05.392792] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:36:41.987 [2024-06-07 12:39:05.393429] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:41.987 [2024-06-07 12:39:05.393553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a580 00:36:41.987 [2024-06-07 12:39:05.393630] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:41.987 [2024-06-07 12:39:05.396074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:41.987 [2024-06-07 12:39:05.396221] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:36:41.987 [2024-06-07 12:39:05.396383] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:36:41.987 [2024-06-07 12:39:05.396478] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:41.987 [2024-06-07 12:39:05.396619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:36:41.987 [2024-06-07 12:39:05.396705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:36:41.987 spare 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:41.987 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:41.987 [2024-06-07 12:39:05.496820] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600000ab80 00:36:41.987 [2024-06-07 12:39:05.496880] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:36:41.987 [2024-06-07 12:39:05.497047] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000caefe0 00:36:41.987 [2024-06-07 12:39:05.497463] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600000ab80 00:36:41.987 [2024-06-07 12:39:05.497484] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600000ab80 00:36:41.987 [2024-06-07 12:39:05.497600] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:42.245 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:42.245 "name": "raid_bdev1", 00:36:42.245 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:42.245 "strip_size_kb": 0, 00:36:42.245 "state": "online", 00:36:42.245 "raid_level": "raid1", 00:36:42.245 "superblock": true, 00:36:42.245 "num_base_bdevs": 4, 00:36:42.245 "num_base_bdevs_discovered": 3, 00:36:42.245 "num_base_bdevs_operational": 3, 00:36:42.245 "base_bdevs_list": [ 00:36:42.245 { 00:36:42.245 "name": "spare", 00:36:42.245 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:42.245 "is_configured": true, 00:36:42.245 "data_offset": 2048, 00:36:42.245 "data_size": 63488 00:36:42.245 }, 00:36:42.245 { 00:36:42.245 "name": null, 00:36:42.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:42.245 "is_configured": false, 00:36:42.245 "data_offset": 2048, 00:36:42.245 "data_size": 63488 00:36:42.245 }, 00:36:42.245 { 00:36:42.245 "name": "BaseBdev3", 00:36:42.245 "uuid": 
"218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:42.245 "is_configured": true, 00:36:42.245 "data_offset": 2048, 00:36:42.245 "data_size": 63488 00:36:42.245 }, 00:36:42.245 { 00:36:42.245 "name": "BaseBdev4", 00:36:42.245 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:42.245 "is_configured": true, 00:36:42.245 "data_offset": 2048, 00:36:42.245 "data_size": 63488 00:36:42.245 } 00:36:42.245 ] 00:36:42.245 }' 00:36:42.245 12:39:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:42.245 12:39:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:42.810 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:42.810 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:42.810 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:42.810 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:42.810 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:42.810 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:42.810 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:43.068 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:43.068 "name": "raid_bdev1", 00:36:43.068 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:43.068 "strip_size_kb": 0, 00:36:43.068 "state": "online", 00:36:43.068 "raid_level": "raid1", 00:36:43.068 "superblock": true, 00:36:43.068 "num_base_bdevs": 4, 00:36:43.068 "num_base_bdevs_discovered": 3, 00:36:43.068 "num_base_bdevs_operational": 3, 00:36:43.068 "base_bdevs_list": [ 00:36:43.068 { 00:36:43.068 "name": "spare", 00:36:43.068 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:43.068 "is_configured": true, 00:36:43.068 "data_offset": 2048, 00:36:43.068 "data_size": 63488 00:36:43.068 }, 00:36:43.068 { 00:36:43.068 "name": null, 00:36:43.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:43.068 "is_configured": false, 00:36:43.068 "data_offset": 2048, 00:36:43.068 "data_size": 63488 00:36:43.068 }, 00:36:43.068 { 00:36:43.068 "name": "BaseBdev3", 00:36:43.068 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:43.068 "is_configured": true, 00:36:43.068 "data_offset": 2048, 00:36:43.068 "data_size": 63488 00:36:43.068 }, 00:36:43.068 { 00:36:43.068 "name": "BaseBdev4", 00:36:43.068 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:43.068 "is_configured": true, 00:36:43.068 "data_offset": 2048, 00:36:43.068 "data_size": 63488 00:36:43.068 } 00:36:43.068 ] 00:36:43.068 }' 00:36:43.068 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:43.068 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:43.068 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:43.325 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:43.325 12:39:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:43.325 12:39:06 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:36:43.582 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:36:43.582 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:36:43.851 [2024-06-07 12:39:07.341189] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:43.851 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:43.852 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:43.852 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:44.109 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:44.109 "name": "raid_bdev1", 00:36:44.109 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:44.109 "strip_size_kb": 0, 00:36:44.109 "state": "online", 00:36:44.109 "raid_level": "raid1", 00:36:44.109 "superblock": true, 00:36:44.109 "num_base_bdevs": 4, 00:36:44.109 "num_base_bdevs_discovered": 2, 00:36:44.109 "num_base_bdevs_operational": 2, 00:36:44.109 "base_bdevs_list": [ 00:36:44.109 { 00:36:44.109 "name": null, 00:36:44.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:44.109 "is_configured": false, 00:36:44.109 "data_offset": 2048, 00:36:44.109 "data_size": 63488 00:36:44.109 }, 00:36:44.109 { 00:36:44.109 "name": null, 00:36:44.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:44.109 "is_configured": false, 00:36:44.109 "data_offset": 2048, 00:36:44.109 "data_size": 63488 00:36:44.109 }, 00:36:44.109 { 00:36:44.109 "name": "BaseBdev3", 00:36:44.109 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:44.109 "is_configured": true, 00:36:44.109 "data_offset": 2048, 00:36:44.109 "data_size": 63488 00:36:44.109 }, 00:36:44.109 { 00:36:44.109 "name": "BaseBdev4", 00:36:44.109 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:44.109 "is_configured": true, 00:36:44.109 "data_offset": 2048, 00:36:44.109 "data_size": 63488 00:36:44.109 } 00:36:44.109 ] 00:36:44.109 }' 00:36:44.109 12:39:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:44.109 12:39:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:44.674 12:39:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:36:44.931 [2024-06-07 12:39:08.389312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:44.931 [2024-06-07 12:39:08.389504] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:36:44.931 [2024-06-07 12:39:08.389517] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:36:44.931 [2024-06-07 12:39:08.390029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:44.932 [2024-06-07 12:39:08.396030] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000caf180 00:36:44.932 [2024-06-07 12:39:08.398136] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:36:44.932 12:39:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:36:45.864 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:45.864 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:45.865 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:45.865 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:45.865 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:45.865 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:45.865 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:46.121 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:46.121 "name": "raid_bdev1", 00:36:46.121 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:46.121 "strip_size_kb": 0, 00:36:46.121 "state": "online", 00:36:46.121 "raid_level": "raid1", 00:36:46.121 "superblock": true, 00:36:46.121 "num_base_bdevs": 4, 00:36:46.121 "num_base_bdevs_discovered": 3, 00:36:46.121 "num_base_bdevs_operational": 3, 00:36:46.121 "process": { 00:36:46.121 "type": "rebuild", 00:36:46.121 "target": "spare", 00:36:46.121 "progress": { 00:36:46.121 "blocks": 24576, 00:36:46.121 "percent": 38 00:36:46.121 } 00:36:46.121 }, 00:36:46.121 "base_bdevs_list": [ 00:36:46.121 { 00:36:46.121 "name": "spare", 00:36:46.121 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:46.121 "is_configured": true, 00:36:46.121 "data_offset": 2048, 00:36:46.121 "data_size": 63488 00:36:46.121 }, 00:36:46.121 { 00:36:46.121 "name": null, 00:36:46.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:46.121 "is_configured": false, 00:36:46.121 "data_offset": 2048, 00:36:46.121 "data_size": 63488 00:36:46.121 }, 00:36:46.121 { 00:36:46.121 "name": "BaseBdev3", 00:36:46.121 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:46.121 "is_configured": true, 00:36:46.121 "data_offset": 2048, 00:36:46.121 "data_size": 63488 00:36:46.121 }, 00:36:46.121 { 00:36:46.121 "name": "BaseBdev4", 00:36:46.122 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:46.122 "is_configured": true, 00:36:46.122 "data_offset": 2048, 00:36:46.122 "data_size": 63488 00:36:46.122 } 
00:36:46.122 ] 00:36:46.122 }' 00:36:46.122 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:46.378 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:46.378 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:46.378 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:46.378 12:39:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:36:46.378 [2024-06-07 12:39:10.020476] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:46.636 [2024-06-07 12:39:10.110259] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:36:46.636 [2024-06-07 12:39:10.110847] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:46.636 [2024-06-07 12:39:10.110883] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:46.636 [2024-06-07 12:39:10.110895] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:46.636 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:46.894 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:46.894 "name": "raid_bdev1", 00:36:46.894 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:46.894 "strip_size_kb": 0, 00:36:46.894 "state": "online", 00:36:46.894 "raid_level": "raid1", 00:36:46.894 "superblock": true, 00:36:46.894 "num_base_bdevs": 4, 00:36:46.894 "num_base_bdevs_discovered": 2, 00:36:46.894 "num_base_bdevs_operational": 2, 00:36:46.894 "base_bdevs_list": [ 00:36:46.894 { 00:36:46.894 "name": null, 00:36:46.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:46.894 "is_configured": false, 00:36:46.894 "data_offset": 2048, 00:36:46.894 "data_size": 63488 00:36:46.894 }, 00:36:46.894 { 00:36:46.894 "name": null, 00:36:46.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:46.894 "is_configured": 
false, 00:36:46.894 "data_offset": 2048, 00:36:46.894 "data_size": 63488 00:36:46.894 }, 00:36:46.894 { 00:36:46.894 "name": "BaseBdev3", 00:36:46.894 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:46.894 "is_configured": true, 00:36:46.894 "data_offset": 2048, 00:36:46.894 "data_size": 63488 00:36:46.894 }, 00:36:46.894 { 00:36:46.894 "name": "BaseBdev4", 00:36:46.894 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:46.894 "is_configured": true, 00:36:46.894 "data_offset": 2048, 00:36:46.894 "data_size": 63488 00:36:46.894 } 00:36:46.894 ] 00:36:46.894 }' 00:36:46.894 12:39:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:46.894 12:39:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:47.499 12:39:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:36:47.757 [2024-06-07 12:39:11.282561] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:36:47.757 [2024-06-07 12:39:11.283108] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:47.757 [2024-06-07 12:39:11.283221] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000b180 00:36:47.757 [2024-06-07 12:39:11.283301] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:47.757 [2024-06-07 12:39:11.283753] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:47.757 [2024-06-07 12:39:11.283855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:36:47.757 [2024-06-07 12:39:11.284037] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:36:47.757 [2024-06-07 12:39:11.284063] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:36:47.757 [2024-06-07 12:39:11.284073] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
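The seq_number lines just above are the heart of the sb variant of this test: the superblock found on the re-created spare carries sequence number 5 while the live raid bdev is at 6, so examine recognizes spare as a stale former member and re-adds it for rebuild instead of rejecting it. The RPC steps the test drives to provoke this, condensed from the trace (socket path and bdev names as logged; this is a sketch of the flow, not the full helper):

  RPC='/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
  $RPC bdev_passthru_delete spare                    # tear down the passthru over spare_delay
  $RPC bdev_passthru_create -b spare_delay -p spare  # recreate it; examine finds the stale superblock
  # -> "Re-adding bdev spare to raid bdev raid_bdev1." and a rebuild starts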
00:36:47.757 [2024-06-07 12:39:11.284172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:36:47.757 [2024-06-07 12:39:11.290246] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000caf4c0 00:36:47.757 spare 00:36:47.757 [2024-06-07 12:39:11.292184] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:36:47.757 12:39:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:36:48.692 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:36:48.692 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:48.692 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:36:48.692 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:36:48.692 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:48.692 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:48.692 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:49.260 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:49.260 "name": "raid_bdev1", 00:36:49.260 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:49.260 "strip_size_kb": 0, 00:36:49.260 "state": "online", 00:36:49.260 "raid_level": "raid1", 00:36:49.260 "superblock": true, 00:36:49.260 "num_base_bdevs": 4, 00:36:49.260 "num_base_bdevs_discovered": 3, 00:36:49.260 "num_base_bdevs_operational": 3, 00:36:49.260 "process": { 00:36:49.260 "type": "rebuild", 00:36:49.260 "target": "spare", 00:36:49.260 "progress": { 00:36:49.260 "blocks": 24576, 00:36:49.260 "percent": 38 00:36:49.260 } 00:36:49.260 }, 00:36:49.260 "base_bdevs_list": [ 00:36:49.260 { 00:36:49.260 "name": "spare", 00:36:49.260 "uuid": "5c2dab06-ed33-549e-b499-93d0914691ab", 00:36:49.260 "is_configured": true, 00:36:49.260 "data_offset": 2048, 00:36:49.260 "data_size": 63488 00:36:49.260 }, 00:36:49.260 { 00:36:49.260 "name": null, 00:36:49.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:49.260 "is_configured": false, 00:36:49.260 "data_offset": 2048, 00:36:49.260 "data_size": 63488 00:36:49.260 }, 00:36:49.260 { 00:36:49.260 "name": "BaseBdev3", 00:36:49.260 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:49.260 "is_configured": true, 00:36:49.260 "data_offset": 2048, 00:36:49.260 "data_size": 63488 00:36:49.260 }, 00:36:49.260 { 00:36:49.260 "name": "BaseBdev4", 00:36:49.260 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:49.260 "is_configured": true, 00:36:49.260 "data_offset": 2048, 00:36:49.260 "data_size": 63488 00:36:49.260 } 00:36:49.260 ] 00:36:49.260 }' 00:36:49.260 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:49.260 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:36:49.260 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:49.260 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:36:49.260 12:39:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:36:49.518 [2024-06-07 12:39:12.954550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:49.519 [2024-06-07 12:39:13.004391] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:36:49.519 [2024-06-07 12:39:13.004975] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:36:49.519 [2024-06-07 12:39:13.005007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:36:49.519 [2024-06-07 12:39:13.005018] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:49.519 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:49.777 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:49.777 "name": "raid_bdev1", 00:36:49.777 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:49.777 "strip_size_kb": 0, 00:36:49.777 "state": "online", 00:36:49.777 "raid_level": "raid1", 00:36:49.777 "superblock": true, 00:36:49.777 "num_base_bdevs": 4, 00:36:49.777 "num_base_bdevs_discovered": 2, 00:36:49.777 "num_base_bdevs_operational": 2, 00:36:49.777 "base_bdevs_list": [ 00:36:49.777 { 00:36:49.777 "name": null, 00:36:49.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:49.777 "is_configured": false, 00:36:49.777 "data_offset": 2048, 00:36:49.777 "data_size": 63488 00:36:49.777 }, 00:36:49.777 { 00:36:49.777 "name": null, 00:36:49.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:49.777 "is_configured": false, 00:36:49.777 "data_offset": 2048, 00:36:49.777 "data_size": 63488 00:36:49.777 }, 00:36:49.777 { 00:36:49.777 "name": "BaseBdev3", 00:36:49.777 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:49.777 "is_configured": true, 00:36:49.777 "data_offset": 2048, 00:36:49.777 "data_size": 63488 00:36:49.777 }, 00:36:49.777 { 00:36:49.777 "name": "BaseBdev4", 00:36:49.777 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:49.777 "is_configured": true, 00:36:49.777 "data_offset": 2048, 00:36:49.777 "data_size": 63488 00:36:49.777 } 00:36:49.777 ] 00:36:49.777 }' 
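verify_raid_bdev_state consumes the dump above with jq. Below is a sketch of the assertions implied by the locals visible in the trace (expected_state=online, raid_level=raid1, strip_size=0, num_base_bdevs_operational=2); the real helper in bdev_raid.sh may structure its checks differently:

RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
tmp=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r '.state'      <<<"$tmp") == online ]]
[[ $(jq -r '.raid_level' <<<"$tmp") == raid1 ]]
[[ $(jq -r '.num_base_bdevs_discovered' <<<"$tmp") -eq 2 ]]
# The two removed slots stay in base_bdevs_list as null entries with all-zero
# uuids, so num_base_bdevs stays 4 while only 2 are discovered and operational.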
00:36:49.777 12:39:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:49.777 12:39:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:50.719 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:50.719 "name": "raid_bdev1", 00:36:50.719 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:50.719 "strip_size_kb": 0, 00:36:50.719 "state": "online", 00:36:50.719 "raid_level": "raid1", 00:36:50.719 "superblock": true, 00:36:50.719 "num_base_bdevs": 4, 00:36:50.719 "num_base_bdevs_discovered": 2, 00:36:50.719 "num_base_bdevs_operational": 2, 00:36:50.719 "base_bdevs_list": [ 00:36:50.719 { 00:36:50.719 "name": null, 00:36:50.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:50.719 "is_configured": false, 00:36:50.719 "data_offset": 2048, 00:36:50.719 "data_size": 63488 00:36:50.719 }, 00:36:50.719 { 00:36:50.719 "name": null, 00:36:50.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:50.719 "is_configured": false, 00:36:50.719 "data_offset": 2048, 00:36:50.719 "data_size": 63488 00:36:50.719 }, 00:36:50.719 { 00:36:50.719 "name": "BaseBdev3", 00:36:50.719 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:50.719 "is_configured": true, 00:36:50.719 "data_offset": 2048, 00:36:50.719 "data_size": 63488 00:36:50.719 }, 00:36:50.719 { 00:36:50.719 "name": "BaseBdev4", 00:36:50.719 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:50.719 "is_configured": true, 00:36:50.719 "data_offset": 2048, 00:36:50.719 "data_size": 63488 00:36:50.719 } 00:36:50.719 ] 00:36:50.719 }' 00:36:50.720 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:50.720 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:50.720 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:50.720 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:50.720 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:36:50.977 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:36:51.235 [2024-06-07 12:39:14.764647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:36:51.235 [2024-06-07 12:39:14.765189] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
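Here the test repeats the cycle with BaseBdev1, but expects the opposite outcome: its superblock seq_number (1) is far behind the array's (6) and, after the earlier removal, the raid superblock no longer contains its uuid, so examine must not re-add it and raid_bdev1 must stay at 2 of 4 base bdevs. The trace further below also asserts that a manual re-add is rejected; a sketch of that negative check, using the RPCs and the -22 error code visible in this log:

RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_passthru_delete BaseBdev1
$RPC bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
# Examine logs "raid superblock does not contain this bdev's uuid" and skips it.
# A manual add must then fail with JSON-RPC error -22 (Invalid argument):
if $RPC bdev_raid_add_base_bdev raid_bdev1 BaseBdev1; then
    echo "unexpected success re-adding BaseBdev1" >&2
    exit 1
fi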
00:36:51.235 [2024-06-07 12:39:14.765355] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000b780 00:36:51.235 [2024-06-07 12:39:14.765433] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:51.235 [2024-06-07 12:39:14.765849] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:51.235 [2024-06-07 12:39:14.765950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:36:51.235 [2024-06-07 12:39:14.766073] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:36:51.235 [2024-06-07 12:39:14.766089] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:36:51.235 [2024-06-07 12:39:14.766099] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:36:51.235 BaseBdev1 00:36:51.235 12:39:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:52.170 12:39:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:52.738 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:52.738 "name": "raid_bdev1", 00:36:52.738 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:52.738 "strip_size_kb": 0, 00:36:52.738 "state": "online", 00:36:52.738 "raid_level": "raid1", 00:36:52.738 "superblock": true, 00:36:52.738 "num_base_bdevs": 4, 00:36:52.738 "num_base_bdevs_discovered": 2, 00:36:52.738 "num_base_bdevs_operational": 2, 00:36:52.738 "base_bdevs_list": [ 00:36:52.738 { 00:36:52.738 "name": null, 00:36:52.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:52.738 "is_configured": false, 00:36:52.738 "data_offset": 2048, 00:36:52.738 "data_size": 63488 00:36:52.738 }, 00:36:52.738 { 00:36:52.738 "name": null, 00:36:52.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:52.738 "is_configured": false, 00:36:52.738 "data_offset": 2048, 00:36:52.738 "data_size": 63488 00:36:52.738 }, 00:36:52.738 { 00:36:52.738 "name": "BaseBdev3", 00:36:52.738 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:52.738 "is_configured": 
true, 00:36:52.738 "data_offset": 2048, 00:36:52.738 "data_size": 63488 00:36:52.738 }, 00:36:52.738 { 00:36:52.738 "name": "BaseBdev4", 00:36:52.738 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:52.738 "is_configured": true, 00:36:52.738 "data_offset": 2048, 00:36:52.738 "data_size": 63488 00:36:52.738 } 00:36:52.738 ] 00:36:52.738 }' 00:36:52.738 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:52.738 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:53.305 "name": "raid_bdev1", 00:36:53.305 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:53.305 "strip_size_kb": 0, 00:36:53.305 "state": "online", 00:36:53.305 "raid_level": "raid1", 00:36:53.305 "superblock": true, 00:36:53.305 "num_base_bdevs": 4, 00:36:53.305 "num_base_bdevs_discovered": 2, 00:36:53.305 "num_base_bdevs_operational": 2, 00:36:53.305 "base_bdevs_list": [ 00:36:53.305 { 00:36:53.305 "name": null, 00:36:53.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:53.305 "is_configured": false, 00:36:53.305 "data_offset": 2048, 00:36:53.305 "data_size": 63488 00:36:53.305 }, 00:36:53.305 { 00:36:53.305 "name": null, 00:36:53.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:53.305 "is_configured": false, 00:36:53.305 "data_offset": 2048, 00:36:53.305 "data_size": 63488 00:36:53.305 }, 00:36:53.305 { 00:36:53.305 "name": "BaseBdev3", 00:36:53.305 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:53.305 "is_configured": true, 00:36:53.305 "data_offset": 2048, 00:36:53.305 "data_size": 63488 00:36:53.305 }, 00:36:53.305 { 00:36:53.305 "name": "BaseBdev4", 00:36:53.305 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:53.305 "is_configured": true, 00:36:53.305 "data_offset": 2048, 00:36:53.305 "data_size": 63488 00:36:53.305 } 00:36:53.305 ] 00:36:53.305 }' 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:53.305 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 
-- # local es=0 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:36:53.564 12:39:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:36:53.564 [2024-06-07 12:39:17.193098] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:36:53.564 [2024-06-07 12:39:17.193328] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:36:53.564 [2024-06-07 12:39:17.193343] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:36:53.564 request: 00:36:53.564 { 00:36:53.564 "raid_bdev": "raid_bdev1", 00:36:53.564 "base_bdev": "BaseBdev1", 00:36:53.564 "method": "bdev_raid_add_base_bdev", 00:36:53.564 "req_id": 1 00:36:53.564 } 00:36:53.564 Got JSON-RPC error response 00:36:53.564 response: 00:36:53.564 { 00:36:53.564 "code": -22, 00:36:53.564 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:36:53.564 } 00:36:53.822 12:39:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:36:53.822 12:39:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:36:53.822 12:39:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:36:53.822 12:39:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:36:53.822 12:39:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:54.776 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:55.034 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:36:55.034 "name": "raid_bdev1", 00:36:55.034 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:55.034 "strip_size_kb": 0, 00:36:55.034 "state": "online", 00:36:55.034 "raid_level": "raid1", 00:36:55.034 "superblock": true, 00:36:55.034 "num_base_bdevs": 4, 00:36:55.034 "num_base_bdevs_discovered": 2, 00:36:55.034 "num_base_bdevs_operational": 2, 00:36:55.034 "base_bdevs_list": [ 00:36:55.034 { 00:36:55.034 "name": null, 00:36:55.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:55.034 "is_configured": false, 00:36:55.034 "data_offset": 2048, 00:36:55.034 "data_size": 63488 00:36:55.034 }, 00:36:55.034 { 00:36:55.034 "name": null, 00:36:55.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:55.034 "is_configured": false, 00:36:55.034 "data_offset": 2048, 00:36:55.034 "data_size": 63488 00:36:55.034 }, 00:36:55.034 { 00:36:55.034 "name": "BaseBdev3", 00:36:55.034 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:55.034 "is_configured": true, 00:36:55.034 "data_offset": 2048, 00:36:55.034 "data_size": 63488 00:36:55.034 }, 00:36:55.034 { 00:36:55.034 "name": "BaseBdev4", 00:36:55.034 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:55.034 "is_configured": true, 00:36:55.034 "data_offset": 2048, 00:36:55.034 "data_size": 63488 00:36:55.034 } 00:36:55.034 ] 00:36:55.034 }' 00:36:55.034 12:39:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:36:55.034 12:39:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:55.600 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:36:55.600 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:36:55.600 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:36:55.600 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:36:55.600 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:36:55.600 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:36:55.600 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:36:55.857 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:36:55.857 "name": "raid_bdev1", 00:36:55.857 "uuid": "7597b2dc-07da-412d-a2ca-b4b7ffed4cba", 00:36:55.857 "strip_size_kb": 0, 00:36:55.857 "state": "online", 00:36:55.857 "raid_level": "raid1", 00:36:55.857 "superblock": 
true, 00:36:55.857 "num_base_bdevs": 4, 00:36:55.857 "num_base_bdevs_discovered": 2, 00:36:55.857 "num_base_bdevs_operational": 2, 00:36:55.857 "base_bdevs_list": [ 00:36:55.857 { 00:36:55.857 "name": null, 00:36:55.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:55.857 "is_configured": false, 00:36:55.857 "data_offset": 2048, 00:36:55.857 "data_size": 63488 00:36:55.857 }, 00:36:55.857 { 00:36:55.857 "name": null, 00:36:55.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:36:55.857 "is_configured": false, 00:36:55.857 "data_offset": 2048, 00:36:55.857 "data_size": 63488 00:36:55.857 }, 00:36:55.857 { 00:36:55.857 "name": "BaseBdev3", 00:36:55.857 "uuid": "218b2356-ddcf-5de1-8fbe-4a121b410c5e", 00:36:55.857 "is_configured": true, 00:36:55.857 "data_offset": 2048, 00:36:55.857 "data_size": 63488 00:36:55.857 }, 00:36:55.857 { 00:36:55.857 "name": "BaseBdev4", 00:36:55.857 "uuid": "a3419da3-2f80-5937-8a94-9e652b6a5293", 00:36:55.857 "is_configured": true, 00:36:55.857 "data_offset": 2048, 00:36:55.857 "data_size": 63488 00:36:55.857 } 00:36:55.857 ] 00:36:55.857 }' 00:36:55.857 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:36:55.857 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:36:55.857 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:36:55.857 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:36:55.857 12:39:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 224527 00:36:55.857 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 224527 ']' 00:36:55.858 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 224527 00:36:55.858 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:36:56.115 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:36:56.115 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 224527 00:36:56.115 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:36:56.115 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:36:56.115 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 224527' 00:36:56.115 killing process with pid 224527 00:36:56.115 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 224527 00:36:56.115 Received shutdown signal, test time was about 60.000000 seconds 00:36:56.115 00:36:56.115 Latency(us) 00:36:56.115 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:56.115 =================================================================================================================== 00:36:56.115 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:36:56.115 [2024-06-07 12:39:19.531399] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:36:56.115 [2024-06-07 12:39:19.531555] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:36:56.115 [2024-06-07 12:39:19.531626] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:36:56.115 [2024-06-07 12:39:19.531639] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600000ab80 name raid_bdev1, state offline 00:36:56.115 12:39:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 224527 00:36:56.115 [2024-06-07 12:39:19.634471] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:36:56.681 00:36:56.681 real 0m36.802s 00:36:56.681 user 0m55.368s 00:36:56.681 sys 0m6.829s 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:36:56.681 ************************************ 00:36:56.681 END TEST raid_rebuild_test_sb 00:36:56.681 ************************************ 00:36:56.681 12:39:20 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:36:56.681 12:39:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:36:56.681 12:39:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:36:56.681 12:39:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:36:56.681 ************************************ 00:36:56.681 START TEST raid_rebuild_test_io 00:36:56.681 ************************************ 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false true true 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev3 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev4 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 
'BaseBdev3' 'BaseBdev4') 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=225446 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:36:56.681 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 225446 /var/tmp/spdk-raid.sock 00:36:56.682 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 225446 ']' 00:36:56.682 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:36:56.682 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:36:56.682 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:36:56.682 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:36:56.682 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:36:56.682 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:36:56.682 [2024-06-07 12:39:20.145978] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:36:56.682 [2024-06-07 12:39:20.147010] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid225446 ] 00:36:56.682 I/O size of 3145728 is greater than zero copy threshold (65536). 00:36:56.682 Zero copy mechanism will not be used. 
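The bdevperf command line a few records up is what drives the background I/O for this test. Re-wrapped for readability, with a best-effort gloss of the flags (the command itself is verbatim from the trace; the glosses are the standard bdevperf meanings and are not stated in this log):

# -r  RPC socket, shared with the rpc.py calls in this test
# -T  run the workload only against raid_bdev1
# -t  60-second run ("Running I/O for 60 seconds" later in the log)
# -w/-M  random I/O at a 50/50 read/write mix
# -o/-q  3 MiB I/Os at queue depth 2 (hence the zero-copy notice above)
# -z  start idle and wait for the perform_tests RPC before issuing I/O
# -L  enable the bdev_raid debug log component seen throughout this trace
/home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
    -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
    -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid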
00:36:56.682 [2024-06-07 12:39:20.290419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:56.940 [2024-06-07 12:39:20.384723] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:36:56.940 [2024-06-07 12:39:20.467637] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:36:56.940 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:36:56.940 12:39:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:36:56.940 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:56.940 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:36:57.505 BaseBdev1_malloc 00:36:57.505 12:39:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:36:57.505 [2024-06-07 12:39:21.126417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:36:57.505 [2024-06-07 12:39:21.127001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:57.505 [2024-06-07 12:39:21.127279] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:36:57.505 [2024-06-07 12:39:21.127574] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:57.505 [2024-06-07 12:39:21.130245] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:57.505 [2024-06-07 12:39:21.130481] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:36:57.505 BaseBdev1 00:36:57.764 12:39:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:57.764 12:39:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:36:58.020 BaseBdev2_malloc 00:36:58.020 12:39:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:36:58.020 [2024-06-07 12:39:21.663431] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:36:58.020 [2024-06-07 12:39:21.664184] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:58.020 [2024-06-07 12:39:21.664487] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:36:58.020 [2024-06-07 12:39:21.664712] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:58.277 [2024-06-07 12:39:21.667332] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:58.277 [2024-06-07 12:39:21.667627] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:36:58.277 BaseBdev2 00:36:58.277 12:39:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:58.277 12:39:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:36:58.534 BaseBdev3_malloc 00:36:58.534 12:39:21 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:36:58.534 [2024-06-07 12:39:22.176704] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:36:58.534 [2024-06-07 12:39:22.177242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:58.534 [2024-06-07 12:39:22.177473] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:36:58.534 [2024-06-07 12:39:22.177697] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:58.534 [2024-06-07 12:39:22.180118] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:58.534 [2024-06-07 12:39:22.180382] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:36:58.793 BaseBdev3 00:36:58.793 12:39:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:36:58.793 12:39:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:36:58.793 BaseBdev4_malloc 00:36:59.049 12:39:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:36:59.049 [2024-06-07 12:39:22.653335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:36:59.049 [2024-06-07 12:39:22.653866] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:59.049 [2024-06-07 12:39:22.654086] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007e80 00:36:59.049 [2024-06-07 12:39:22.654370] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:59.049 [2024-06-07 12:39:22.656813] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:59.049 [2024-06-07 12:39:22.657042] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:36:59.049 BaseBdev4 00:36:59.049 12:39:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:36:59.306 spare_malloc 00:36:59.564 12:39:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:36:59.821 spare_delay 00:36:59.821 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:00.079 [2024-06-07 12:39:23.513349] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:00.079 [2024-06-07 12:39:23.514178] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:00.079 [2024-06-07 12:39:23.514448] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:37:00.079 [2024-06-07 12:39:23.514699] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:00.079 [2024-06-07 12:39:23.517291] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:00.079 [2024-06-07 
12:39:23.517555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:00.079 spare 00:37:00.079 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:37:00.339 [2024-06-07 12:39:23.746088] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:00.339 [2024-06-07 12:39:23.748483] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:37:00.339 [2024-06-07 12:39:23.748707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:37:00.339 [2024-06-07 12:39:23.748772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:37:00.339 [2024-06-07 12:39:23.748918] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:37:00.339 [2024-06-07 12:39:23.748959] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:37:00.339 [2024-06-07 12:39:23.749156] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:37:00.339 [2024-06-07 12:39:23.749499] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:37:00.339 [2024-06-07 12:39:23.749608] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:37:00.339 [2024-06-07 12:39:23.749971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:00.339 12:39:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:00.598 12:39:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:00.598 "name": "raid_bdev1", 00:37:00.598 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:00.598 "strip_size_kb": 0, 00:37:00.598 "state": "online", 00:37:00.598 "raid_level": "raid1", 00:37:00.598 "superblock": false, 00:37:00.598 "num_base_bdevs": 4, 00:37:00.598 "num_base_bdevs_discovered": 4, 00:37:00.598 "num_base_bdevs_operational": 4, 00:37:00.598 "base_bdevs_list": [ 00:37:00.598 { 
00:37:00.598 "name": "BaseBdev1", 00:37:00.598 "uuid": "57bca9c5-2e44-5a0f-928e-052fe866cfd7", 00:37:00.598 "is_configured": true, 00:37:00.598 "data_offset": 0, 00:37:00.598 "data_size": 65536 00:37:00.598 }, 00:37:00.598 { 00:37:00.598 "name": "BaseBdev2", 00:37:00.598 "uuid": "f034e221-63cd-5efe-abeb-ef79c7ecb205", 00:37:00.598 "is_configured": true, 00:37:00.598 "data_offset": 0, 00:37:00.598 "data_size": 65536 00:37:00.598 }, 00:37:00.598 { 00:37:00.598 "name": "BaseBdev3", 00:37:00.598 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:00.598 "is_configured": true, 00:37:00.598 "data_offset": 0, 00:37:00.598 "data_size": 65536 00:37:00.598 }, 00:37:00.598 { 00:37:00.598 "name": "BaseBdev4", 00:37:00.598 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:00.598 "is_configured": true, 00:37:00.598 "data_offset": 0, 00:37:00.598 "data_size": 65536 00:37:00.598 } 00:37:00.598 ] 00:37:00.598 }' 00:37:00.598 12:39:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:00.598 12:39:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:01.164 12:39:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:37:01.164 12:39:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:37:01.422 [2024-06-07 12:39:24.986493] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:37:01.422 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:37:01.422 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:01.422 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:37:01.681 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:37:01.681 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:37:01.681 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:37:01.681 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:37:01.940 [2024-06-07 12:39:25.381982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002a10 00:37:01.940 I/O size of 3145728 is greater than zero copy threshold (65536). 00:37:01.940 Zero copy mechanism will not be used. 00:37:01.940 Running I/O for 60 seconds... 
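With the workload armed, the test pulls BaseBdev1 out from under the I/O and checks that the array degrades without going offline. A sketch of the sequence, using the commands visible above and the expected end state taken from the JSON dump that follows:

RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_remove_base_bdev BaseBdev1     # remove a base bdev under load
/home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py \
    -s /var/tmp/spdk-raid.sock perform_tests  # kick off the 60 s randrw run
# raid_bdev1 must stay online with 3 of 4 base bdevs discovered/operational:
$RPC bdev_raid_get_bdevs all | \
    jq '.[] | select(.name == "raid_bdev1")
        | {state, num_base_bdevs_discovered, num_base_bdevs_operational}'
# expected: "online", 3, 3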
00:37:01.940 [2024-06-07 12:39:25.546482] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:37:01.940 [2024-06-07 12:39:25.550766] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d000002a10 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:02.198 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:02.456 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:02.456 "name": "raid_bdev1", 00:37:02.456 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:02.456 "strip_size_kb": 0, 00:37:02.456 "state": "online", 00:37:02.456 "raid_level": "raid1", 00:37:02.456 "superblock": false, 00:37:02.456 "num_base_bdevs": 4, 00:37:02.456 "num_base_bdevs_discovered": 3, 00:37:02.456 "num_base_bdevs_operational": 3, 00:37:02.456 "base_bdevs_list": [ 00:37:02.456 { 00:37:02.456 "name": null, 00:37:02.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:02.456 "is_configured": false, 00:37:02.456 "data_offset": 0, 00:37:02.456 "data_size": 65536 00:37:02.456 }, 00:37:02.456 { 00:37:02.456 "name": "BaseBdev2", 00:37:02.456 "uuid": "f034e221-63cd-5efe-abeb-ef79c7ecb205", 00:37:02.456 "is_configured": true, 00:37:02.456 "data_offset": 0, 00:37:02.456 "data_size": 65536 00:37:02.456 }, 00:37:02.456 { 00:37:02.456 "name": "BaseBdev3", 00:37:02.456 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:02.456 "is_configured": true, 00:37:02.456 "data_offset": 0, 00:37:02.456 "data_size": 65536 00:37:02.456 }, 00:37:02.456 { 00:37:02.456 "name": "BaseBdev4", 00:37:02.456 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:02.456 "is_configured": true, 00:37:02.456 "data_offset": 0, 00:37:02.456 "data_size": 65536 00:37:02.456 } 00:37:02.456 ] 00:37:02.456 }' 00:37:02.456 12:39:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:02.456 12:39:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:03.023 12:39:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:03.281 [2024-06-07 12:39:26.890132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev spare is claimed 00:37:03.539 [2024-06-07 12:39:26.937110] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002ae0 00:37:03.539 [2024-06-07 12:39:26.939936] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:03.539 12:39:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:37:03.539 [2024-06-07 12:39:27.061579] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:03.540 [2024-06-07 12:39:27.062514] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:03.540 [2024-06-07 12:39:27.172744] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:03.540 [2024-06-07 12:39:27.173397] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:03.798 [2024-06-07 12:39:27.424263] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:04.056 [2024-06-07 12:39:27.641387] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:04.057 [2024-06-07 12:39:27.642102] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:04.314 12:39:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:04.314 12:39:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:04.314 12:39:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:04.314 12:39:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:04.314 12:39:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:04.573 12:39:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:04.573 12:39:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:04.573 [2024-06-07 12:39:28.013817] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:04.573 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:04.573 "name": "raid_bdev1", 00:37:04.573 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:04.573 "strip_size_kb": 0, 00:37:04.573 "state": "online", 00:37:04.573 "raid_level": "raid1", 00:37:04.573 "superblock": false, 00:37:04.573 "num_base_bdevs": 4, 00:37:04.573 "num_base_bdevs_discovered": 4, 00:37:04.573 "num_base_bdevs_operational": 4, 00:37:04.573 "process": { 00:37:04.573 "type": "rebuild", 00:37:04.573 "target": "spare", 00:37:04.573 "progress": { 00:37:04.573 "blocks": 18432, 00:37:04.573 "percent": 28 00:37:04.573 } 00:37:04.573 }, 00:37:04.573 "base_bdevs_list": [ 00:37:04.573 { 00:37:04.573 "name": "spare", 00:37:04.573 "uuid": "a24acdb5-b5b5-5471-a74f-1b45b4af8cea", 00:37:04.573 "is_configured": true, 00:37:04.573 "data_offset": 0, 00:37:04.573 "data_size": 65536 00:37:04.573 }, 00:37:04.573 { 00:37:04.573 "name": "BaseBdev2", 00:37:04.573 "uuid": 
"f034e221-63cd-5efe-abeb-ef79c7ecb205", 00:37:04.573 "is_configured": true, 00:37:04.573 "data_offset": 0, 00:37:04.573 "data_size": 65536 00:37:04.573 }, 00:37:04.573 { 00:37:04.573 "name": "BaseBdev3", 00:37:04.573 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:04.573 "is_configured": true, 00:37:04.573 "data_offset": 0, 00:37:04.573 "data_size": 65536 00:37:04.573 }, 00:37:04.573 { 00:37:04.573 "name": "BaseBdev4", 00:37:04.573 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:04.573 "is_configured": true, 00:37:04.573 "data_offset": 0, 00:37:04.573 "data_size": 65536 00:37:04.573 } 00:37:04.573 ] 00:37:04.573 }' 00:37:04.573 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:04.831 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:04.831 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:04.831 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:04.831 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:37:04.831 [2024-06-07 12:39:28.370411] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:37:05.089 [2024-06-07 12:39:28.484615] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:05.089 [2024-06-07 12:39:28.504306] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:05.089 [2024-06-07 12:39:28.521949] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:05.089 [2024-06-07 12:39:28.522313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:05.089 [2024-06-07 12:39:28.522368] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:05.089 [2024-06-07 12:39:28.538179] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d000002a10 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:05.089 12:39:28 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:05.347 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:05.347 "name": "raid_bdev1", 00:37:05.347 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:05.347 "strip_size_kb": 0, 00:37:05.347 "state": "online", 00:37:05.347 "raid_level": "raid1", 00:37:05.347 "superblock": false, 00:37:05.347 "num_base_bdevs": 4, 00:37:05.347 "num_base_bdevs_discovered": 3, 00:37:05.347 "num_base_bdevs_operational": 3, 00:37:05.347 "base_bdevs_list": [ 00:37:05.347 { 00:37:05.347 "name": null, 00:37:05.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:05.347 "is_configured": false, 00:37:05.347 "data_offset": 0, 00:37:05.347 "data_size": 65536 00:37:05.347 }, 00:37:05.347 { 00:37:05.347 "name": "BaseBdev2", 00:37:05.347 "uuid": "f034e221-63cd-5efe-abeb-ef79c7ecb205", 00:37:05.347 "is_configured": true, 00:37:05.347 "data_offset": 0, 00:37:05.347 "data_size": 65536 00:37:05.347 }, 00:37:05.347 { 00:37:05.347 "name": "BaseBdev3", 00:37:05.347 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:05.347 "is_configured": true, 00:37:05.347 "data_offset": 0, 00:37:05.347 "data_size": 65536 00:37:05.347 }, 00:37:05.347 { 00:37:05.347 "name": "BaseBdev4", 00:37:05.347 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:05.347 "is_configured": true, 00:37:05.347 "data_offset": 0, 00:37:05.347 "data_size": 65536 00:37:05.347 } 00:37:05.347 ] 00:37:05.347 }' 00:37:05.347 12:39:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:05.347 12:39:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:05.913 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:05.913 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:05.913 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:05.913 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:05.913 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:05.913 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:05.913 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:06.478 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:06.478 "name": "raid_bdev1", 00:37:06.478 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:06.478 "strip_size_kb": 0, 00:37:06.478 "state": "online", 00:37:06.478 "raid_level": "raid1", 00:37:06.478 "superblock": false, 00:37:06.478 "num_base_bdevs": 4, 00:37:06.478 "num_base_bdevs_discovered": 3, 00:37:06.478 "num_base_bdevs_operational": 3, 00:37:06.478 "base_bdevs_list": [ 00:37:06.478 { 00:37:06.478 "name": null, 00:37:06.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:06.478 "is_configured": false, 00:37:06.478 "data_offset": 0, 00:37:06.478 "data_size": 65536 00:37:06.478 }, 00:37:06.478 { 00:37:06.478 "name": "BaseBdev2", 00:37:06.478 "uuid": "f034e221-63cd-5efe-abeb-ef79c7ecb205", 00:37:06.478 "is_configured": true, 00:37:06.478 "data_offset": 0, 00:37:06.478 "data_size": 65536 00:37:06.478 }, 00:37:06.478 { 00:37:06.478 "name": "BaseBdev3", 00:37:06.478 "uuid": 
"05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:06.478 "is_configured": true, 00:37:06.478 "data_offset": 0, 00:37:06.478 "data_size": 65536 00:37:06.478 }, 00:37:06.478 { 00:37:06.478 "name": "BaseBdev4", 00:37:06.478 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:06.478 "is_configured": true, 00:37:06.478 "data_offset": 0, 00:37:06.478 "data_size": 65536 00:37:06.478 } 00:37:06.478 ] 00:37:06.478 }' 00:37:06.478 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:06.478 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:06.478 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:06.478 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:06.478 12:39:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:06.736 [2024-06-07 12:39:30.251174] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:06.736 12:39:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:37:06.736 [2024-06-07 12:39:30.300943] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002c80 00:37:06.736 [2024-06-07 12:39:30.302972] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:06.995 [2024-06-07 12:39:30.428246] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:06.995 [2024-06-07 12:39:30.542560] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:06.995 [2024-06-07 12:39:30.543188] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:07.561 [2024-06-07 12:39:30.983172] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:07.819 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:07.819 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:07.819 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:07.819 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:07.819 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:07.819 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:07.819 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:07.819 [2024-06-07 12:39:31.325823] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:08.076 [2024-06-07 12:39:31.531033] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:08.076 [2024-06-07 12:39:31.531651] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:08.076 12:39:31 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:08.076 "name": "raid_bdev1", 00:37:08.076 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:08.076 "strip_size_kb": 0, 00:37:08.076 "state": "online", 00:37:08.076 "raid_level": "raid1", 00:37:08.076 "superblock": false, 00:37:08.076 "num_base_bdevs": 4, 00:37:08.076 "num_base_bdevs_discovered": 4, 00:37:08.076 "num_base_bdevs_operational": 4, 00:37:08.076 "process": { 00:37:08.076 "type": "rebuild", 00:37:08.076 "target": "spare", 00:37:08.077 "progress": { 00:37:08.077 "blocks": 16384, 00:37:08.077 "percent": 25 00:37:08.077 } 00:37:08.077 }, 00:37:08.077 "base_bdevs_list": [ 00:37:08.077 { 00:37:08.077 "name": "spare", 00:37:08.077 "uuid": "a24acdb5-b5b5-5471-a74f-1b45b4af8cea", 00:37:08.077 "is_configured": true, 00:37:08.077 "data_offset": 0, 00:37:08.077 "data_size": 65536 00:37:08.077 }, 00:37:08.077 { 00:37:08.077 "name": "BaseBdev2", 00:37:08.077 "uuid": "f034e221-63cd-5efe-abeb-ef79c7ecb205", 00:37:08.077 "is_configured": true, 00:37:08.077 "data_offset": 0, 00:37:08.077 "data_size": 65536 00:37:08.077 }, 00:37:08.077 { 00:37:08.077 "name": "BaseBdev3", 00:37:08.077 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:08.077 "is_configured": true, 00:37:08.077 "data_offset": 0, 00:37:08.077 "data_size": 65536 00:37:08.077 }, 00:37:08.077 { 00:37:08.077 "name": "BaseBdev4", 00:37:08.077 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:08.077 "is_configured": true, 00:37:08.077 "data_offset": 0, 00:37:08.077 "data_size": 65536 00:37:08.077 } 00:37:08.077 ] 00:37:08.077 }' 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:37:08.077 12:39:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:37:08.335 [2024-06-07 12:39:31.759860] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:37:08.335 [2024-06-07 12:39:31.869555] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:37:08.594 [2024-06-07 12:39:32.032591] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:37:08.594 [2024-06-07 12:39:32.109869] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:37:08.594 [2024-06-07 12:39:32.212991] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000002a10 00:37:08.594 [2024-06-07 12:39:32.213280] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000002c80 
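
What just happened in the trace above (@690-694) is the interesting twist of this test: the rebuild onto spare is still in flight, and because the level is raid1 with more than two base bdevs, the test pulls a second base bdev out from under it. A hedged sketch of that branch, with the bookkeeping that appears at the start of the next trace lines; the if-shape is inferred from the traced '[' raid1 = raid1 ']' / '[' 4 -gt 2 ']' checks, not read from the script itself:

    # inside raid_rebuild_test()
    local num_base_bdevs_operational=4
    if [ "$raid_level" = raid1 ] && [ "$num_base_bdevs" -gt 2 ]; then
        # remove another base bdev while the rebuild is still running
        $rpc_py bdev_raid_remove_base_bdev BaseBdev2
        base_bdevs[1]=      # slot 1 is now empty; the nbd loop later skips it
        (( num_base_bdevs_operational-- ))
    fi
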
00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:08.852 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:08.852 [2024-06-07 12:39:32.352877] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:37:09.142 [2024-06-07 12:39:32.579652] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:09.142 "name": "raid_bdev1", 00:37:09.142 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:09.142 "strip_size_kb": 0, 00:37:09.142 "state": "online", 00:37:09.142 "raid_level": "raid1", 00:37:09.142 "superblock": false, 00:37:09.142 "num_base_bdevs": 4, 00:37:09.142 "num_base_bdevs_discovered": 3, 00:37:09.142 "num_base_bdevs_operational": 3, 00:37:09.142 "process": { 00:37:09.142 "type": "rebuild", 00:37:09.142 "target": "spare", 00:37:09.142 "progress": { 00:37:09.142 "blocks": 30720, 00:37:09.142 "percent": 46 00:37:09.142 } 00:37:09.142 }, 00:37:09.142 "base_bdevs_list": [ 00:37:09.142 { 00:37:09.142 "name": "spare", 00:37:09.142 "uuid": "a24acdb5-b5b5-5471-a74f-1b45b4af8cea", 00:37:09.142 "is_configured": true, 00:37:09.142 "data_offset": 0, 00:37:09.142 "data_size": 65536 00:37:09.142 }, 00:37:09.142 { 00:37:09.142 "name": null, 00:37:09.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:09.142 "is_configured": false, 00:37:09.142 "data_offset": 0, 00:37:09.142 "data_size": 65536 00:37:09.142 }, 00:37:09.142 { 00:37:09.142 "name": "BaseBdev3", 00:37:09.142 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:09.142 "is_configured": true, 00:37:09.142 "data_offset": 0, 00:37:09.142 "data_size": 65536 00:37:09.142 }, 00:37:09.142 { 00:37:09.142 "name": "BaseBdev4", 00:37:09.142 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:09.142 "is_configured": true, 00:37:09.142 "data_offset": 0, 00:37:09.142 "data_size": 65536 00:37:09.142 } 00:37:09.142 ] 00:37:09.142 }' 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@705 -- # local timeout=979 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:09.142 12:39:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:09.419 12:39:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:09.419 "name": "raid_bdev1", 00:37:09.419 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:09.419 "strip_size_kb": 0, 00:37:09.419 "state": "online", 00:37:09.419 "raid_level": "raid1", 00:37:09.419 "superblock": false, 00:37:09.419 "num_base_bdevs": 4, 00:37:09.419 "num_base_bdevs_discovered": 3, 00:37:09.419 "num_base_bdevs_operational": 3, 00:37:09.419 "process": { 00:37:09.419 "type": "rebuild", 00:37:09.419 "target": "spare", 00:37:09.419 "progress": { 00:37:09.419 "blocks": 38912, 00:37:09.419 "percent": 59 00:37:09.419 } 00:37:09.419 }, 00:37:09.419 "base_bdevs_list": [ 00:37:09.419 { 00:37:09.419 "name": "spare", 00:37:09.419 "uuid": "a24acdb5-b5b5-5471-a74f-1b45b4af8cea", 00:37:09.419 "is_configured": true, 00:37:09.419 "data_offset": 0, 00:37:09.419 "data_size": 65536 00:37:09.419 }, 00:37:09.419 { 00:37:09.419 "name": null, 00:37:09.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:09.419 "is_configured": false, 00:37:09.419 "data_offset": 0, 00:37:09.419 "data_size": 65536 00:37:09.419 }, 00:37:09.419 { 00:37:09.419 "name": "BaseBdev3", 00:37:09.419 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:09.419 "is_configured": true, 00:37:09.419 "data_offset": 0, 00:37:09.419 "data_size": 65536 00:37:09.419 }, 00:37:09.419 { 00:37:09.419 "name": "BaseBdev4", 00:37:09.419 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:09.419 "is_configured": true, 00:37:09.419 "data_offset": 0, 00:37:09.419 "data_size": 65536 00:37:09.419 } 00:37:09.419 ] 00:37:09.419 }' 00:37:09.419 12:39:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:09.419 12:39:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:09.677 12:39:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:09.677 12:39:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:09.677 12:39:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:10.612 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:10.612 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:10.612 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:10.612 
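
The local timeout=979 / (( SECONDS < timeout )) pair above is a deadline check against bash's built-in SECONDS counter, and the repeated verify/sleep traces around it are the polling loop it guards. As a sketch (the +30 allowance is an assumption; only the computed deadline, 979, is visible in the trace):

    local timeout=$((SECONDS + 30))      # trace shows the computed result, 979
    while (( SECONDS < timeout )); do
        if ! verify_raid_bdev_process raid_bdev1 rebuild spare; then
            break    # .process gone from the RPC dump: rebuild finished (@708)
        fi
        sleep 1
    done
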
12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:10.612 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:10.612 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:10.612 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:10.612 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:10.870 [2024-06-07 12:39:34.287953] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:37:10.870 [2024-06-07 12:39:34.391957] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:37:10.870 [2024-06-07 12:39:34.396053] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:10.870 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:10.870 "name": "raid_bdev1", 00:37:10.870 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:10.870 "strip_size_kb": 0, 00:37:10.870 "state": "online", 00:37:10.870 "raid_level": "raid1", 00:37:10.870 "superblock": false, 00:37:10.870 "num_base_bdevs": 4, 00:37:10.870 "num_base_bdevs_discovered": 3, 00:37:10.870 "num_base_bdevs_operational": 3, 00:37:10.870 "base_bdevs_list": [ 00:37:10.870 { 00:37:10.870 "name": "spare", 00:37:10.871 "uuid": "a24acdb5-b5b5-5471-a74f-1b45b4af8cea", 00:37:10.871 "is_configured": true, 00:37:10.871 "data_offset": 0, 00:37:10.871 "data_size": 65536 00:37:10.871 }, 00:37:10.871 { 00:37:10.871 "name": null, 00:37:10.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:10.871 "is_configured": false, 00:37:10.871 "data_offset": 0, 00:37:10.871 "data_size": 65536 00:37:10.871 }, 00:37:10.871 { 00:37:10.871 "name": "BaseBdev3", 00:37:10.871 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:10.871 "is_configured": true, 00:37:10.871 "data_offset": 0, 00:37:10.871 "data_size": 65536 00:37:10.871 }, 00:37:10.871 { 00:37:10.871 "name": "BaseBdev4", 00:37:10.871 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:10.871 "is_configured": true, 00:37:10.871 "data_offset": 0, 00:37:10.871 "data_size": 65536 00:37:10.871 } 00:37:10.871 ] 00:37:10.871 }' 00:37:10.871 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:11.129 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:11.388 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:11.388 "name": "raid_bdev1", 00:37:11.388 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:11.388 "strip_size_kb": 0, 00:37:11.388 "state": "online", 00:37:11.388 "raid_level": "raid1", 00:37:11.388 "superblock": false, 00:37:11.388 "num_base_bdevs": 4, 00:37:11.388 "num_base_bdevs_discovered": 3, 00:37:11.388 "num_base_bdevs_operational": 3, 00:37:11.388 "base_bdevs_list": [ 00:37:11.388 { 00:37:11.388 "name": "spare", 00:37:11.388 "uuid": "a24acdb5-b5b5-5471-a74f-1b45b4af8cea", 00:37:11.388 "is_configured": true, 00:37:11.388 "data_offset": 0, 00:37:11.388 "data_size": 65536 00:37:11.388 }, 00:37:11.388 { 00:37:11.388 "name": null, 00:37:11.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:11.388 "is_configured": false, 00:37:11.388 "data_offset": 0, 00:37:11.388 "data_size": 65536 00:37:11.388 }, 00:37:11.388 { 00:37:11.388 "name": "BaseBdev3", 00:37:11.388 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:11.388 "is_configured": true, 00:37:11.388 "data_offset": 0, 00:37:11.388 "data_size": 65536 00:37:11.388 }, 00:37:11.388 { 00:37:11.388 "name": "BaseBdev4", 00:37:11.388 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:11.388 "is_configured": true, 00:37:11.388 "data_offset": 0, 00:37:11.388 "data_size": 65536 00:37:11.388 } 00:37:11.388 ] 00:37:11.388 }' 00:37:11.388 12:39:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:11.388 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:11.388 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:11.646 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:11.904 12:39:35 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:11.904 "name": "raid_bdev1", 00:37:11.904 "uuid": "5471f8a5-a7c4-439b-add0-5226d61d7ff8", 00:37:11.904 "strip_size_kb": 0, 00:37:11.904 "state": "online", 00:37:11.904 "raid_level": "raid1", 00:37:11.904 "superblock": false, 00:37:11.904 "num_base_bdevs": 4, 00:37:11.904 "num_base_bdevs_discovered": 3, 00:37:11.904 "num_base_bdevs_operational": 3, 00:37:11.904 "base_bdevs_list": [ 00:37:11.904 { 00:37:11.904 "name": "spare", 00:37:11.904 "uuid": "a24acdb5-b5b5-5471-a74f-1b45b4af8cea", 00:37:11.904 "is_configured": true, 00:37:11.904 "data_offset": 0, 00:37:11.904 "data_size": 65536 00:37:11.904 }, 00:37:11.904 { 00:37:11.904 "name": null, 00:37:11.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:11.904 "is_configured": false, 00:37:11.904 "data_offset": 0, 00:37:11.904 "data_size": 65536 00:37:11.904 }, 00:37:11.904 { 00:37:11.904 "name": "BaseBdev3", 00:37:11.904 "uuid": "05e2dcd4-390b-57a6-a0aa-d29cdf7e0fc4", 00:37:11.904 "is_configured": true, 00:37:11.904 "data_offset": 0, 00:37:11.904 "data_size": 65536 00:37:11.904 }, 00:37:11.904 { 00:37:11.904 "name": "BaseBdev4", 00:37:11.904 "uuid": "c6a4767b-b460-5b01-bd25-d0c62efc7661", 00:37:11.904 "is_configured": true, 00:37:11.904 "data_offset": 0, 00:37:11.904 "data_size": 65536 00:37:11.904 } 00:37:11.904 ] 00:37:11.904 }' 00:37:11.904 12:39:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:11.904 12:39:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:12.470 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:37:12.728 [2024-06-07 12:39:36.359511] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:37:12.728 [2024-06-07 12:39:36.359827] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:37:13.036 00:37:13.036 Latency(us) 00:37:13.036 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:13.036 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:37:13.036 raid_bdev1 : 11.05 129.36 388.07 0.00 0.00 11360.67 483.72 113346.07 00:37:13.036 =================================================================================================================== 00:37:13.036 Total : 129.36 388.07 0.00 0.00 11360.67 483.72 113346.07 00:37:13.036 [2024-06-07 12:39:36.443682] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:13.036 [2024-06-07 12:39:36.443897] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:37:13.036 0 00:37:13.036 [2024-06-07 12:39:36.444026] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:37:13.036 [2024-06-07 12:39:36.444040] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:37:13.036 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:37:13.036 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 
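
With the RAID-state checks done and raid_bdev1 deleted, the nbd trace that follows is the actual data-integrity pass. Because the array is raid1, every base bdev mirrors the same data, so the rebuilt spare must compare byte-equal to each surviving base bdev. Roughly, from the @724-733 trace lines (rpc_server standing for the traced /var/tmp/spdk-raid.sock):

    # export the rebuilt spare once on nbd0, then each remaining base bdev
    # on nbd1, and byte-compare the two exports
    nbd_start_disks "$rpc_server" spare /dev/nbd0
    for bdev in "${base_bdevs[@]:1}"; do
        if [ -z "$bdev" ]; then
            continue    # slot emptied by the mid-rebuild removal of BaseBdev2
        fi
        nbd_start_disks "$rpc_server" "$bdev" /dev/nbd1
        cmp -i 0 /dev/nbd0 /dev/nbd1    # -i N skips N bytes; 0, no superblock
        nbd_stop_disks "$rpc_server" /dev/nbd1
    done
    nbd_stop_disks "$rpc_server" /dev/nbd0
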
00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:13.294 12:39:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:37:13.552 /dev/nbd0 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:13.552 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:13.810 1+0 records in 00:37:13.810 1+0 records out 00:37:13.810 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000537236 s, 7.6 MB/s 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:37:13.810 
12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:13.810 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:37:14.068 /dev/nbd1 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:14.068 1+0 records in 00:37:14.068 1+0 records out 00:37:14.068 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000547403 s, 7.5 MB/s 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:37:14.068 
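
The waitfornbd helper, traced twice above (once for nbd0, once for nbd1), is the guard that makes the cmp safe: it waits for the kernel to publish the device and then proves a direct read actually works. Reconstructed from the common/autotest_common.sh@867-888 trace lines; the retry sleep and the failure return are assumptions (both traced runs succeed on the first attempt), everything else is as traced:

    waitfornbd() {
        local nbd_name=$1
        local i
        local tmp=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest  # path as traced

        # wait up to 20 attempts for the device to appear in /proc/partitions
        for ((i = 1; i <= 20; i++)); do
            if grep -q -w "$nbd_name" /proc/partitions; then
                break
            fi
            sleep 0.1    # assumed retry interval
        done

        # one 4 KiB O_DIRECT read: the device must actually serve I/O,
        # not merely exist in /proc/partitions
        for ((i = 1; i <= 20; i++)); do
            dd if=/dev/$nbd_name of="$tmp" bs=4096 count=1 iflag=direct
            size=$(stat -c %s "$tmp")
            rm -f "$tmp"
            if [ "$size" != "0" ]; then
                return 0
            fi
            sleep 0.1    # assumed retry interval
        done
        return 1    # assumed failure path
    }
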
12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:14.068 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:14.326 12:39:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:37:14.584 /dev/nbd1 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:14.843 12:39:38 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:14.843 1+0 records in 00:37:14.843 1+0 records out 00:37:14.843 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000458977 s, 8.9 MB/s 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:14.843 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@41 -- # break 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:15.102 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 225446 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 225446 ']' 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 225446 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 225446 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 225446' 00:37:15.360 killing process with pid 225446 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 225446 00:37:15.360 Received shutdown signal, test time was about 13.589044 seconds 00:37:15.360 00:37:15.360 Latency(us) 00:37:15.360 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:15.360 =================================================================================================================== 00:37:15.360 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:15.360 12:39:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 225446 00:37:15.360 [2024-06-07 12:39:38.974086] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:37:15.618 [2024-06-07 12:39:39.021527] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:37:15.618 12:39:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:37:15.618 00:37:15.618 real 0m19.163s 00:37:15.618 user 0m30.815s 00:37:15.618 sys 0m3.564s 00:37:15.618 12:39:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:37:15.880 ************************************ 00:37:15.880 END TEST raid_rebuild_test_io 00:37:15.880 ************************************ 00:37:15.880 12:39:39 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:37:15.880 12:39:39 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:37:15.880 12:39:39 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:37:15.880 12:39:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:37:15.880 ************************************ 00:37:15.880 START TEST raid_rebuild_test_sb_io 00:37:15.880 ************************************ 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true true true 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev3 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # echo BaseBdev4 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:37:15.880 12:39:39 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=225918 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 225918 /var/tmp/spdk-raid.sock 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 225918 ']' 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:37:15.880 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:37:15.880 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:15.880 [2024-06-07 12:39:39.402820] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:37:15.880 [2024-06-07 12:39:39.403464] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid225918 ] 00:37:15.880 I/O size of 3145728 is greater than zero copy threshold (65536). 00:37:15.880 Zero copy mechanism will not be used. 
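
raid_rebuild_test_sb_io repeats the rebuild scenario, but with superblock=true (-s) and background_io=true: bdevperf is launched suspended and serves both as the RPC host and as the traffic generator. The launch just traced corresponds to roughly the following, with $rootdir standing in for the traced /home/vagrant/spdk_repo/spdk; the flags are verbatim from the trace, and -z in particular makes bdevperf sit idle until told to start, which is why a separate perform_tests RPC appears at the end of this section:

    "$rootdir/build/examples/bdevperf" -r "$rpc_server" -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!    # 225918 in the trace
    waitforlisten "$raid_pid" "$rpc_server"
    # ...assemble the base bdevs and raid_bdev1 over RPC (traced below)...
    "$rootdir/examples/bdev/bdevperf/bdevperf.py" -s "$rpc_server" perform_tests &

The -o 3M also explains the repeated notice in the log: 3 MiB I/Os (3145728 bytes) exceed the 65536-byte zero-copy threshold, so bdevperf disables zero copy for this run.
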
00:37:16.138 [2024-06-07 12:39:39.551494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:16.138 [2024-06-07 12:39:39.610517] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:37:16.138 [2024-06-07 12:39:39.655713] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:16.138 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:37:16.138 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:37:16.139 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:16.139 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:37:16.397 BaseBdev1_malloc 00:37:16.397 12:39:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:37:16.696 [2024-06-07 12:39:40.234899] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:37:16.696 [2024-06-07 12:39:40.235276] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:16.696 [2024-06-07 12:39:40.235480] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:37:16.696 [2024-06-07 12:39:40.235656] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:16.696 [2024-06-07 12:39:40.237865] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:16.696 [2024-06-07 12:39:40.238070] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:37:16.696 BaseBdev1 00:37:16.696 12:39:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:16.696 12:39:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:37:16.955 BaseBdev2_malloc 00:37:16.955 12:39:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:37:17.214 [2024-06-07 12:39:40.827777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:37:17.214 [2024-06-07 12:39:40.828092] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:17.214 [2024-06-07 12:39:40.828186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:37:17.214 [2024-06-07 12:39:40.828344] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:17.214 [2024-06-07 12:39:40.830327] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:17.214 [2024-06-07 12:39:40.830504] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:37:17.214 BaseBdev2 00:37:17.214 12:39:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:17.214 12:39:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:37:17.472 BaseBdev3_malloc 00:37:17.730 12:39:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:37:17.730 [2024-06-07 12:39:41.334538] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:37:17.730 [2024-06-07 12:39:41.334902] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:17.730 [2024-06-07 12:39:41.334980] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007280 00:37:17.730 [2024-06-07 12:39:41.335109] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:17.730 [2024-06-07 12:39:41.336924] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:17.730 [2024-06-07 12:39:41.337113] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:37:17.730 BaseBdev3 00:37:17.730 12:39:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:37:17.730 12:39:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:37:17.989 BaseBdev4_malloc 00:37:17.990 12:39:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:37:18.249 [2024-06-07 12:39:41.887122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:37:18.249 [2024-06-07 12:39:41.887523] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:18.249 [2024-06-07 12:39:41.887622] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007e80 00:37:18.249 [2024-06-07 12:39:41.887770] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:18.249 [2024-06-07 12:39:41.889863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:18.249 [2024-06-07 12:39:41.890053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:37:18.249 BaseBdev4 00:37:18.508 12:39:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:37:18.508 spare_malloc 00:37:18.767 12:39:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:37:18.767 spare_delay 00:37:18.767 12:39:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:19.025 [2024-06-07 12:39:42.619614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:19.025 [2024-06-07 12:39:42.620070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:19.025 [2024-06-07 12:39:42.620292] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009080 00:37:19.025 [2024-06-07 12:39:42.620537] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:19.025 [2024-06-07 12:39:42.622404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:37:19.025 [2024-06-07 12:39:42.622613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:19.025 spare 00:37:19.025 12:39:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:37:19.598 [2024-06-07 12:39:43.011786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:19.598 [2024-06-07 12:39:43.013717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:37:19.598 [2024-06-07 12:39:43.013877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:37:19.598 [2024-06-07 12:39:43.013953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:37:19.598 [2024-06-07 12:39:43.014216] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:37:19.598 [2024-06-07 12:39:43.014325] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:37:19.598 [2024-06-07 12:39:43.014477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000027a0 00:37:19.598 [2024-06-07 12:39:43.014868] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:37:19.598 [2024-06-07 12:39:43.014974] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:37:19.598 [2024-06-07 12:39:43.015166] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:19.598 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:19.855 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:19.855 "name": "raid_bdev1", 00:37:19.855 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:19.855 "strip_size_kb": 0, 00:37:19.855 "state": "online", 00:37:19.855 "raid_level": "raid1", 00:37:19.855 "superblock": true, 00:37:19.855 "num_base_bdevs": 4, 00:37:19.855 "num_base_bdevs_discovered": 4, 00:37:19.855 
"num_base_bdevs_operational": 4, 00:37:19.855 "base_bdevs_list": [ 00:37:19.855 { 00:37:19.855 "name": "BaseBdev1", 00:37:19.855 "uuid": "9dd22be2-7904-5a32-940f-4d3553b45aa1", 00:37:19.855 "is_configured": true, 00:37:19.855 "data_offset": 2048, 00:37:19.855 "data_size": 63488 00:37:19.855 }, 00:37:19.855 { 00:37:19.855 "name": "BaseBdev2", 00:37:19.855 "uuid": "9eac55ed-1393-511a-a4f0-f68e165f33fc", 00:37:19.855 "is_configured": true, 00:37:19.855 "data_offset": 2048, 00:37:19.855 "data_size": 63488 00:37:19.855 }, 00:37:19.855 { 00:37:19.855 "name": "BaseBdev3", 00:37:19.855 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:19.855 "is_configured": true, 00:37:19.855 "data_offset": 2048, 00:37:19.855 "data_size": 63488 00:37:19.855 }, 00:37:19.855 { 00:37:19.856 "name": "BaseBdev4", 00:37:19.856 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:19.856 "is_configured": true, 00:37:19.856 "data_offset": 2048, 00:37:19.856 "data_size": 63488 00:37:19.856 } 00:37:19.856 ] 00:37:19.856 }' 00:37:19.856 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:19.856 12:39:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:20.787 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:37:20.787 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:37:20.787 [2024-06-07 12:39:44.424098] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:37:21.045 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:37:21.045 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:21.045 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:37:21.045 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:37:21.045 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:37:21.045 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:37:21.045 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:37:21.303 [2024-06-07 12:39:44.813897] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002a10 00:37:21.303 I/O size of 3145728 is greater than zero copy threshold (65536). 00:37:21.303 Zero copy mechanism will not be used. 00:37:21.303 Running I/O for 60 seconds... 
00:37:21.303 [2024-06-07 12:39:44.896712] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:37:21.303 [2024-06-07 12:39:44.897192] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d000002a10 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:21.303 12:39:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:21.870 12:39:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:21.870 "name": "raid_bdev1", 00:37:21.870 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:21.871 "strip_size_kb": 0, 00:37:21.871 "state": "online", 00:37:21.871 "raid_level": "raid1", 00:37:21.871 "superblock": true, 00:37:21.871 "num_base_bdevs": 4, 00:37:21.871 "num_base_bdevs_discovered": 3, 00:37:21.871 "num_base_bdevs_operational": 3, 00:37:21.871 "base_bdevs_list": [ 00:37:21.871 { 00:37:21.871 "name": null, 00:37:21.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:21.871 "is_configured": false, 00:37:21.871 "data_offset": 2048, 00:37:21.871 "data_size": 63488 00:37:21.871 }, 00:37:21.871 { 00:37:21.871 "name": "BaseBdev2", 00:37:21.871 "uuid": "9eac55ed-1393-511a-a4f0-f68e165f33fc", 00:37:21.871 "is_configured": true, 00:37:21.871 "data_offset": 2048, 00:37:21.871 "data_size": 63488 00:37:21.871 }, 00:37:21.871 { 00:37:21.871 "name": "BaseBdev3", 00:37:21.871 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:21.871 "is_configured": true, 00:37:21.871 "data_offset": 2048, 00:37:21.871 "data_size": 63488 00:37:21.871 }, 00:37:21.871 { 00:37:21.871 "name": "BaseBdev4", 00:37:21.871 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:21.871 "is_configured": true, 00:37:21.871 "data_offset": 2048, 00:37:21.871 "data_size": 63488 00:37:21.871 } 00:37:21.871 ] 00:37:21.871 }' 00:37:21.871 12:39:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:21.871 12:39:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:22.437 12:39:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:22.696 [2024-06-07 
12:39:46.149432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:22.696 12:39:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:37:22.696 [2024-06-07 12:39:46.204480] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002ae0 00:37:22.696 [2024-06-07 12:39:46.206943] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:22.696 [2024-06-07 12:39:46.314497] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:22.696 [2024-06-07 12:39:46.316453] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:22.955 [2024-06-07 12:39:46.540125] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:22.955 [2024-06-07 12:39:46.540791] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:23.521 [2024-06-07 12:39:46.868572] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:23.521 [2024-06-07 12:39:46.869514] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:23.521 [2024-06-07 12:39:46.993441] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:23.521 [2024-06-07 12:39:46.998521] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:23.779 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:23.779 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:23.779 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:23.780 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:23.780 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:23.780 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:23.780 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:23.780 [2024-06-07 12:39:47.331807] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:23.780 [2024-06-07 12:39:47.333735] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:24.038 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:24.038 "name": "raid_bdev1", 00:37:24.038 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:24.038 "strip_size_kb": 0, 00:37:24.038 "state": "online", 00:37:24.038 "raid_level": "raid1", 00:37:24.038 "superblock": true, 00:37:24.038 "num_base_bdevs": 4, 00:37:24.038 "num_base_bdevs_discovered": 4, 00:37:24.038 "num_base_bdevs_operational": 4, 00:37:24.038 "process": { 00:37:24.038 "type": "rebuild", 00:37:24.038 "target": "spare", 00:37:24.038 "progress": { 00:37:24.038 "blocks": 
14336, 00:37:24.038 "percent": 22 00:37:24.038 } 00:37:24.038 }, 00:37:24.038 "base_bdevs_list": [ 00:37:24.038 { 00:37:24.038 "name": "spare", 00:37:24.038 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:24.038 "is_configured": true, 00:37:24.038 "data_offset": 2048, 00:37:24.038 "data_size": 63488 00:37:24.038 }, 00:37:24.038 { 00:37:24.038 "name": "BaseBdev2", 00:37:24.038 "uuid": "9eac55ed-1393-511a-a4f0-f68e165f33fc", 00:37:24.038 "is_configured": true, 00:37:24.038 "data_offset": 2048, 00:37:24.038 "data_size": 63488 00:37:24.038 }, 00:37:24.038 { 00:37:24.038 "name": "BaseBdev3", 00:37:24.038 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:24.038 "is_configured": true, 00:37:24.038 "data_offset": 2048, 00:37:24.038 "data_size": 63488 00:37:24.038 }, 00:37:24.038 { 00:37:24.038 "name": "BaseBdev4", 00:37:24.038 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:24.038 "is_configured": true, 00:37:24.038 "data_offset": 2048, 00:37:24.038 "data_size": 63488 00:37:24.038 } 00:37:24.038 ] 00:37:24.038 }' 00:37:24.038 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:24.038 [2024-06-07 12:39:47.538694] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:24.038 [2024-06-07 12:39:47.539834] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:24.038 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:24.038 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:24.038 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:24.038 12:39:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:37:24.297 [2024-06-07 12:39:47.860391] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:37:24.297 [2024-06-07 12:39:47.862285] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:37:24.297 [2024-06-07 12:39:47.902535] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:24.555 [2024-06-07 12:39:47.964566] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:37:24.555 [2024-06-07 12:39:47.965495] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:37:24.556 [2024-06-07 12:39:47.970828] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:24.556 [2024-06-07 12:39:47.981435] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:24.556 [2024-06-07 12:39:47.981770] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:24.556 [2024-06-07 12:39:47.981820] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:24.556 [2024-06-07 12:39:47.992369] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x60d000002a10 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 3 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:24.556 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:24.813 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:24.814 "name": "raid_bdev1", 00:37:24.814 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:24.814 "strip_size_kb": 0, 00:37:24.814 "state": "online", 00:37:24.814 "raid_level": "raid1", 00:37:24.814 "superblock": true, 00:37:24.814 "num_base_bdevs": 4, 00:37:24.814 "num_base_bdevs_discovered": 3, 00:37:24.814 "num_base_bdevs_operational": 3, 00:37:24.814 "base_bdevs_list": [ 00:37:24.814 { 00:37:24.814 "name": null, 00:37:24.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:24.814 "is_configured": false, 00:37:24.814 "data_offset": 2048, 00:37:24.814 "data_size": 63488 00:37:24.814 }, 00:37:24.814 { 00:37:24.814 "name": "BaseBdev2", 00:37:24.814 "uuid": "9eac55ed-1393-511a-a4f0-f68e165f33fc", 00:37:24.814 "is_configured": true, 00:37:24.814 "data_offset": 2048, 00:37:24.814 "data_size": 63488 00:37:24.814 }, 00:37:24.814 { 00:37:24.814 "name": "BaseBdev3", 00:37:24.814 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:24.814 "is_configured": true, 00:37:24.814 "data_offset": 2048, 00:37:24.814 "data_size": 63488 00:37:24.814 }, 00:37:24.814 { 00:37:24.814 "name": "BaseBdev4", 00:37:24.814 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:24.814 "is_configured": true, 00:37:24.814 "data_offset": 2048, 00:37:24.814 "data_size": 63488 00:37:24.814 } 00:37:24.814 ] 00:37:24.814 }' 00:37:24.814 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:24.814 12:39:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:25.748 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:25.748 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:25.748 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:25.748 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:25.748 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:25.748 
12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:25.748 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:26.006 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:26.006 "name": "raid_bdev1", 00:37:26.006 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:26.006 "strip_size_kb": 0, 00:37:26.006 "state": "online", 00:37:26.006 "raid_level": "raid1", 00:37:26.006 "superblock": true, 00:37:26.006 "num_base_bdevs": 4, 00:37:26.006 "num_base_bdevs_discovered": 3, 00:37:26.006 "num_base_bdevs_operational": 3, 00:37:26.006 "base_bdevs_list": [ 00:37:26.006 { 00:37:26.006 "name": null, 00:37:26.006 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:26.006 "is_configured": false, 00:37:26.006 "data_offset": 2048, 00:37:26.006 "data_size": 63488 00:37:26.006 }, 00:37:26.006 { 00:37:26.006 "name": "BaseBdev2", 00:37:26.006 "uuid": "9eac55ed-1393-511a-a4f0-f68e165f33fc", 00:37:26.006 "is_configured": true, 00:37:26.006 "data_offset": 2048, 00:37:26.006 "data_size": 63488 00:37:26.006 }, 00:37:26.006 { 00:37:26.006 "name": "BaseBdev3", 00:37:26.006 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:26.006 "is_configured": true, 00:37:26.006 "data_offset": 2048, 00:37:26.006 "data_size": 63488 00:37:26.006 }, 00:37:26.006 { 00:37:26.006 "name": "BaseBdev4", 00:37:26.006 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:26.006 "is_configured": true, 00:37:26.006 "data_offset": 2048, 00:37:26.006 "data_size": 63488 00:37:26.006 } 00:37:26.006 ] 00:37:26.006 }' 00:37:26.006 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:26.006 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:26.006 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:26.006 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:26.006 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:26.264 [2024-06-07 12:39:49.855902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:26.264 [2024-06-07 12:39:49.899646] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002c80 00:37:26.264 [2024-06-07 12:39:49.902145] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:26.523 12:39:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:37:26.523 [2024-06-07 12:39:50.011546] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:26.523 [2024-06-07 12:39:50.012556] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:37:26.781 [2024-06-07 12:39:50.216486] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:26.781 [2024-06-07 12:39:50.217121] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:37:27.039 [2024-06-07 12:39:50.461760] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:27.039 [2024-06-07 12:39:50.463752] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:37:27.039 [2024-06-07 12:39:50.674897] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:27.039 [2024-06-07 12:39:50.676061] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:37:27.346 12:39:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:27.346 12:39:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:27.346 12:39:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:27.346 12:39:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:27.346 12:39:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:27.346 12:39:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:27.346 12:39:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:27.604 [2024-06-07 12:39:51.008630] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:27.604 [2024-06-07 12:39:51.009553] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:37:27.604 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:27.604 "name": "raid_bdev1", 00:37:27.604 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:27.604 "strip_size_kb": 0, 00:37:27.604 "state": "online", 00:37:27.604 "raid_level": "raid1", 00:37:27.604 "superblock": true, 00:37:27.604 "num_base_bdevs": 4, 00:37:27.604 "num_base_bdevs_discovered": 4, 00:37:27.604 "num_base_bdevs_operational": 4, 00:37:27.604 "process": { 00:37:27.604 "type": "rebuild", 00:37:27.604 "target": "spare", 00:37:27.604 "progress": { 00:37:27.604 "blocks": 14336, 00:37:27.604 "percent": 22 00:37:27.604 } 00:37:27.604 }, 00:37:27.604 "base_bdevs_list": [ 00:37:27.604 { 00:37:27.604 "name": "spare", 00:37:27.604 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:27.604 "is_configured": true, 00:37:27.604 "data_offset": 2048, 00:37:27.604 "data_size": 63488 00:37:27.604 }, 00:37:27.604 { 00:37:27.604 "name": "BaseBdev2", 00:37:27.604 "uuid": "9eac55ed-1393-511a-a4f0-f68e165f33fc", 00:37:27.604 "is_configured": true, 00:37:27.604 "data_offset": 2048, 00:37:27.604 "data_size": 63488 00:37:27.604 }, 00:37:27.604 { 00:37:27.604 "name": "BaseBdev3", 00:37:27.604 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:27.604 "is_configured": true, 00:37:27.604 "data_offset": 2048, 00:37:27.604 "data_size": 63488 00:37:27.604 }, 00:37:27.604 { 00:37:27.604 "name": "BaseBdev4", 00:37:27.604 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:27.604 "is_configured": true, 00:37:27.604 "data_offset": 2048, 00:37:27.604 "data_size": 63488 00:37:27.604 } 00:37:27.604 ] 00:37:27.604 }' 00:37:27.604 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:37:27.604 [2024-06-07 12:39:51.223736] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:27.604 [2024-06-07 12:39:51.224731] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:37:27.863 /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:37:27.863 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:37:28.121 [2024-06-07 12:39:51.520943] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:37:28.379 [2024-06-07 12:39:51.775862] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000002a10 00:37:28.379 [2024-06-07 12:39:51.776191] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x60d000002c80 00:37:28.379 [2024-06-07 12:39:51.778039] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:28.379 12:39:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:28.379 [2024-06-07 12:39:51.900900] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:28.637 "name": "raid_bdev1", 00:37:28.637 "uuid": 
"2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:28.637 "strip_size_kb": 0, 00:37:28.637 "state": "online", 00:37:28.637 "raid_level": "raid1", 00:37:28.637 "superblock": true, 00:37:28.637 "num_base_bdevs": 4, 00:37:28.637 "num_base_bdevs_discovered": 3, 00:37:28.637 "num_base_bdevs_operational": 3, 00:37:28.637 "process": { 00:37:28.637 "type": "rebuild", 00:37:28.637 "target": "spare", 00:37:28.637 "progress": { 00:37:28.637 "blocks": 22528, 00:37:28.637 "percent": 35 00:37:28.637 } 00:37:28.637 }, 00:37:28.637 "base_bdevs_list": [ 00:37:28.637 { 00:37:28.637 "name": "spare", 00:37:28.637 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:28.637 "is_configured": true, 00:37:28.637 "data_offset": 2048, 00:37:28.637 "data_size": 63488 00:37:28.637 }, 00:37:28.637 { 00:37:28.637 "name": null, 00:37:28.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:28.637 "is_configured": false, 00:37:28.637 "data_offset": 2048, 00:37:28.637 "data_size": 63488 00:37:28.637 }, 00:37:28.637 { 00:37:28.637 "name": "BaseBdev3", 00:37:28.637 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:28.637 "is_configured": true, 00:37:28.637 "data_offset": 2048, 00:37:28.637 "data_size": 63488 00:37:28.637 }, 00:37:28.637 { 00:37:28.637 "name": "BaseBdev4", 00:37:28.637 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:28.637 "is_configured": true, 00:37:28.637 "data_offset": 2048, 00:37:28.637 "data_size": 63488 00:37:28.637 } 00:37:28.637 ] 00:37:28.637 }' 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=999 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:28.637 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:28.895 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:28.895 "name": "raid_bdev1", 00:37:28.895 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:28.895 "strip_size_kb": 0, 00:37:28.895 "state": "online", 00:37:28.895 "raid_level": "raid1", 00:37:28.895 "superblock": true, 00:37:28.895 "num_base_bdevs": 4, 00:37:28.895 "num_base_bdevs_discovered": 3, 00:37:28.895 "num_base_bdevs_operational": 3, 00:37:28.895 "process": { 00:37:28.895 "type": 
"rebuild", 00:37:28.895 "target": "spare", 00:37:28.895 "progress": { 00:37:28.895 "blocks": 28672, 00:37:28.895 "percent": 45 00:37:28.895 } 00:37:28.895 }, 00:37:28.895 "base_bdevs_list": [ 00:37:28.895 { 00:37:28.895 "name": "spare", 00:37:28.895 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:28.895 "is_configured": true, 00:37:28.895 "data_offset": 2048, 00:37:28.895 "data_size": 63488 00:37:28.895 }, 00:37:28.895 { 00:37:28.895 "name": null, 00:37:28.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:28.895 "is_configured": false, 00:37:28.895 "data_offset": 2048, 00:37:28.895 "data_size": 63488 00:37:28.895 }, 00:37:28.895 { 00:37:28.895 "name": "BaseBdev3", 00:37:28.895 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:28.895 "is_configured": true, 00:37:28.895 "data_offset": 2048, 00:37:28.895 "data_size": 63488 00:37:28.895 }, 00:37:28.895 { 00:37:28.895 "name": "BaseBdev4", 00:37:28.895 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:28.895 "is_configured": true, 00:37:28.895 "data_offset": 2048, 00:37:28.895 "data_size": 63488 00:37:28.895 } 00:37:28.895 ] 00:37:28.895 }' 00:37:28.895 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:28.895 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:28.895 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:29.153 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:29.153 12:39:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:29.411 [2024-06-07 12:39:53.005788] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:29.977 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:30.235 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:30.235 "name": "raid_bdev1", 00:37:30.235 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:30.235 "strip_size_kb": 0, 00:37:30.235 "state": "online", 00:37:30.235 "raid_level": "raid1", 00:37:30.235 "superblock": true, 00:37:30.235 "num_base_bdevs": 4, 00:37:30.235 "num_base_bdevs_discovered": 3, 00:37:30.235 "num_base_bdevs_operational": 3, 00:37:30.235 "process": { 00:37:30.235 "type": "rebuild", 00:37:30.235 "target": "spare", 00:37:30.235 "progress": { 00:37:30.235 "blocks": 55296, 00:37:30.235 "percent": 87 00:37:30.235 } 00:37:30.235 }, 00:37:30.235 "base_bdevs_list": [ 
00:37:30.235 { 00:37:30.235 "name": "spare", 00:37:30.235 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:30.235 "is_configured": true, 00:37:30.235 "data_offset": 2048, 00:37:30.235 "data_size": 63488 00:37:30.235 }, 00:37:30.235 { 00:37:30.235 "name": null, 00:37:30.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:30.235 "is_configured": false, 00:37:30.235 "data_offset": 2048, 00:37:30.235 "data_size": 63488 00:37:30.235 }, 00:37:30.235 { 00:37:30.235 "name": "BaseBdev3", 00:37:30.235 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:30.235 "is_configured": true, 00:37:30.235 "data_offset": 2048, 00:37:30.235 "data_size": 63488 00:37:30.235 }, 00:37:30.235 { 00:37:30.235 "name": "BaseBdev4", 00:37:30.235 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:30.235 "is_configured": true, 00:37:30.235 "data_offset": 2048, 00:37:30.235 "data_size": 63488 00:37:30.235 } 00:37:30.235 ] 00:37:30.235 }' 00:37:30.235 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:30.509 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:30.509 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:30.509 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:30.509 12:39:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:37:30.777 [2024-06-07 12:39:54.219306] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:37:30.777 [2024-06-07 12:39:54.319259] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:37:30.777 [2024-06-07 12:39:54.323366] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:31.344 12:39:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:31.911 "name": "raid_bdev1", 00:37:31.911 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:31.911 "strip_size_kb": 0, 00:37:31.911 "state": "online", 00:37:31.911 "raid_level": "raid1", 00:37:31.911 "superblock": true, 00:37:31.911 "num_base_bdevs": 4, 00:37:31.911 "num_base_bdevs_discovered": 3, 00:37:31.911 "num_base_bdevs_operational": 3, 00:37:31.911 "base_bdevs_list": [ 00:37:31.911 { 00:37:31.911 "name": "spare", 00:37:31.911 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:31.911 "is_configured": true, 00:37:31.911 "data_offset": 2048, 
00:37:31.911 "data_size": 63488 00:37:31.911 }, 00:37:31.911 { 00:37:31.911 "name": null, 00:37:31.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:31.911 "is_configured": false, 00:37:31.911 "data_offset": 2048, 00:37:31.911 "data_size": 63488 00:37:31.911 }, 00:37:31.911 { 00:37:31.911 "name": "BaseBdev3", 00:37:31.911 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:31.911 "is_configured": true, 00:37:31.911 "data_offset": 2048, 00:37:31.911 "data_size": 63488 00:37:31.911 }, 00:37:31.911 { 00:37:31.911 "name": "BaseBdev4", 00:37:31.911 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:31.911 "is_configured": true, 00:37:31.911 "data_offset": 2048, 00:37:31.911 "data_size": 63488 00:37:31.911 } 00:37:31.911 ] 00:37:31.911 }' 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:31.911 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:32.169 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:32.169 "name": "raid_bdev1", 00:37:32.169 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:32.169 "strip_size_kb": 0, 00:37:32.169 "state": "online", 00:37:32.169 "raid_level": "raid1", 00:37:32.169 "superblock": true, 00:37:32.169 "num_base_bdevs": 4, 00:37:32.169 "num_base_bdevs_discovered": 3, 00:37:32.169 "num_base_bdevs_operational": 3, 00:37:32.169 "base_bdevs_list": [ 00:37:32.169 { 00:37:32.169 "name": "spare", 00:37:32.169 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:32.169 "is_configured": true, 00:37:32.169 "data_offset": 2048, 00:37:32.169 "data_size": 63488 00:37:32.169 }, 00:37:32.169 { 00:37:32.169 "name": null, 00:37:32.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:32.169 "is_configured": false, 00:37:32.169 "data_offset": 2048, 00:37:32.169 "data_size": 63488 00:37:32.170 }, 00:37:32.170 { 00:37:32.170 "name": "BaseBdev3", 00:37:32.170 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:32.170 "is_configured": true, 00:37:32.170 "data_offset": 2048, 00:37:32.170 "data_size": 63488 00:37:32.170 }, 00:37:32.170 { 00:37:32.170 "name": "BaseBdev4", 00:37:32.170 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:32.170 "is_configured": true, 00:37:32.170 "data_offset": 2048, 00:37:32.170 
"data_size": 63488 00:37:32.170 } 00:37:32.170 ] 00:37:32.170 }' 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:32.170 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:32.427 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:32.427 "name": "raid_bdev1", 00:37:32.427 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:32.427 "strip_size_kb": 0, 00:37:32.427 "state": "online", 00:37:32.427 "raid_level": "raid1", 00:37:32.427 "superblock": true, 00:37:32.427 "num_base_bdevs": 4, 00:37:32.427 "num_base_bdevs_discovered": 3, 00:37:32.427 "num_base_bdevs_operational": 3, 00:37:32.427 "base_bdevs_list": [ 00:37:32.427 { 00:37:32.427 "name": "spare", 00:37:32.427 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:32.427 "is_configured": true, 00:37:32.427 "data_offset": 2048, 00:37:32.427 "data_size": 63488 00:37:32.427 }, 00:37:32.427 { 00:37:32.427 "name": null, 00:37:32.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:32.427 "is_configured": false, 00:37:32.427 "data_offset": 2048, 00:37:32.427 "data_size": 63488 00:37:32.427 }, 00:37:32.427 { 00:37:32.427 "name": "BaseBdev3", 00:37:32.427 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:32.427 "is_configured": true, 00:37:32.428 "data_offset": 2048, 00:37:32.428 "data_size": 63488 00:37:32.428 }, 00:37:32.428 { 00:37:32.428 "name": "BaseBdev4", 00:37:32.428 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:32.428 "is_configured": true, 00:37:32.428 "data_offset": 2048, 00:37:32.428 "data_size": 63488 00:37:32.428 } 00:37:32.428 ] 00:37:32.428 }' 00:37:32.428 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:32.428 12:39:55 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@10 -- # set +x 00:37:32.993 12:39:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:37:33.308 [2024-06-07 12:39:56.826272] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:37:33.308 [2024-06-07 12:39:56.826552] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:37:33.308
00:37:33.308 Latency(us)
00:37:33.308 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:37:33.308 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:37:33.308 raid_bdev1 : 12.07 127.05 381.15 0.00 0.00 11369.86 446.66 113346.07
00:37:33.308 ===================================================================================================================
00:37:33.308 Total : 127.05 381.15 0.00 0.00 11369.86 446.66 113346.07
00:37:33.308 [2024-06-07 12:39:56.894466] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:33.308 [2024-06-07 12:39:56.894717] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:37:33.308 [2024-06-07 12:39:56.894861] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:37:33.308 0 00:37:33.308 [2024-06-07 12:39:56.895041] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:37:33.308 12:39:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:33.308 12:39:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:33.566 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:37:33.824 /dev/nbd0 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:37:34.083 12:39:57 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:34.083 1+0 records in 00:37:34.083 1+0 records out 00:37:34.083 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000628639 s, 6.5 MB/s 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:34.083 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:34.083 12:39:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:37:34.352 /dev/nbd1 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:34.352 1+0 records in 00:37:34.352 1+0 records out 00:37:34.352 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000862371 s, 4.7 MB/s 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:34.352 12:39:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:34.920 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:37:34.921 /dev/nbd1 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:34.921 1+0 records in 00:37:34.921 1+0 records out 00:37:34.921 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000489487 s, 8.4 MB/s 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:37:34.921 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:35.179 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:35.437 12:39:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:35.696 12:39:59 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:37:35.696 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:37:35.955 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:36.214 [2024-06-07 12:39:59.817048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:36.214 [2024-06-07 12:39:59.817448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:36.214 [2024-06-07 12:39:59.817542] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000a880 00:37:36.214 [2024-06-07 12:39:59.817694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:36.214 [2024-06-07 12:39:59.820438] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:36.214 [2024-06-07 12:39:59.820698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:36.214 [2024-06-07 12:39:59.820968] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:37:36.214 [2024-06-07 12:39:59.821144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:36.214 [2024-06-07 12:39:59.821527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:37:36.214 [2024-06-07 12:39:59.821773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:37:36.214 spare 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:36.214 12:39:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:36.473 [2024-06-07 12:39:59.921984] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x61600000ae80 00:37:36.473 [2024-06-07 12:39:59.922326] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:37:36.473 [2024-06-07 12:39:59.922590] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000033c90 00:37:36.473 [2024-06-07 12:39:59.923123] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x61600000ae80 00:37:36.473 [2024-06-07 12:39:59.923260] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x61600000ae80 00:37:36.473 [2024-06-07 12:39:59.923495] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:36.473 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:36.473 "name": "raid_bdev1", 00:37:36.473 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:36.473 "strip_size_kb": 0, 00:37:36.473 "state": "online", 00:37:36.473 "raid_level": "raid1", 00:37:36.473 "superblock": true, 00:37:36.473 "num_base_bdevs": 4, 00:37:36.473 "num_base_bdevs_discovered": 3, 00:37:36.473 "num_base_bdevs_operational": 3, 00:37:36.473 "base_bdevs_list": [ 00:37:36.473 { 00:37:36.473 "name": "spare", 00:37:36.473 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:36.473 "is_configured": true, 00:37:36.473 "data_offset": 2048, 00:37:36.473 "data_size": 63488 00:37:36.473 }, 00:37:36.473 { 00:37:36.473 "name": null, 00:37:36.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:36.473 "is_configured": false, 00:37:36.474 "data_offset": 2048, 00:37:36.474 "data_size": 63488 00:37:36.474 }, 00:37:36.474 { 00:37:36.474 "name": "BaseBdev3", 00:37:36.474 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:36.474 "is_configured": true, 00:37:36.474 "data_offset": 2048, 00:37:36.474 "data_size": 63488 00:37:36.474 }, 00:37:36.474 { 00:37:36.474 "name": "BaseBdev4", 00:37:36.474 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:36.474 "is_configured": true, 00:37:36.474 "data_offset": 2048, 00:37:36.474 "data_size": 63488 00:37:36.474 } 00:37:36.474 ] 00:37:36.474 }' 00:37:36.474 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:36.474 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:37.406 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:37.406 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:37.406 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:37.406 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:37.406 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:37.406 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:37.406 12:40:00 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:37.406 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:37.406 "name": "raid_bdev1", 00:37:37.406 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:37.406 "strip_size_kb": 0, 00:37:37.406 "state": "online", 00:37:37.406 "raid_level": "raid1", 00:37:37.406 "superblock": true, 00:37:37.406 "num_base_bdevs": 4, 00:37:37.406 "num_base_bdevs_discovered": 3, 00:37:37.406 "num_base_bdevs_operational": 3, 00:37:37.406 "base_bdevs_list": [ 00:37:37.406 { 00:37:37.406 "name": "spare", 00:37:37.406 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:37.406 "is_configured": true, 00:37:37.406 "data_offset": 2048, 00:37:37.406 "data_size": 63488 00:37:37.406 }, 00:37:37.406 { 00:37:37.406 "name": null, 00:37:37.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:37.406 "is_configured": false, 00:37:37.406 "data_offset": 2048, 00:37:37.406 "data_size": 63488 00:37:37.406 }, 00:37:37.406 { 00:37:37.406 "name": "BaseBdev3", 00:37:37.406 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:37.406 "is_configured": true, 00:37:37.406 "data_offset": 2048, 00:37:37.406 "data_size": 63488 00:37:37.406 }, 00:37:37.406 { 00:37:37.406 "name": "BaseBdev4", 00:37:37.406 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:37.406 "is_configured": true, 00:37:37.406 "data_offset": 2048, 00:37:37.406 "data_size": 63488 00:37:37.406 } 00:37:37.406 ] 00:37:37.406 }' 00:37:37.406 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:37.665 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:37.665 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:37.665 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:37.665 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:37:37.665 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:37.924 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:37:37.924 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:37:38.183 [2024-06-07 12:40:01.657484] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- 
# local num_base_bdevs 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:38.183 12:40:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:38.441 12:40:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:38.441 "name": "raid_bdev1", 00:37:38.441 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:38.441 "strip_size_kb": 0, 00:37:38.441 "state": "online", 00:37:38.441 "raid_level": "raid1", 00:37:38.441 "superblock": true, 00:37:38.441 "num_base_bdevs": 4, 00:37:38.441 "num_base_bdevs_discovered": 2, 00:37:38.441 "num_base_bdevs_operational": 2, 00:37:38.441 "base_bdevs_list": [ 00:37:38.441 { 00:37:38.441 "name": null, 00:37:38.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:38.441 "is_configured": false, 00:37:38.441 "data_offset": 2048, 00:37:38.441 "data_size": 63488 00:37:38.441 }, 00:37:38.441 { 00:37:38.441 "name": null, 00:37:38.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:38.441 "is_configured": false, 00:37:38.441 "data_offset": 2048, 00:37:38.441 "data_size": 63488 00:37:38.441 }, 00:37:38.441 { 00:37:38.441 "name": "BaseBdev3", 00:37:38.441 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:38.441 "is_configured": true, 00:37:38.441 "data_offset": 2048, 00:37:38.441 "data_size": 63488 00:37:38.441 }, 00:37:38.441 { 00:37:38.441 "name": "BaseBdev4", 00:37:38.441 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:38.441 "is_configured": true, 00:37:38.441 "data_offset": 2048, 00:37:38.441 "data_size": 63488 00:37:38.441 } 00:37:38.441 ] 00:37:38.441 }' 00:37:38.441 12:40:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:38.441 12:40:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:39.008 12:40:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:37:39.266 [2024-06-07 12:40:02.845819] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:39.266 [2024-06-07 12:40:02.846328] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:37:39.266 [2024-06-07 12:40:02.846468] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:37:39.266 [2024-06-07 12:40:02.846598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:39.266 [2024-06-07 12:40:02.853724] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000033e30 00:37:39.266 [2024-06-07 12:40:02.856316] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:39.266 12:40:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:37:40.640 12:40:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:40.640 12:40:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:40.640 12:40:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:40.640 12:40:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:40.640 12:40:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:40.640 12:40:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:40.640 12:40:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:40.640 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:40.640 "name": "raid_bdev1", 00:37:40.640 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:40.640 "strip_size_kb": 0, 00:37:40.640 "state": "online", 00:37:40.640 "raid_level": "raid1", 00:37:40.640 "superblock": true, 00:37:40.640 "num_base_bdevs": 4, 00:37:40.640 "num_base_bdevs_discovered": 3, 00:37:40.640 "num_base_bdevs_operational": 3, 00:37:40.640 "process": { 00:37:40.640 "type": "rebuild", 00:37:40.640 "target": "spare", 00:37:40.640 "progress": { 00:37:40.640 "blocks": 24576, 00:37:40.640 "percent": 38 00:37:40.640 } 00:37:40.640 }, 00:37:40.640 "base_bdevs_list": [ 00:37:40.640 { 00:37:40.640 "name": "spare", 00:37:40.640 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:40.640 "is_configured": true, 00:37:40.640 "data_offset": 2048, 00:37:40.640 "data_size": 63488 00:37:40.640 }, 00:37:40.640 { 00:37:40.640 "name": null, 00:37:40.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:40.640 "is_configured": false, 00:37:40.640 "data_offset": 2048, 00:37:40.640 "data_size": 63488 00:37:40.640 }, 00:37:40.640 { 00:37:40.640 "name": "BaseBdev3", 00:37:40.640 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:40.640 "is_configured": true, 00:37:40.640 "data_offset": 2048, 00:37:40.640 "data_size": 63488 00:37:40.640 }, 00:37:40.640 { 00:37:40.640 "name": "BaseBdev4", 00:37:40.640 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:40.640 "is_configured": true, 00:37:40.641 "data_offset": 2048, 00:37:40.641 "data_size": 63488 00:37:40.641 } 00:37:40.641 ] 00:37:40.641 }' 00:37:40.641 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:40.641 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:40.641 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:40.641 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:40.641 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:37:40.898 [2024-06-07 12:40:04.478966] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:41.156 [2024-06-07 12:40:04.569684] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:41.156 [2024-06-07 12:40:04.570088] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:41.156 [2024-06-07 12:40:04.570164] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:41.156 [2024-06-07 12:40:04.570280] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:41.156 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:41.157 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:41.414 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:41.414 "name": "raid_bdev1", 00:37:41.414 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:41.415 "strip_size_kb": 0, 00:37:41.415 "state": "online", 00:37:41.415 "raid_level": "raid1", 00:37:41.415 "superblock": true, 00:37:41.415 "num_base_bdevs": 4, 00:37:41.415 "num_base_bdevs_discovered": 2, 00:37:41.415 "num_base_bdevs_operational": 2, 00:37:41.415 "base_bdevs_list": [ 00:37:41.415 { 00:37:41.415 "name": null, 00:37:41.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:41.415 "is_configured": false, 00:37:41.415 "data_offset": 2048, 00:37:41.415 "data_size": 63488 00:37:41.415 }, 00:37:41.415 { 00:37:41.415 "name": null, 00:37:41.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:41.415 "is_configured": false, 00:37:41.415 "data_offset": 2048, 00:37:41.415 "data_size": 63488 00:37:41.415 }, 00:37:41.415 { 00:37:41.415 "name": "BaseBdev3", 00:37:41.415 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:41.415 "is_configured": true, 00:37:41.415 "data_offset": 2048, 00:37:41.415 "data_size": 63488 00:37:41.415 }, 00:37:41.415 { 00:37:41.415 "name": "BaseBdev4", 00:37:41.415 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:41.415 "is_configured": true, 00:37:41.415 "data_offset": 2048, 00:37:41.415 "data_size": 63488 
00:37:41.415 } 00:37:41.415 ] 00:37:41.415 }' 00:37:41.415 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:41.415 12:40:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:41.980 12:40:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:37:42.239 [2024-06-07 12:40:05.814465] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:37:42.239 [2024-06-07 12:40:05.814795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:42.239 [2024-06-07 12:40:05.814901] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000b480 00:37:42.239 [2024-06-07 12:40:05.815023] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:42.239 [2024-06-07 12:40:05.815500] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:42.239 [2024-06-07 12:40:05.815698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:37:42.239 [2024-06-07 12:40:05.815903] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:37:42.239 [2024-06-07 12:40:05.815998] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:37:42.239 [2024-06-07 12:40:05.816077] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:37:42.239 [2024-06-07 12:40:05.816204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:37:42.239 [2024-06-07 12:40:05.823030] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000034170 00:37:42.239 spare 00:37:42.239 [2024-06-07 12:40:05.825333] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:37:42.239 12:40:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:37:43.217 12:40:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:37:43.217 12:40:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:43.217 12:40:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:37:43.217 12:40:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:37:43.217 12:40:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:43.217 12:40:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:43.217 12:40:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:43.784 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:43.784 "name": "raid_bdev1", 00:37:43.784 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:43.784 "strip_size_kb": 0, 00:37:43.784 "state": "online", 00:37:43.784 "raid_level": "raid1", 00:37:43.784 "superblock": true, 00:37:43.784 "num_base_bdevs": 4, 00:37:43.784 "num_base_bdevs_discovered": 3, 00:37:43.784 "num_base_bdevs_operational": 3, 00:37:43.784 "process": { 00:37:43.784 "type": "rebuild", 00:37:43.784 "target": 
"spare", 00:37:43.784 "progress": { 00:37:43.784 "blocks": 26624, 00:37:43.784 "percent": 41 00:37:43.784 } 00:37:43.784 }, 00:37:43.784 "base_bdevs_list": [ 00:37:43.784 { 00:37:43.784 "name": "spare", 00:37:43.784 "uuid": "0e24d886-8a11-58a0-bceb-35eab8d848cb", 00:37:43.784 "is_configured": true, 00:37:43.784 "data_offset": 2048, 00:37:43.784 "data_size": 63488 00:37:43.784 }, 00:37:43.784 { 00:37:43.784 "name": null, 00:37:43.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:43.784 "is_configured": false, 00:37:43.784 "data_offset": 2048, 00:37:43.784 "data_size": 63488 00:37:43.784 }, 00:37:43.784 { 00:37:43.784 "name": "BaseBdev3", 00:37:43.784 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:43.784 "is_configured": true, 00:37:43.784 "data_offset": 2048, 00:37:43.784 "data_size": 63488 00:37:43.784 }, 00:37:43.784 { 00:37:43.784 "name": "BaseBdev4", 00:37:43.784 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:43.784 "is_configured": true, 00:37:43.784 "data_offset": 2048, 00:37:43.784 "data_size": 63488 00:37:43.784 } 00:37:43.784 ] 00:37:43.784 }' 00:37:43.784 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:43.784 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:37:43.784 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:43.784 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:37:43.784 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:37:44.042 [2024-06-07 12:40:07.519685] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:44.042 [2024-06-07 12:40:07.537538] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:37:44.042 [2024-06-07 12:40:07.537849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:44.042 [2024-06-07 12:40:07.537998] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:37:44.042 [2024-06-07 12:40:07.538086] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:44.042 12:40:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:44.042 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:44.608 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:44.608 "name": "raid_bdev1", 00:37:44.608 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:44.608 "strip_size_kb": 0, 00:37:44.608 "state": "online", 00:37:44.608 "raid_level": "raid1", 00:37:44.608 "superblock": true, 00:37:44.608 "num_base_bdevs": 4, 00:37:44.608 "num_base_bdevs_discovered": 2, 00:37:44.608 "num_base_bdevs_operational": 2, 00:37:44.608 "base_bdevs_list": [ 00:37:44.608 { 00:37:44.608 "name": null, 00:37:44.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:44.608 "is_configured": false, 00:37:44.608 "data_offset": 2048, 00:37:44.608 "data_size": 63488 00:37:44.608 }, 00:37:44.608 { 00:37:44.608 "name": null, 00:37:44.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:44.608 "is_configured": false, 00:37:44.608 "data_offset": 2048, 00:37:44.608 "data_size": 63488 00:37:44.608 }, 00:37:44.608 { 00:37:44.608 "name": "BaseBdev3", 00:37:44.608 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:44.608 "is_configured": true, 00:37:44.608 "data_offset": 2048, 00:37:44.608 "data_size": 63488 00:37:44.608 }, 00:37:44.608 { 00:37:44.608 "name": "BaseBdev4", 00:37:44.608 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:44.608 "is_configured": true, 00:37:44.608 "data_offset": 2048, 00:37:44.608 "data_size": 63488 00:37:44.608 } 00:37:44.608 ] 00:37:44.608 }' 00:37:44.608 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:44.608 12:40:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:45.174 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:45.174 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:45.174 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:45.174 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:45.174 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:45.174 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:45.174 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:45.432 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:45.432 "name": "raid_bdev1", 00:37:45.432 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:45.432 "strip_size_kb": 0, 00:37:45.432 "state": "online", 00:37:45.432 "raid_level": "raid1", 00:37:45.432 "superblock": true, 00:37:45.432 "num_base_bdevs": 4, 00:37:45.432 "num_base_bdevs_discovered": 2, 00:37:45.432 "num_base_bdevs_operational": 2, 00:37:45.432 "base_bdevs_list": [ 00:37:45.432 { 00:37:45.432 "name": null, 00:37:45.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:45.432 "is_configured": false, 00:37:45.432 "data_offset": 2048, 00:37:45.432 "data_size": 63488 00:37:45.432 }, 00:37:45.432 { 00:37:45.432 "name": null, 
00:37:45.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:45.432 "is_configured": false, 00:37:45.432 "data_offset": 2048, 00:37:45.432 "data_size": 63488 00:37:45.432 }, 00:37:45.432 { 00:37:45.432 "name": "BaseBdev3", 00:37:45.432 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:45.432 "is_configured": true, 00:37:45.432 "data_offset": 2048, 00:37:45.432 "data_size": 63488 00:37:45.432 }, 00:37:45.432 { 00:37:45.432 "name": "BaseBdev4", 00:37:45.432 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:45.432 "is_configured": true, 00:37:45.432 "data_offset": 2048, 00:37:45.432 "data_size": 63488 00:37:45.432 } 00:37:45.432 ] 00:37:45.432 }' 00:37:45.432 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:45.432 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:45.432 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:45.433 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:45.433 12:40:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:37:45.692 12:40:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:37:45.950 [2024-06-07 12:40:09.366700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:37:45.950 [2024-06-07 12:40:09.367120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:37:45.950 [2024-06-07 12:40:09.367246] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61600000ba80 00:37:45.950 [2024-06-07 12:40:09.367467] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:37:45.950 [2024-06-07 12:40:09.367951] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:37:45.950 [2024-06-07 12:40:09.368102] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:37:45.950 [2024-06-07 12:40:09.368329] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:37:45.950 [2024-06-07 12:40:09.368441] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:37:45.950 [2024-06-07 12:40:09.368523] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:37:45.950 BaseBdev1 00:37:45.950 12:40:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:46.886 
12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:46.886 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:47.144 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:47.144 "name": "raid_bdev1", 00:37:47.144 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:47.144 "strip_size_kb": 0, 00:37:47.144 "state": "online", 00:37:47.144 "raid_level": "raid1", 00:37:47.144 "superblock": true, 00:37:47.144 "num_base_bdevs": 4, 00:37:47.145 "num_base_bdevs_discovered": 2, 00:37:47.145 "num_base_bdevs_operational": 2, 00:37:47.145 "base_bdevs_list": [ 00:37:47.145 { 00:37:47.145 "name": null, 00:37:47.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:47.145 "is_configured": false, 00:37:47.145 "data_offset": 2048, 00:37:47.145 "data_size": 63488 00:37:47.145 }, 00:37:47.145 { 00:37:47.145 "name": null, 00:37:47.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:47.145 "is_configured": false, 00:37:47.145 "data_offset": 2048, 00:37:47.145 "data_size": 63488 00:37:47.145 }, 00:37:47.145 { 00:37:47.145 "name": "BaseBdev3", 00:37:47.145 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:47.145 "is_configured": true, 00:37:47.145 "data_offset": 2048, 00:37:47.145 "data_size": 63488 00:37:47.145 }, 00:37:47.145 { 00:37:47.145 "name": "BaseBdev4", 00:37:47.145 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:47.145 "is_configured": true, 00:37:47.145 "data_offset": 2048, 00:37:47.145 "data_size": 63488 00:37:47.145 } 00:37:47.145 ] 00:37:47.145 }' 00:37:47.145 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:47.145 12:40:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:47.714 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:47.714 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:47.714 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:47.714 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:47.714 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:47.714 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:47.714 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:47.973 "name": "raid_bdev1", 00:37:47.973 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:47.973 "strip_size_kb": 0, 00:37:47.973 "state": "online", 00:37:47.973 "raid_level": "raid1", 00:37:47.973 
"superblock": true, 00:37:47.973 "num_base_bdevs": 4, 00:37:47.973 "num_base_bdevs_discovered": 2, 00:37:47.973 "num_base_bdevs_operational": 2, 00:37:47.973 "base_bdevs_list": [ 00:37:47.973 { 00:37:47.973 "name": null, 00:37:47.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:47.973 "is_configured": false, 00:37:47.973 "data_offset": 2048, 00:37:47.973 "data_size": 63488 00:37:47.973 }, 00:37:47.973 { 00:37:47.973 "name": null, 00:37:47.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:47.973 "is_configured": false, 00:37:47.973 "data_offset": 2048, 00:37:47.973 "data_size": 63488 00:37:47.973 }, 00:37:47.973 { 00:37:47.973 "name": "BaseBdev3", 00:37:47.973 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:47.973 "is_configured": true, 00:37:47.973 "data_offset": 2048, 00:37:47.973 "data_size": 63488 00:37:47.973 }, 00:37:47.973 { 00:37:47.973 "name": "BaseBdev4", 00:37:47.973 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:47.973 "is_configured": true, 00:37:47.973 "data_offset": 2048, 00:37:47.973 "data_size": 63488 00:37:47.973 } 00:37:47.973 ] 00:37:47.973 }' 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:37:47.973 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:37:48.231 [2024-06-07 12:40:11.823226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:48.231 
[2024-06-07 12:40:11.823695] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6)
00:37:48.231 [2024-06-07 12:40:11.823804] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid
00:37:48.231 request:
00:37:48.231 {
00:37:48.231 "raid_bdev": "raid_bdev1",
00:37:48.231 "base_bdev": "BaseBdev1",
00:37:48.231 "method": "bdev_raid_add_base_bdev",
00:37:48.231 "req_id": 1
00:37:48.231 }
00:37:48.231 Got JSON-RPC error response
00:37:48.231 response:
00:37:48.231 {
00:37:48.231 "code": -22,
00:37:48.231 "message": "Failed to add base bdev to RAID bdev: Invalid argument"
00:37:48.231 }
00:37:48.231 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1
00:37:48.231 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 ))
00:37:48.231 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]]
00:37:48.231 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 ))
00:37:48.231 12:40:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:37:49.606 12:40:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:37:49.606 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:37:49.606 "name": "raid_bdev1",
00:37:49.606 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2",
00:37:49.606 "strip_size_kb": 0,
00:37:49.606 "state": "online",
00:37:49.606 "raid_level": "raid1",
00:37:49.606 "superblock": true,
00:37:49.606 "num_base_bdevs": 4,
00:37:49.606 "num_base_bdevs_discovered": 2,
00:37:49.606 "num_base_bdevs_operational": 2,
00:37:49.606 "base_bdevs_list": [
00:37:49.606 {
00:37:49.606 "name": null,
00:37:49.606 "uuid": "00000000-0000-0000-0000-000000000000",
00:37:49.606 "is_configured": false,
00:37:49.606 "data_offset": 2048,
00:37:49.606 "data_size": 63488
00:37:49.606 },
00:37:49.606 {
00:37:49.606 "name": null,
00:37:49.606 "uuid": "00000000-0000-0000-0000-000000000000",
00:37:49.606 "is_configured": false,
00:37:49.606 
"data_offset": 2048, 00:37:49.606 "data_size": 63488 00:37:49.606 }, 00:37:49.606 { 00:37:49.606 "name": "BaseBdev3", 00:37:49.606 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:49.606 "is_configured": true, 00:37:49.606 "data_offset": 2048, 00:37:49.606 "data_size": 63488 00:37:49.606 }, 00:37:49.606 { 00:37:49.606 "name": "BaseBdev4", 00:37:49.606 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:49.606 "is_configured": true, 00:37:49.606 "data_offset": 2048, 00:37:49.606 "data_size": 63488 00:37:49.606 } 00:37:49.606 ] 00:37:49.606 }' 00:37:49.606 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:49.606 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:37:50.172 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:37:50.172 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:37:50.172 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:37:50.172 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:37:50.172 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:37:50.172 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:50.172 12:40:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:37:50.429 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:37:50.429 "name": "raid_bdev1", 00:37:50.429 "uuid": "2ec40866-3bb7-4b66-867b-9433d04effb2", 00:37:50.430 "strip_size_kb": 0, 00:37:50.430 "state": "online", 00:37:50.430 "raid_level": "raid1", 00:37:50.430 "superblock": true, 00:37:50.430 "num_base_bdevs": 4, 00:37:50.430 "num_base_bdevs_discovered": 2, 00:37:50.430 "num_base_bdevs_operational": 2, 00:37:50.430 "base_bdevs_list": [ 00:37:50.430 { 00:37:50.430 "name": null, 00:37:50.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:50.430 "is_configured": false, 00:37:50.430 "data_offset": 2048, 00:37:50.430 "data_size": 63488 00:37:50.430 }, 00:37:50.430 { 00:37:50.430 "name": null, 00:37:50.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:50.430 "is_configured": false, 00:37:50.430 "data_offset": 2048, 00:37:50.430 "data_size": 63488 00:37:50.430 }, 00:37:50.430 { 00:37:50.430 "name": "BaseBdev3", 00:37:50.430 "uuid": "bc5ce7b1-4622-5677-9124-13d71bb799d0", 00:37:50.430 "is_configured": true, 00:37:50.430 "data_offset": 2048, 00:37:50.430 "data_size": 63488 00:37:50.430 }, 00:37:50.430 { 00:37:50.430 "name": "BaseBdev4", 00:37:50.430 "uuid": "4966e933-5be9-591e-9453-143c07435848", 00:37:50.430 "is_configured": true, 00:37:50.430 "data_offset": 2048, 00:37:50.430 "data_size": 63488 00:37:50.430 } 00:37:50.430 ] 00:37:50.430 }' 00:37:50.430 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:37:50.430 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:37:50.430 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:37:50.687 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:37:50.688 12:40:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 225918
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 225918 ']'
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 225918
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 225918
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 225918'
00:37:50.688 killing process with pid 225918
00:37:50.688 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 225918
00:37:50.688 Received shutdown signal, test time was about 29.327777 seconds
00:37:50.688 
00:37:50.688 Latency(us)
00:37:50.688 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:37:50.688 ===================================================================================================================
00:37:50.688 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 225918
00:37:50.688 [2024-06-07 12:40:14.144385] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:37:50.688 [2024-06-07 12:40:14.144626] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:37:50.688 [2024-06-07 12:40:14.144816] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:37:50.688 [2024-06-07 12:40:14.144914] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x61600000ae80 name raid_bdev1, state offline
00:37:50.688 [2024-06-07 12:40:14.236667] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:37:51.264 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0
00:37:51.264 
00:37:51.264 real 0m35.268s
00:37:51.264 user 0m56.660s
00:37:51.264 sys 0m5.260s
00:37:51.264 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable
00:37:51.264 12:40:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:37:51.264 ************************************
00:37:51.264 END TEST raid_rebuild_test_sb_io
00:37:51.264 ************************************
00:37:51.264 12:40:14 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']'
00:37:51.264 12:40:14 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096
00:37:51.264 12:40:14 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true
00:37:51.264 12:40:14 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']'
00:37:51.264 12:40:14 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable
00:37:51.264 12:40:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:37:51.264 ************************************
00:37:51.264 START TEST raid_state_function_test_sb_4k
00:37:51.264 ************************************
00:37:51.264 
12:40:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=226822 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 226822' 00:37:51.264 Process raid pid: 226822 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 226822 /var/tmp/spdk-raid.sock 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 226822 ']' 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:37:51.264 12:40:14 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:37:51.264 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:37:51.264 12:40:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:37:51.264 [2024-06-07 12:40:14.742548] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:37:51.264 [2024-06-07 12:40:14.743076] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:37:51.264 [2024-06-07 12:40:14.886576] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:51.523 [2024-06-07 12:40:14.980208] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:37:51.523 [2024-06-07 12:40:15.063050] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:37:51.523 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:37:51.523 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:37:51.523 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:37:51.780 [2024-06-07 12:40:15.410530] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:37:51.780 [2024-06-07 12:40:15.410910] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:37:51.780 [2024-06-07 12:40:15.411011] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:37:51.780 [2024-06-07 12:40:15.411139] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:52.040 
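At its core, verify_raid_bdev_state boils down to one RPC query plus a jq filter over the result. A hand-run equivalent of the check being set up here, assuming the same rpc.py script and socket path used throughout this test, looks like:

rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# the raid was created with an on-disk superblock (-s) before either
# base bdev existed, so the module should hold it in "configuring"
$rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# pull the state the helper asserts on
$rpc -s $sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state'
# expected output at this point in the test: configuring

The same get_bdevs/jq pattern recurs after every mutation below, with only the expected state, raid level, and base-bdev counters changing.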
12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:52.040 "name": "Existed_Raid", 00:37:52.040 "uuid": "901b83d4-0f92-4cb3-9d32-2a9d7bb92591", 00:37:52.040 "strip_size_kb": 0, 00:37:52.040 "state": "configuring", 00:37:52.040 "raid_level": "raid1", 00:37:52.040 "superblock": true, 00:37:52.040 "num_base_bdevs": 2, 00:37:52.040 "num_base_bdevs_discovered": 0, 00:37:52.040 "num_base_bdevs_operational": 2, 00:37:52.040 "base_bdevs_list": [ 00:37:52.040 { 00:37:52.040 "name": "BaseBdev1", 00:37:52.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:52.040 "is_configured": false, 00:37:52.040 "data_offset": 0, 00:37:52.040 "data_size": 0 00:37:52.040 }, 00:37:52.040 { 00:37:52.040 "name": "BaseBdev2", 00:37:52.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:52.040 "is_configured": false, 00:37:52.040 "data_offset": 0, 00:37:52.040 "data_size": 0 00:37:52.040 } 00:37:52.040 ] 00:37:52.040 }' 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:52.040 12:40:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:37:52.606 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:37:52.863 [2024-06-07 12:40:16.458531] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:37:52.863 [2024-06-07 12:40:16.458876] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:37:52.863 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:37:53.120 [2024-06-07 12:40:16.730643] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:37:53.120 [2024-06-07 12:40:16.730960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:37:53.120 [2024-06-07 12:40:16.731080] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:37:53.120 [2024-06-07 12:40:16.731146] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:37:53.120 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:37:53.377 [2024-06-07 12:40:16.962670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:53.377 BaseBdev1 00:37:53.377 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:37:53.377 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:37:53.377 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:37:53.377 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # 
local i 00:37:53.377 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:37:53.377 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:37:53.377 12:40:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:37:53.635 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:37:53.892 [ 00:37:53.892 { 00:37:53.892 "name": "BaseBdev1", 00:37:53.892 "aliases": [ 00:37:53.892 "348768dd-fbc0-40f8-a183-eba2f54cc72b" 00:37:53.892 ], 00:37:53.892 "product_name": "Malloc disk", 00:37:53.892 "block_size": 4096, 00:37:53.892 "num_blocks": 8192, 00:37:53.892 "uuid": "348768dd-fbc0-40f8-a183-eba2f54cc72b", 00:37:53.892 "assigned_rate_limits": { 00:37:53.892 "rw_ios_per_sec": 0, 00:37:53.892 "rw_mbytes_per_sec": 0, 00:37:53.892 "r_mbytes_per_sec": 0, 00:37:53.892 "w_mbytes_per_sec": 0 00:37:53.892 }, 00:37:53.892 "claimed": true, 00:37:53.892 "claim_type": "exclusive_write", 00:37:53.892 "zoned": false, 00:37:53.892 "supported_io_types": { 00:37:53.892 "read": true, 00:37:53.892 "write": true, 00:37:53.892 "unmap": true, 00:37:53.892 "write_zeroes": true, 00:37:53.892 "flush": true, 00:37:53.892 "reset": true, 00:37:53.892 "compare": false, 00:37:53.892 "compare_and_write": false, 00:37:53.892 "abort": true, 00:37:53.892 "nvme_admin": false, 00:37:53.892 "nvme_io": false 00:37:53.892 }, 00:37:53.892 "memory_domains": [ 00:37:53.892 { 00:37:53.892 "dma_device_id": "system", 00:37:53.892 "dma_device_type": 1 00:37:53.892 }, 00:37:53.892 { 00:37:53.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:37:53.892 "dma_device_type": 2 00:37:53.892 } 00:37:53.892 ], 00:37:53.892 "driver_specific": {} 00:37:53.892 } 00:37:53.892 ] 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:53.892 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:53.892 12:40:17 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:37:54.150 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:54.150 "name": "Existed_Raid", 00:37:54.150 "uuid": "26f764ed-35a7-4f52-a332-1a9f23822b5c", 00:37:54.150 "strip_size_kb": 0, 00:37:54.150 "state": "configuring", 00:37:54.150 "raid_level": "raid1", 00:37:54.150 "superblock": true, 00:37:54.150 "num_base_bdevs": 2, 00:37:54.150 "num_base_bdevs_discovered": 1, 00:37:54.150 "num_base_bdevs_operational": 2, 00:37:54.150 "base_bdevs_list": [ 00:37:54.150 { 00:37:54.150 "name": "BaseBdev1", 00:37:54.150 "uuid": "348768dd-fbc0-40f8-a183-eba2f54cc72b", 00:37:54.150 "is_configured": true, 00:37:54.150 "data_offset": 256, 00:37:54.150 "data_size": 7936 00:37:54.150 }, 00:37:54.150 { 00:37:54.150 "name": "BaseBdev2", 00:37:54.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:54.150 "is_configured": false, 00:37:54.150 "data_offset": 0, 00:37:54.150 "data_size": 0 00:37:54.150 } 00:37:54.150 ] 00:37:54.150 }' 00:37:54.150 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:54.150 12:40:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:37:54.713 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:37:54.970 [2024-06-07 12:40:18.546916] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:37:54.970 [2024-06-07 12:40:18.547128] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:37:54.970 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:37:55.228 [2024-06-07 12:40:18.747032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:37:55.228 [2024-06-07 12:40:18.749374] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:37:55.228 [2024-06-07 12:40:18.749608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:55.228 12:40:18 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:55.228 12:40:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:37:55.486 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:55.486 "name": "Existed_Raid", 00:37:55.486 "uuid": "1768a9d3-1cfd-4feb-bba3-a7a2568d22d9", 00:37:55.486 "strip_size_kb": 0, 00:37:55.486 "state": "configuring", 00:37:55.486 "raid_level": "raid1", 00:37:55.486 "superblock": true, 00:37:55.486 "num_base_bdevs": 2, 00:37:55.486 "num_base_bdevs_discovered": 1, 00:37:55.486 "num_base_bdevs_operational": 2, 00:37:55.486 "base_bdevs_list": [ 00:37:55.486 { 00:37:55.486 "name": "BaseBdev1", 00:37:55.486 "uuid": "348768dd-fbc0-40f8-a183-eba2f54cc72b", 00:37:55.486 "is_configured": true, 00:37:55.486 "data_offset": 256, 00:37:55.486 "data_size": 7936 00:37:55.486 }, 00:37:55.486 { 00:37:55.486 "name": "BaseBdev2", 00:37:55.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:37:55.486 "is_configured": false, 00:37:55.486 "data_offset": 0, 00:37:55.486 "data_size": 0 00:37:55.486 } 00:37:55.486 ] 00:37:55.486 }' 00:37:55.486 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:55.486 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:37:56.050 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:37:56.308 [2024-06-07 12:40:19.872772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:37:56.308 [2024-06-07 12:40:19.873291] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:37:56.308 [2024-06-07 12:40:19.873364] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:37:56.308 [2024-06-07 12:40:19.873536] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:37:56.308 [2024-06-07 12:40:19.874105] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:37:56.308 BaseBdev2 00:37:56.308 [2024-06-07 12:40:19.874292] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:37:56.308 [2024-06-07 12:40:19.874543] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:37:56.308 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:37:56.308 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:37:56.308 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:37:56.308 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:37:56.308 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ 
-z '' ]] 00:37:56.308 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:37:56.308 12:40:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:37:56.566 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:37:56.824 [ 00:37:56.824 { 00:37:56.824 "name": "BaseBdev2", 00:37:56.824 "aliases": [ 00:37:56.824 "13b6322a-4775-4bd5-ba01-d9a31c5e5a26" 00:37:56.824 ], 00:37:56.824 "product_name": "Malloc disk", 00:37:56.824 "block_size": 4096, 00:37:56.824 "num_blocks": 8192, 00:37:56.825 "uuid": "13b6322a-4775-4bd5-ba01-d9a31c5e5a26", 00:37:56.825 "assigned_rate_limits": { 00:37:56.825 "rw_ios_per_sec": 0, 00:37:56.825 "rw_mbytes_per_sec": 0, 00:37:56.825 "r_mbytes_per_sec": 0, 00:37:56.825 "w_mbytes_per_sec": 0 00:37:56.825 }, 00:37:56.825 "claimed": true, 00:37:56.825 "claim_type": "exclusive_write", 00:37:56.825 "zoned": false, 00:37:56.825 "supported_io_types": { 00:37:56.825 "read": true, 00:37:56.825 "write": true, 00:37:56.825 "unmap": true, 00:37:56.825 "write_zeroes": true, 00:37:56.825 "flush": true, 00:37:56.825 "reset": true, 00:37:56.825 "compare": false, 00:37:56.825 "compare_and_write": false, 00:37:56.825 "abort": true, 00:37:56.825 "nvme_admin": false, 00:37:56.825 "nvme_io": false 00:37:56.825 }, 00:37:56.825 "memory_domains": [ 00:37:56.825 { 00:37:56.825 "dma_device_id": "system", 00:37:56.825 "dma_device_type": 1 00:37:56.825 }, 00:37:56.825 { 00:37:56.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:37:56.825 "dma_device_type": 2 00:37:56.825 } 00:37:56.825 ], 00:37:56.825 "driver_specific": {} 00:37:56.825 } 00:37:56.825 ] 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:37:56.825 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:37:57.393 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:37:57.393 "name": "Existed_Raid", 00:37:57.393 "uuid": "1768a9d3-1cfd-4feb-bba3-a7a2568d22d9", 00:37:57.393 "strip_size_kb": 0, 00:37:57.393 "state": "online", 00:37:57.393 "raid_level": "raid1", 00:37:57.393 "superblock": true, 00:37:57.393 "num_base_bdevs": 2, 00:37:57.393 "num_base_bdevs_discovered": 2, 00:37:57.393 "num_base_bdevs_operational": 2, 00:37:57.393 "base_bdevs_list": [ 00:37:57.393 { 00:37:57.393 "name": "BaseBdev1", 00:37:57.393 "uuid": "348768dd-fbc0-40f8-a183-eba2f54cc72b", 00:37:57.393 "is_configured": true, 00:37:57.393 "data_offset": 256, 00:37:57.393 "data_size": 7936 00:37:57.393 }, 00:37:57.393 { 00:37:57.393 "name": "BaseBdev2", 00:37:57.393 "uuid": "13b6322a-4775-4bd5-ba01-d9a31c5e5a26", 00:37:57.393 "is_configured": true, 00:37:57.393 "data_offset": 256, 00:37:57.393 "data_size": 7936 00:37:57.393 } 00:37:57.393 ] 00:37:57.393 }' 00:37:57.393 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:37:57.393 12:40:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:37:57.958 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:37:58.216 [2024-06-07 12:40:21.633281] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:37:58.216 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:37:58.216 "name": "Existed_Raid", 00:37:58.216 "aliases": [ 00:37:58.216 "1768a9d3-1cfd-4feb-bba3-a7a2568d22d9" 00:37:58.216 ], 00:37:58.216 "product_name": "Raid Volume", 00:37:58.216 "block_size": 4096, 00:37:58.216 "num_blocks": 7936, 00:37:58.216 "uuid": "1768a9d3-1cfd-4feb-bba3-a7a2568d22d9", 00:37:58.216 "assigned_rate_limits": { 00:37:58.216 "rw_ios_per_sec": 0, 00:37:58.216 "rw_mbytes_per_sec": 0, 00:37:58.216 "r_mbytes_per_sec": 0, 00:37:58.216 "w_mbytes_per_sec": 0 00:37:58.216 }, 00:37:58.216 "claimed": false, 00:37:58.216 "zoned": false, 00:37:58.216 "supported_io_types": { 00:37:58.216 "read": true, 00:37:58.216 "write": true, 00:37:58.216 "unmap": false, 00:37:58.216 "write_zeroes": true, 00:37:58.216 "flush": false, 00:37:58.216 "reset": true, 00:37:58.216 "compare": false, 00:37:58.216 "compare_and_write": false, 00:37:58.216 "abort": false, 
00:37:58.216 "nvme_admin": false, 00:37:58.216 "nvme_io": false 00:37:58.216 }, 00:37:58.216 "memory_domains": [ 00:37:58.216 { 00:37:58.216 "dma_device_id": "system", 00:37:58.216 "dma_device_type": 1 00:37:58.216 }, 00:37:58.216 { 00:37:58.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:37:58.216 "dma_device_type": 2 00:37:58.216 }, 00:37:58.216 { 00:37:58.216 "dma_device_id": "system", 00:37:58.216 "dma_device_type": 1 00:37:58.216 }, 00:37:58.216 { 00:37:58.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:37:58.216 "dma_device_type": 2 00:37:58.216 } 00:37:58.216 ], 00:37:58.216 "driver_specific": { 00:37:58.216 "raid": { 00:37:58.216 "uuid": "1768a9d3-1cfd-4feb-bba3-a7a2568d22d9", 00:37:58.216 "strip_size_kb": 0, 00:37:58.216 "state": "online", 00:37:58.216 "raid_level": "raid1", 00:37:58.216 "superblock": true, 00:37:58.216 "num_base_bdevs": 2, 00:37:58.216 "num_base_bdevs_discovered": 2, 00:37:58.217 "num_base_bdevs_operational": 2, 00:37:58.217 "base_bdevs_list": [ 00:37:58.217 { 00:37:58.217 "name": "BaseBdev1", 00:37:58.217 "uuid": "348768dd-fbc0-40f8-a183-eba2f54cc72b", 00:37:58.217 "is_configured": true, 00:37:58.217 "data_offset": 256, 00:37:58.217 "data_size": 7936 00:37:58.217 }, 00:37:58.217 { 00:37:58.217 "name": "BaseBdev2", 00:37:58.217 "uuid": "13b6322a-4775-4bd5-ba01-d9a31c5e5a26", 00:37:58.217 "is_configured": true, 00:37:58.217 "data_offset": 256, 00:37:58.217 "data_size": 7936 00:37:58.217 } 00:37:58.217 ] 00:37:58.217 } 00:37:58.217 } 00:37:58.217 }' 00:37:58.217 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:37:58.217 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:37:58.217 BaseBdev2' 00:37:58.217 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:37:58.217 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:37:58.217 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:37:58.474 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:37:58.474 "name": "BaseBdev1", 00:37:58.474 "aliases": [ 00:37:58.474 "348768dd-fbc0-40f8-a183-eba2f54cc72b" 00:37:58.474 ], 00:37:58.474 "product_name": "Malloc disk", 00:37:58.474 "block_size": 4096, 00:37:58.474 "num_blocks": 8192, 00:37:58.474 "uuid": "348768dd-fbc0-40f8-a183-eba2f54cc72b", 00:37:58.474 "assigned_rate_limits": { 00:37:58.474 "rw_ios_per_sec": 0, 00:37:58.474 "rw_mbytes_per_sec": 0, 00:37:58.474 "r_mbytes_per_sec": 0, 00:37:58.474 "w_mbytes_per_sec": 0 00:37:58.474 }, 00:37:58.474 "claimed": true, 00:37:58.474 "claim_type": "exclusive_write", 00:37:58.474 "zoned": false, 00:37:58.474 "supported_io_types": { 00:37:58.474 "read": true, 00:37:58.474 "write": true, 00:37:58.474 "unmap": true, 00:37:58.474 "write_zeroes": true, 00:37:58.474 "flush": true, 00:37:58.474 "reset": true, 00:37:58.474 "compare": false, 00:37:58.474 "compare_and_write": false, 00:37:58.474 "abort": true, 00:37:58.474 "nvme_admin": false, 00:37:58.474 "nvme_io": false 00:37:58.474 }, 00:37:58.474 "memory_domains": [ 00:37:58.474 { 00:37:58.474 "dma_device_id": "system", 00:37:58.474 "dma_device_type": 1 00:37:58.474 }, 00:37:58.474 { 00:37:58.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:37:58.474 "dma_device_type": 2 00:37:58.474 } 00:37:58.474 ], 00:37:58.474 "driver_specific": {} 00:37:58.474 }' 00:37:58.474 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:37:58.474 12:40:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:37:58.474 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:37:58.474 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:37:58.474 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:37:58.474 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:37:58.474 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:37:58.732 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:37:58.990 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:37:58.990 "name": "BaseBdev2", 00:37:58.990 "aliases": [ 00:37:58.990 "13b6322a-4775-4bd5-ba01-d9a31c5e5a26" 00:37:58.990 ], 00:37:58.990 "product_name": "Malloc disk", 00:37:58.990 "block_size": 4096, 00:37:58.990 "num_blocks": 8192, 00:37:58.990 "uuid": "13b6322a-4775-4bd5-ba01-d9a31c5e5a26", 00:37:58.990 "assigned_rate_limits": { 00:37:58.990 "rw_ios_per_sec": 0, 00:37:58.990 "rw_mbytes_per_sec": 0, 00:37:58.990 "r_mbytes_per_sec": 0, 00:37:58.990 "w_mbytes_per_sec": 0 00:37:58.990 }, 00:37:58.990 "claimed": true, 00:37:58.990 "claim_type": "exclusive_write", 00:37:58.990 "zoned": false, 00:37:58.990 "supported_io_types": { 00:37:58.990 "read": true, 00:37:58.990 "write": true, 00:37:58.990 "unmap": true, 00:37:58.990 "write_zeroes": true, 00:37:58.990 "flush": true, 00:37:58.990 "reset": true, 00:37:58.990 "compare": false, 00:37:58.990 "compare_and_write": false, 00:37:58.990 "abort": true, 00:37:58.990 "nvme_admin": false, 00:37:58.990 "nvme_io": false 00:37:58.990 }, 00:37:58.990 "memory_domains": [ 00:37:58.990 { 00:37:58.990 "dma_device_id": "system", 00:37:58.990 "dma_device_type": 1 00:37:58.990 }, 00:37:58.990 { 00:37:58.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:37:58.990 "dma_device_type": 2 00:37:58.990 } 00:37:58.990 ], 00:37:58.990 "driver_specific": {} 00:37:58.990 }' 00:37:58.990 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:37:58.990 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:37:59.250 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:37:59.507 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:37:59.507 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:37:59.507 12:40:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:37:59.765 [2024-06-07 12:40:23.197441] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:37:59.765 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:00.022 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:00.022 "name": 
"Existed_Raid", 00:38:00.022 "uuid": "1768a9d3-1cfd-4feb-bba3-a7a2568d22d9", 00:38:00.022 "strip_size_kb": 0, 00:38:00.022 "state": "online", 00:38:00.022 "raid_level": "raid1", 00:38:00.022 "superblock": true, 00:38:00.022 "num_base_bdevs": 2, 00:38:00.022 "num_base_bdevs_discovered": 1, 00:38:00.022 "num_base_bdevs_operational": 1, 00:38:00.022 "base_bdevs_list": [ 00:38:00.022 { 00:38:00.022 "name": null, 00:38:00.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:00.022 "is_configured": false, 00:38:00.022 "data_offset": 256, 00:38:00.022 "data_size": 7936 00:38:00.022 }, 00:38:00.022 { 00:38:00.022 "name": "BaseBdev2", 00:38:00.022 "uuid": "13b6322a-4775-4bd5-ba01-d9a31c5e5a26", 00:38:00.022 "is_configured": true, 00:38:00.022 "data_offset": 256, 00:38:00.022 "data_size": 7936 00:38:00.022 } 00:38:00.022 ] 00:38:00.022 }' 00:38:00.022 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:00.022 12:40:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:00.586 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:38:00.586 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:38:00.586 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:00.586 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:38:00.883 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:38:00.883 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:38:00.883 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:38:01.142 [2024-06-07 12:40:24.554639] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:38:01.142 [2024-06-07 12:40:24.555041] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:01.142 [2024-06-07 12:40:24.569255] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:01.142 [2024-06-07 12:40:24.569580] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:01.142 [2024-06-07 12:40:24.569734] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:38:01.142 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:38:01.142 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:38:01.142 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:01.142 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 
00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 226822 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 226822 ']' 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 226822 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 226822 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 226822' 00:38:01.401 killing process with pid 226822 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # kill 226822 00:38:01.401 [2024-06-07 12:40:24.830689] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:38:01.401 12:40:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@973 -- # wait 226822 00:38:01.401 [2024-06-07 12:40:24.830866] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:38:01.660 12:40:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:38:01.660 00:38:01.660 real 0m10.404s 00:38:01.660 user 0m18.825s 00:38:01.660 sys 0m1.780s 00:38:01.660 12:40:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:38:01.660 12:40:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:01.660 ************************************ 00:38:01.660 END TEST raid_state_function_test_sb_4k 00:38:01.660 ************************************ 00:38:01.660 12:40:25 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:38:01.660 12:40:25 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:38:01.660 12:40:25 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:38:01.660 12:40:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:38:01.660 ************************************ 00:38:01.660 START TEST raid_superblock_test_4k 00:38:01.660 ************************************ 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 
00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=227178 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 227178 /var/tmp/spdk-raid.sock 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@830 -- # '[' -z 227178 ']' 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:38:01.660 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:38:01.660 12:40:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:01.660 [2024-06-07 12:40:25.228909] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:38:01.660 [2024-06-07 12:40:25.229544] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid227178 ] 00:38:01.919 [2024-06-07 12:40:25.383822] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:01.919 [2024-06-07 12:40:25.458254] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:38:01.919 [2024-06-07 12:40:25.508127] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@863 -- # return 0 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:38:02.854 malloc1 00:38:02.854 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:38:03.113 [2024-06-07 12:40:26.652215] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:38:03.113 [2024-06-07 12:40:26.652589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:03.113 [2024-06-07 12:40:26.652687] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:38:03.113 [2024-06-07 12:40:26.652880] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:03.113 [2024-06-07 12:40:26.655187] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:03.113 [2024-06-07 12:40:26.655404] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:38:03.113 pt1 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:38:03.113 12:40:26 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:38:03.113 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:38:03.372 malloc2 00:38:03.372 12:40:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:38:03.630 [2024-06-07 12:40:27.061021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:38:03.630 [2024-06-07 12:40:27.061380] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:03.630 [2024-06-07 12:40:27.061568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:38:03.630 [2024-06-07 12:40:27.061729] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:03.630 [2024-06-07 12:40:27.064188] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:03.630 [2024-06-07 12:40:27.064406] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:38:03.630 pt2 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:38:03.630 [2024-06-07 12:40:27.253200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:38:03.630 [2024-06-07 12:40:27.255073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:38:03.630 [2024-06-07 12:40:27.255358] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006c80 00:38:03.630 [2024-06-07 12:40:27.255460] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:38:03.630 [2024-06-07 12:40:27.255621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:38:03.630 [2024-06-07 12:40:27.256003] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006c80 00:38:03.630 [2024-06-07 12:40:27.256102] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000006c80 00:38:03.630 [2024-06-07 12:40:27.256294] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
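Condensed, the fixture built over the last few steps is the following (same $rpc/$sock shorthand as before; the sizes match the traced JSON, where 32 MB of 4096-byte blocks yields num_blocks 8192):

# two malloc bdevs, each wrapped in a passthru bdev pinned to a fixed
# UUID so the raid superblock records deterministic base-bdev identities
$rpc -s $sock bdev_malloc_create 32 4096 -b malloc1
$rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc -s $sock bdev_malloc_create 32 4096 -b malloc2
$rpc -s $sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# assemble the mirrored pair with an on-disk superblock (-s)
$rpc -s $sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s

Note that the superblock reserves the first 256 blocks of each leg, which is why the raid_bdev_info above reports data_offset 256 and data_size 7936 out of the 8192 blocks each passthru bdev exposes.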
00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:03.630 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:03.888 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:03.889 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:03.889 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:03.889 "name": "raid_bdev1", 00:38:03.889 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:03.889 "strip_size_kb": 0, 00:38:03.889 "state": "online", 00:38:03.889 "raid_level": "raid1", 00:38:03.889 "superblock": true, 00:38:03.889 "num_base_bdevs": 2, 00:38:03.889 "num_base_bdevs_discovered": 2, 00:38:03.889 "num_base_bdevs_operational": 2, 00:38:03.889 "base_bdevs_list": [ 00:38:03.889 { 00:38:03.889 "name": "pt1", 00:38:03.889 "uuid": "00000000-0000-0000-0000-000000000001", 00:38:03.889 "is_configured": true, 00:38:03.889 "data_offset": 256, 00:38:03.889 "data_size": 7936 00:38:03.889 }, 00:38:03.889 { 00:38:03.889 "name": "pt2", 00:38:03.889 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:03.889 "is_configured": true, 00:38:03.889 "data_offset": 256, 00:38:03.889 "data_size": 7936 00:38:03.889 } 00:38:03.889 ] 00:38:03.889 }' 00:38:03.889 12:40:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:03.889 12:40:27 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:38:04.822 [2024-06-07 12:40:28.325462] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:38:04.822 "name": "raid_bdev1", 00:38:04.822 "aliases": [ 00:38:04.822 "65ae78d7-bba4-4a1f-b60b-b17f81c3503c" 00:38:04.822 ], 00:38:04.822 "product_name": "Raid Volume", 00:38:04.822 "block_size": 4096, 00:38:04.822 "num_blocks": 7936, 00:38:04.822 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:04.822 "assigned_rate_limits": { 00:38:04.822 
"rw_ios_per_sec": 0, 00:38:04.822 "rw_mbytes_per_sec": 0, 00:38:04.822 "r_mbytes_per_sec": 0, 00:38:04.822 "w_mbytes_per_sec": 0 00:38:04.822 }, 00:38:04.822 "claimed": false, 00:38:04.822 "zoned": false, 00:38:04.822 "supported_io_types": { 00:38:04.822 "read": true, 00:38:04.822 "write": true, 00:38:04.822 "unmap": false, 00:38:04.822 "write_zeroes": true, 00:38:04.822 "flush": false, 00:38:04.822 "reset": true, 00:38:04.822 "compare": false, 00:38:04.822 "compare_and_write": false, 00:38:04.822 "abort": false, 00:38:04.822 "nvme_admin": false, 00:38:04.822 "nvme_io": false 00:38:04.822 }, 00:38:04.822 "memory_domains": [ 00:38:04.822 { 00:38:04.822 "dma_device_id": "system", 00:38:04.822 "dma_device_type": 1 00:38:04.822 }, 00:38:04.822 { 00:38:04.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:04.822 "dma_device_type": 2 00:38:04.822 }, 00:38:04.822 { 00:38:04.822 "dma_device_id": "system", 00:38:04.822 "dma_device_type": 1 00:38:04.822 }, 00:38:04.822 { 00:38:04.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:04.822 "dma_device_type": 2 00:38:04.822 } 00:38:04.822 ], 00:38:04.822 "driver_specific": { 00:38:04.822 "raid": { 00:38:04.822 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:04.822 "strip_size_kb": 0, 00:38:04.822 "state": "online", 00:38:04.822 "raid_level": "raid1", 00:38:04.822 "superblock": true, 00:38:04.822 "num_base_bdevs": 2, 00:38:04.822 "num_base_bdevs_discovered": 2, 00:38:04.822 "num_base_bdevs_operational": 2, 00:38:04.822 "base_bdevs_list": [ 00:38:04.822 { 00:38:04.822 "name": "pt1", 00:38:04.822 "uuid": "00000000-0000-0000-0000-000000000001", 00:38:04.822 "is_configured": true, 00:38:04.822 "data_offset": 256, 00:38:04.822 "data_size": 7936 00:38:04.822 }, 00:38:04.822 { 00:38:04.822 "name": "pt2", 00:38:04.822 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:04.822 "is_configured": true, 00:38:04.822 "data_offset": 256, 00:38:04.822 "data_size": 7936 00:38:04.822 } 00:38:04.822 ] 00:38:04.822 } 00:38:04.822 } 00:38:04.822 }' 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:38:04.822 pt2' 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:38:04.822 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:38:05.129 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:38:05.129 "name": "pt1", 00:38:05.129 "aliases": [ 00:38:05.129 "00000000-0000-0000-0000-000000000001" 00:38:05.129 ], 00:38:05.129 "product_name": "passthru", 00:38:05.129 "block_size": 4096, 00:38:05.129 "num_blocks": 8192, 00:38:05.129 "uuid": "00000000-0000-0000-0000-000000000001", 00:38:05.129 "assigned_rate_limits": { 00:38:05.129 "rw_ios_per_sec": 0, 00:38:05.129 "rw_mbytes_per_sec": 0, 00:38:05.129 "r_mbytes_per_sec": 0, 00:38:05.129 "w_mbytes_per_sec": 0 00:38:05.129 }, 00:38:05.129 "claimed": true, 00:38:05.129 "claim_type": "exclusive_write", 00:38:05.129 "zoned": false, 00:38:05.129 "supported_io_types": { 00:38:05.129 "read": true, 00:38:05.129 "write": true, 00:38:05.129 "unmap": true, 00:38:05.129 "write_zeroes": true, 
00:38:05.129 "flush": true, 00:38:05.129 "reset": true, 00:38:05.129 "compare": false, 00:38:05.129 "compare_and_write": false, 00:38:05.129 "abort": true, 00:38:05.129 "nvme_admin": false, 00:38:05.129 "nvme_io": false 00:38:05.129 }, 00:38:05.129 "memory_domains": [ 00:38:05.129 { 00:38:05.129 "dma_device_id": "system", 00:38:05.129 "dma_device_type": 1 00:38:05.129 }, 00:38:05.129 { 00:38:05.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:05.129 "dma_device_type": 2 00:38:05.129 } 00:38:05.129 ], 00:38:05.129 "driver_specific": { 00:38:05.129 "passthru": { 00:38:05.129 "name": "pt1", 00:38:05.129 "base_bdev_name": "malloc1" 00:38:05.129 } 00:38:05.129 } 00:38:05.129 }' 00:38:05.129 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:05.129 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:05.129 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:38:05.129 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:05.129 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:05.387 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:38:05.388 12:40:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:38:05.645 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:38:05.645 "name": "pt2", 00:38:05.645 "aliases": [ 00:38:05.645 "00000000-0000-0000-0000-000000000002" 00:38:05.645 ], 00:38:05.646 "product_name": "passthru", 00:38:05.646 "block_size": 4096, 00:38:05.646 "num_blocks": 8192, 00:38:05.646 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:05.646 "assigned_rate_limits": { 00:38:05.646 "rw_ios_per_sec": 0, 00:38:05.646 "rw_mbytes_per_sec": 0, 00:38:05.646 "r_mbytes_per_sec": 0, 00:38:05.646 "w_mbytes_per_sec": 0 00:38:05.646 }, 00:38:05.646 "claimed": true, 00:38:05.646 "claim_type": "exclusive_write", 00:38:05.646 "zoned": false, 00:38:05.646 "supported_io_types": { 00:38:05.646 "read": true, 00:38:05.646 "write": true, 00:38:05.646 "unmap": true, 00:38:05.646 "write_zeroes": true, 00:38:05.646 "flush": true, 00:38:05.646 "reset": true, 00:38:05.646 "compare": false, 00:38:05.646 "compare_and_write": false, 00:38:05.646 "abort": true, 00:38:05.646 "nvme_admin": false, 00:38:05.646 "nvme_io": false 00:38:05.646 }, 00:38:05.646 "memory_domains": [ 00:38:05.646 { 00:38:05.646 "dma_device_id": "system", 00:38:05.646 "dma_device_type": 1 00:38:05.646 }, 00:38:05.646 { 00:38:05.646 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:05.646 "dma_device_type": 2 00:38:05.646 } 00:38:05.646 ], 00:38:05.646 "driver_specific": { 00:38:05.646 "passthru": { 00:38:05.646 "name": "pt2", 00:38:05.646 "base_bdev_name": "malloc2" 00:38:05.646 } 00:38:05.646 } 00:38:05.646 }' 00:38:05.646 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:05.646 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:05.646 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:38:05.646 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:05.646 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:05.903 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:38:06.161 [2024-06-07 12:40:29.761619] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:06.161 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=65ae78d7-bba4-4a1f-b60b-b17f81c3503c 00:38:06.161 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 65ae78d7-bba4-4a1f-b60b-b17f81c3503c ']' 00:38:06.161 12:40:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:38:06.418 [2024-06-07 12:40:29.997519] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:06.418 [2024-06-07 12:40:29.997685] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:06.418 [2024-06-07 12:40:29.997884] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:06.418 [2024-06-07 12:40:29.998068] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:06.418 [2024-06-07 12:40:29.998163] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006c80 name raid_bdev1, state offline 00:38:06.418 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:38:06.418 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:06.675 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:38:06.675 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:38:06.675 12:40:30 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:38:06.675 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:38:06.933 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:38:06.933 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:38:07.191 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:38:07.191 12:40:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@649 -- # local es=0 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:38:07.449 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:38:07.708 [2024-06-07 12:40:31.341683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:38:07.708 [2024-06-07 12:40:31.343664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:38:07.708 [2024-06-07 12:40:31.343849] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:38:07.708 [2024-06-07 12:40:31.344015] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:38:07.708 [2024-06-07 12:40:31.344090] bdev_raid.c:2356:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:38:07.708 [2024-06-07 12:40:31.344162] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state configuring 00:38:07.708 request: 00:38:07.708 { 00:38:07.708 "name": "raid_bdev1", 00:38:07.708 "raid_level": "raid1", 00:38:07.708 "base_bdevs": [ 00:38:07.708 "malloc1", 00:38:07.708 "malloc2" 00:38:07.708 ], 00:38:07.708 "superblock": false, 00:38:07.708 "method": "bdev_raid_create", 00:38:07.708 "req_id": 1 00:38:07.708 } 00:38:07.708 Got JSON-RPC error response 00:38:07.708 response: 00:38:07.708 { 00:38:07.708 "code": -17, 00:38:07.708 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:38:07.708 } 00:38:07.966 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # es=1 00:38:07.966 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:38:07.966 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:38:07.966 12:40:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:38:07.966 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:38:07.966 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:08.223 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:38:08.223 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:38:08.223 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:38:08.482 [2024-06-07 12:40:31.881694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:38:08.482 [2024-06-07 12:40:31.882048] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:08.482 [2024-06-07 12:40:31.882176] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:38:08.482 [2024-06-07 12:40:31.882295] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:08.482 [2024-06-07 12:40:31.884200] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:08.482 [2024-06-07 12:40:31.884380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:38:08.482 [2024-06-07 12:40:31.884550] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:38:08.482 [2024-06-07 12:40:31.884727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:38:08.482 pt1 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
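The bdev_raid_create over 'malloc1 malloc2' above is a deliberate negative test: both malloc bdevs still carry the superblock written for the original raid_bdev1, so configuration is rejected with -17 (File exists). The NOT wrapper from autotest_common.sh simply inverts the exit status, so the assertion amounts to roughly:

  if rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1; then
    echo 'expected bdev_raid_create to fail with File exists' >&2
    exit 1
  fi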
00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:08.482 12:40:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:08.482 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:08.482 "name": "raid_bdev1", 00:38:08.482 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:08.482 "strip_size_kb": 0, 00:38:08.482 "state": "configuring", 00:38:08.482 "raid_level": "raid1", 00:38:08.482 "superblock": true, 00:38:08.482 "num_base_bdevs": 2, 00:38:08.482 "num_base_bdevs_discovered": 1, 00:38:08.482 "num_base_bdevs_operational": 2, 00:38:08.482 "base_bdevs_list": [ 00:38:08.482 { 00:38:08.482 "name": "pt1", 00:38:08.482 "uuid": "00000000-0000-0000-0000-000000000001", 00:38:08.482 "is_configured": true, 00:38:08.482 "data_offset": 256, 00:38:08.482 "data_size": 7936 00:38:08.482 }, 00:38:08.482 { 00:38:08.482 "name": null, 00:38:08.482 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:08.482 "is_configured": false, 00:38:08.482 "data_offset": 256, 00:38:08.482 "data_size": 7936 00:38:08.482 } 00:38:08.482 ] 00:38:08.482 }' 00:38:08.482 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:08.482 12:40:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:09.046 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:38:09.046 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:38:09.046 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:38:09.046 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:38:09.304 [2024-06-07 12:40:32.905854] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:38:09.304 [2024-06-07 12:40:32.906198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:09.304 [2024-06-07 12:40:32.906365] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008180 00:38:09.304 [2024-06-07 12:40:32.906487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:09.304 [2024-06-07 12:40:32.906924] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:09.304 [2024-06-07 12:40:32.907111] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:38:09.304 [2024-06-07 12:40:32.907338] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:38:09.304 [2024-06-07 12:40:32.907466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:38:09.304 [2024-06-07 12:40:32.907607] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 
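Recreating the passthru bdevs triggers examine, which finds the raid superblock on each and reclaims them: pt1 alone brings raid_bdev1 back in the configuring state with one of two base bdevs discovered, and adding pt2 completes the set and transitions the array online. The verify_raid_bdev_state checks that bracket these steps reduce to a jq filter over the RPC output, roughly:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs_operational)"'
  # expected after pt1 alone: configuring 1/2; after pt2 as well: online 2/2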
00:38:09.304 [2024-06-07 12:40:32.907773] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:38:09.304 [2024-06-07 12:40:32.907878] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002530 00:38:09.304 [2024-06-07 12:40:32.908288] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:38:09.304 [2024-06-07 12:40:32.908426] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:38:09.304 [2024-06-07 12:40:32.908591] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:09.304 pt2 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:09.304 12:40:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:09.562 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:09.562 "name": "raid_bdev1", 00:38:09.562 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:09.562 "strip_size_kb": 0, 00:38:09.562 "state": "online", 00:38:09.562 "raid_level": "raid1", 00:38:09.562 "superblock": true, 00:38:09.562 "num_base_bdevs": 2, 00:38:09.562 "num_base_bdevs_discovered": 2, 00:38:09.562 "num_base_bdevs_operational": 2, 00:38:09.562 "base_bdevs_list": [ 00:38:09.563 { 00:38:09.563 "name": "pt1", 00:38:09.563 "uuid": "00000000-0000-0000-0000-000000000001", 00:38:09.563 "is_configured": true, 00:38:09.563 "data_offset": 256, 00:38:09.563 "data_size": 7936 00:38:09.563 }, 00:38:09.563 { 00:38:09.563 "name": "pt2", 00:38:09.563 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:09.563 "is_configured": true, 00:38:09.563 "data_offset": 256, 00:38:09.563 "data_size": 7936 00:38:09.563 } 00:38:09.563 ] 00:38:09.563 }' 00:38:09.563 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:09.563 12:40:33 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:38:10.130 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:10.388 [2024-06-07 12:40:33.862101] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:10.388 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:38:10.388 "name": "raid_bdev1", 00:38:10.388 "aliases": [ 00:38:10.388 "65ae78d7-bba4-4a1f-b60b-b17f81c3503c" 00:38:10.388 ], 00:38:10.388 "product_name": "Raid Volume", 00:38:10.388 "block_size": 4096, 00:38:10.388 "num_blocks": 7936, 00:38:10.388 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:10.388 "assigned_rate_limits": { 00:38:10.388 "rw_ios_per_sec": 0, 00:38:10.388 "rw_mbytes_per_sec": 0, 00:38:10.388 "r_mbytes_per_sec": 0, 00:38:10.388 "w_mbytes_per_sec": 0 00:38:10.388 }, 00:38:10.388 "claimed": false, 00:38:10.388 "zoned": false, 00:38:10.388 "supported_io_types": { 00:38:10.388 "read": true, 00:38:10.388 "write": true, 00:38:10.388 "unmap": false, 00:38:10.388 "write_zeroes": true, 00:38:10.388 "flush": false, 00:38:10.388 "reset": true, 00:38:10.388 "compare": false, 00:38:10.388 "compare_and_write": false, 00:38:10.388 "abort": false, 00:38:10.388 "nvme_admin": false, 00:38:10.388 "nvme_io": false 00:38:10.388 }, 00:38:10.388 "memory_domains": [ 00:38:10.388 { 00:38:10.388 "dma_device_id": "system", 00:38:10.388 "dma_device_type": 1 00:38:10.388 }, 00:38:10.388 { 00:38:10.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:10.388 "dma_device_type": 2 00:38:10.388 }, 00:38:10.388 { 00:38:10.388 "dma_device_id": "system", 00:38:10.388 "dma_device_type": 1 00:38:10.388 }, 00:38:10.388 { 00:38:10.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:10.388 "dma_device_type": 2 00:38:10.388 } 00:38:10.388 ], 00:38:10.388 "driver_specific": { 00:38:10.388 "raid": { 00:38:10.388 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:10.388 "strip_size_kb": 0, 00:38:10.388 "state": "online", 00:38:10.388 "raid_level": "raid1", 00:38:10.388 "superblock": true, 00:38:10.388 "num_base_bdevs": 2, 00:38:10.388 "num_base_bdevs_discovered": 2, 00:38:10.388 "num_base_bdevs_operational": 2, 00:38:10.388 "base_bdevs_list": [ 00:38:10.388 { 00:38:10.388 "name": "pt1", 00:38:10.388 "uuid": "00000000-0000-0000-0000-000000000001", 00:38:10.388 "is_configured": true, 00:38:10.388 "data_offset": 256, 00:38:10.389 "data_size": 7936 00:38:10.389 }, 00:38:10.389 { 00:38:10.389 "name": "pt2", 00:38:10.389 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:10.389 "is_configured": true, 00:38:10.389 "data_offset": 256, 00:38:10.389 "data_size": 7936 00:38:10.389 } 00:38:10.389 ] 00:38:10.389 } 00:38:10.389 } 00:38:10.389 }' 00:38:10.389 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:38:10.389 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:38:10.389 pt2' 00:38:10.389 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:38:10.389 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:38:10.389 12:40:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:38:10.647 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:38:10.647 "name": "pt1", 00:38:10.647 "aliases": [ 00:38:10.647 "00000000-0000-0000-0000-000000000001" 00:38:10.647 ], 00:38:10.647 "product_name": "passthru", 00:38:10.647 "block_size": 4096, 00:38:10.647 "num_blocks": 8192, 00:38:10.647 "uuid": "00000000-0000-0000-0000-000000000001", 00:38:10.647 "assigned_rate_limits": { 00:38:10.647 "rw_ios_per_sec": 0, 00:38:10.647 "rw_mbytes_per_sec": 0, 00:38:10.647 "r_mbytes_per_sec": 0, 00:38:10.647 "w_mbytes_per_sec": 0 00:38:10.647 }, 00:38:10.647 "claimed": true, 00:38:10.647 "claim_type": "exclusive_write", 00:38:10.647 "zoned": false, 00:38:10.647 "supported_io_types": { 00:38:10.647 "read": true, 00:38:10.647 "write": true, 00:38:10.647 "unmap": true, 00:38:10.647 "write_zeroes": true, 00:38:10.647 "flush": true, 00:38:10.647 "reset": true, 00:38:10.647 "compare": false, 00:38:10.647 "compare_and_write": false, 00:38:10.647 "abort": true, 00:38:10.647 "nvme_admin": false, 00:38:10.647 "nvme_io": false 00:38:10.647 }, 00:38:10.647 "memory_domains": [ 00:38:10.647 { 00:38:10.647 "dma_device_id": "system", 00:38:10.647 "dma_device_type": 1 00:38:10.647 }, 00:38:10.647 { 00:38:10.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:10.647 "dma_device_type": 2 00:38:10.647 } 00:38:10.647 ], 00:38:10.647 "driver_specific": { 00:38:10.647 "passthru": { 00:38:10.647 "name": "pt1", 00:38:10.647 "base_bdev_name": "malloc1" 00:38:10.647 } 00:38:10.647 } 00:38:10.647 }' 00:38:10.647 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:10.647 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:10.647 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:38:10.647 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:10.647 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:38:10.905 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:38:11.163 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:38:11.163 "name": "pt2", 00:38:11.163 "aliases": [ 00:38:11.163 "00000000-0000-0000-0000-000000000002" 00:38:11.163 ], 00:38:11.163 "product_name": "passthru", 00:38:11.163 "block_size": 4096, 00:38:11.163 "num_blocks": 8192, 00:38:11.163 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:11.163 "assigned_rate_limits": { 00:38:11.163 "rw_ios_per_sec": 0, 00:38:11.163 "rw_mbytes_per_sec": 0, 00:38:11.163 "r_mbytes_per_sec": 0, 00:38:11.163 "w_mbytes_per_sec": 0 00:38:11.163 }, 00:38:11.163 "claimed": true, 00:38:11.163 "claim_type": "exclusive_write", 00:38:11.163 "zoned": false, 00:38:11.163 "supported_io_types": { 00:38:11.163 "read": true, 00:38:11.163 "write": true, 00:38:11.163 "unmap": true, 00:38:11.163 "write_zeroes": true, 00:38:11.163 "flush": true, 00:38:11.163 "reset": true, 00:38:11.163 "compare": false, 00:38:11.163 "compare_and_write": false, 00:38:11.163 "abort": true, 00:38:11.163 "nvme_admin": false, 00:38:11.163 "nvme_io": false 00:38:11.163 }, 00:38:11.163 "memory_domains": [ 00:38:11.163 { 00:38:11.163 "dma_device_id": "system", 00:38:11.163 "dma_device_type": 1 00:38:11.163 }, 00:38:11.163 { 00:38:11.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:11.163 "dma_device_type": 2 00:38:11.163 } 00:38:11.163 ], 00:38:11.163 "driver_specific": { 00:38:11.163 "passthru": { 00:38:11.163 "name": "pt2", 00:38:11.163 "base_bdev_name": "malloc2" 00:38:11.163 } 00:38:11.163 } 00:38:11.163 }' 00:38:11.163 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:11.420 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:11.420 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:38:11.420 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:11.420 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:11.420 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:38:11.420 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:11.420 12:40:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:11.420 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:38:11.420 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:11.678 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:38:11.678 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:38:11.678 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:38:11.678 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:11.935 [2024-06-07 12:40:35.398360] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:11.935 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 65ae78d7-bba4-4a1f-b60b-b17f81c3503c '!=' 
65ae78d7-bba4-4a1f-b60b-b17f81c3503c ']' 00:38:11.935 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:38:11.935 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:38:11.935 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:38:11.935 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:38:12.192 [2024-06-07 12:40:35.614223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:12.193 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:12.451 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:12.451 "name": "raid_bdev1", 00:38:12.451 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:12.451 "strip_size_kb": 0, 00:38:12.451 "state": "online", 00:38:12.451 "raid_level": "raid1", 00:38:12.451 "superblock": true, 00:38:12.451 "num_base_bdevs": 2, 00:38:12.451 "num_base_bdevs_discovered": 1, 00:38:12.451 "num_base_bdevs_operational": 1, 00:38:12.451 "base_bdevs_list": [ 00:38:12.451 { 00:38:12.451 "name": null, 00:38:12.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:12.451 "is_configured": false, 00:38:12.451 "data_offset": 256, 00:38:12.451 "data_size": 7936 00:38:12.451 }, 00:38:12.451 { 00:38:12.451 "name": "pt2", 00:38:12.451 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:12.451 "is_configured": true, 00:38:12.451 "data_offset": 256, 00:38:12.451 "data_size": 7936 00:38:12.451 } 00:38:12.451 ] 00:38:12.451 }' 00:38:12.451 12:40:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:12.451 12:40:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:13.015 12:40:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:38:13.273 [2024-06-07 12:40:36.810434] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:13.273 [2024-06-07 
12:40:36.810479] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:13.273 [2024-06-07 12:40:36.810537] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:13.273 [2024-06-07 12:40:36.810572] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:13.273 [2024-06-07 12:40:36.810582] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:38:13.273 12:40:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:13.273 12:40:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:38:13.530 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:38:13.530 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:38:13.530 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:38:13.530 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:38:13.530 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:38:13.787 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:38:13.787 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:38:13.787 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:38:13.787 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:38:13.787 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:38:13.787 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:38:14.045 [2024-06-07 12:40:37.614541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:38:14.045 [2024-06-07 12:40:37.614685] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:14.045 [2024-06-07 12:40:37.614714] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:38:14.045 [2024-06-07 12:40:37.614772] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:14.045 [2024-06-07 12:40:37.616713] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:14.045 [2024-06-07 12:40:37.616773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:38:14.045 [2024-06-07 12:40:37.616840] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:38:14.045 [2024-06-07 12:40:37.616867] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:38:14.045 [2024-06-07 12:40:37.616926] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008a80 00:38:14.045 [2024-06-07 12:40:37.616936] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:38:14.045 [2024-06-07 12:40:37.616999] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000026d0 00:38:14.045 [2024-06-07 12:40:37.617178] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008a80 00:38:14.045 [2024-06-07 12:40:37.617195] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008a80 00:38:14.045 [2024-06-07 12:40:37.617276] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:14.045 pt2 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:14.045 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:14.610 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:14.610 "name": "raid_bdev1", 00:38:14.610 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:14.610 "strip_size_kb": 0, 00:38:14.610 "state": "online", 00:38:14.610 "raid_level": "raid1", 00:38:14.610 "superblock": true, 00:38:14.610 "num_base_bdevs": 2, 00:38:14.610 "num_base_bdevs_discovered": 1, 00:38:14.610 "num_base_bdevs_operational": 1, 00:38:14.610 "base_bdevs_list": [ 00:38:14.610 { 00:38:14.610 "name": null, 00:38:14.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:14.610 "is_configured": false, 00:38:14.610 "data_offset": 256, 00:38:14.610 "data_size": 7936 00:38:14.610 }, 00:38:14.610 { 00:38:14.610 "name": "pt2", 00:38:14.610 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:14.610 "is_configured": true, 00:38:14.610 "data_offset": 256, 00:38:14.610 "data_size": 7936 00:38:14.610 } 00:38:14.610 ] 00:38:14.610 }' 00:38:14.610 12:40:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:14.610 12:40:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:15.176 12:40:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:38:15.176 [2024-06-07 12:40:38.742763] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:15.176 [2024-06-07 12:40:38.742827] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:15.176 [2024-06-07 12:40:38.742909] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:15.176 [2024-06-07 
12:40:38.742957] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:15.176 [2024-06-07 12:40:38.742968] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state offline 00:38:15.176 12:40:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:15.176 12:40:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:38:15.434 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:38:15.435 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:38:15.435 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:38:15.435 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:38:15.693 [2024-06-07 12:40:39.330813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:38:15.693 [2024-06-07 12:40:39.330991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:15.693 [2024-06-07 12:40:39.331049] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:38:15.693 [2024-06-07 12:40:39.331076] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:15.693 [2024-06-07 12:40:39.333452] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:15.693 [2024-06-07 12:40:39.333519] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:38:15.693 [2024-06-07 12:40:39.333610] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:38:15.693 [2024-06-07 12:40:39.333646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:38:15.693 [2024-06-07 12:40:39.333788] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:38:15.693 [2024-06-07 12:40:39.333805] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:15.693 [2024-06-07 12:40:39.333836] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009380 name raid_bdev1, state configuring 00:38:15.693 [2024-06-07 12:40:39.333897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:38:15.693 [2024-06-07 12:40:39.333971] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:38:15.693 [2024-06-07 12:40:39.333992] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:38:15.693 [2024-06-07 12:40:39.334059] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002870 00:38:15.693 [2024-06-07 12:40:39.334286] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:38:15.693 [2024-06-07 12:40:39.334306] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:38:15.693 [2024-06-07 12:40:39.334382] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:15.693 pt1 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:38:15.951 
12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:15.951 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:16.209 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:16.209 "name": "raid_bdev1", 00:38:16.209 "uuid": "65ae78d7-bba4-4a1f-b60b-b17f81c3503c", 00:38:16.209 "strip_size_kb": 0, 00:38:16.209 "state": "online", 00:38:16.209 "raid_level": "raid1", 00:38:16.209 "superblock": true, 00:38:16.209 "num_base_bdevs": 2, 00:38:16.209 "num_base_bdevs_discovered": 1, 00:38:16.209 "num_base_bdevs_operational": 1, 00:38:16.209 "base_bdevs_list": [ 00:38:16.209 { 00:38:16.209 "name": null, 00:38:16.209 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:16.209 "is_configured": false, 00:38:16.209 "data_offset": 256, 00:38:16.209 "data_size": 7936 00:38:16.209 }, 00:38:16.209 { 00:38:16.209 "name": "pt2", 00:38:16.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:38:16.209 "is_configured": true, 00:38:16.209 "data_offset": 256, 00:38:16.209 "data_size": 7936 00:38:16.209 } 00:38:16.209 ] 00:38:16.209 }' 00:38:16.209 12:40:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:16.209 12:40:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:16.803 12:40:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:38:16.803 12:40:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:38:17.064 12:40:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:38:17.064 12:40:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:38:17.064 12:40:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:17.631 [2024-06-07 12:40:40.971213] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:17.631 12:40:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 
65ae78d7-bba4-4a1f-b60b-b17f81c3503c '!=' 65ae78d7-bba4-4a1f-b60b-b17f81c3503c ']' 00:38:17.631 12:40:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 227178 00:38:17.631 12:40:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@949 -- # '[' -z 227178 ']' 00:38:17.631 12:40:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # kill -0 227178 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # uname 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 227178 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 227178' 00:38:17.631 killing process with pid 227178 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # kill 227178 00:38:17.631 [2024-06-07 12:40:41.034112] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:38:17.631 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@973 -- # wait 227178 00:38:17.631 [2024-06-07 12:40:41.034385] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:17.631 [2024-06-07 12:40:41.034436] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:17.631 [2024-06-07 12:40:41.034448] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:38:17.631 [2024-06-07 12:40:41.059458] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:38:17.890 12:40:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:38:17.890 00:38:17.890 real 0m16.101s 00:38:17.890 user 0m29.492s 00:38:17.890 sys 0m2.517s 00:38:17.890 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:38:17.890 12:40:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:38:17.890 ************************************ 00:38:17.890 END TEST raid_superblock_test_4k 00:38:17.890 ************************************ 00:38:17.890 12:40:41 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:38:17.890 12:40:41 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:38:17.890 12:40:41 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:38:17.890 12:40:41 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:38:17.890 12:40:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:38:17.890 ************************************ 00:38:17.890 START TEST raid_rebuild_test_sb_4k 00:38:17.890 ************************************ 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:38:17.890 12:40:41 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=227696 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 227696 /var/tmp/spdk-raid.sock 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 227696 ']' 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:38:17.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
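Unlike the superblock test, the rebuild test drives the array from bdevperf rather than from bare RPC calls: the app is started on its own RPC socket and, with -z, sits idle until the test has built the bdevs and kicks off I/O over RPC. The queued workload is 60 seconds of 50/50 random read/write at queue depth 2 with 3 MiB I/Os, which is why the harness notes that the 3145728-byte I/O size exceeds the 65536-byte zero-copy threshold. The launch amounts to roughly:

  /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock \
      -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock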
00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:38:17.890 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:17.890 [2024-06-07 12:40:41.413098] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:38:17.890 [2024-06-07 12:40:41.413866] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid227696 ] 00:38:17.890 I/O size of 3145728 is greater than zero copy threshold (65536). 00:38:17.890 Zero copy mechanism will not be used. 00:38:18.148 [2024-06-07 12:40:41.562581] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:18.148 [2024-06-07 12:40:41.622996] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:38:18.148 [2024-06-07 12:40:41.673531] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:38:18.148 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:38:18.148 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:38:18.148 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:18.148 12:40:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:38:18.406 BaseBdev1_malloc 00:38:18.406 12:40:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:38:18.971 [2024-06-07 12:40:42.412580] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:38:18.971 [2024-06-07 12:40:42.412933] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:18.971 [2024-06-07 12:40:42.413026] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:38:18.971 [2024-06-07 12:40:42.413284] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:18.971 [2024-06-07 12:40:42.415374] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:18.971 [2024-06-07 12:40:42.415539] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:38:18.971 BaseBdev1 00:38:18.971 12:40:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:38:18.971 12:40:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:38:19.228 BaseBdev2_malloc 00:38:19.228 12:40:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:38:19.486 [2024-06-07 12:40:42.941559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:38:19.486 [2024-06-07 12:40:42.941846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:19.486 [2024-06-07 12:40:42.941944] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 
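Each base bdev in this test is a two-layer stack built over RPC: a 32 MiB malloc bdev with 4096-byte blocks, wrapped in a passthru bdev so the RAID module can claim it under a stable name. The exact pair of calls the trace above records for BaseBdev1 (BaseBdev2 is identical apart from the names):

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # 32 MiB of RAM-backed storage, 4096-byte logical blocks
    "$rpc" -s "$sock" bdev_malloc_create 32 4096 -b BaseBdev1_malloc
    # Thin passthru layer exposing the malloc bdev as BaseBdev1
    "$rpc" -s "$sock" bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1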
00:38:19.486 [2024-06-07 12:40:42.942294] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:19.486 [2024-06-07 12:40:42.944567] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:19.486 [2024-06-07 12:40:42.944761] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:38:19.486 BaseBdev2 00:38:19.486 12:40:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:38:19.745 spare_malloc 00:38:19.745 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:38:20.004 spare_delay 00:38:20.004 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:20.264 [2024-06-07 12:40:43.678946] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:20.264 [2024-06-07 12:40:43.679312] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:20.264 [2024-06-07 12:40:43.679522] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:38:20.264 [2024-06-07 12:40:43.679760] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:20.264 [2024-06-07 12:40:43.682013] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:20.264 [2024-06-07 12:40:43.682218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:20.264 spare 00:38:20.264 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:38:20.523 [2024-06-07 12:40:43.955138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:38:20.523 [2024-06-07 12:40:43.957178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:38:20.523 [2024-06-07 12:40:43.957524] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:38:20.523 [2024-06-07 12:40:43.957653] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:38:20.523 [2024-06-07 12:40:43.957847] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:38:20.523 [2024-06-07 12:40:43.958387] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:38:20.523 [2024-06-07 12:40:43.958533] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:38:20.523 [2024-06-07 12:40:43.958779] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
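The spare device gets one extra layer: a delay bdev between the malloc and the passthru, so later stages can slow its I/O down (reads pass through untouched; writes are delayed by 100000 us, average and p99). Once all three devices exist, the RAID1 volume is created with -s so a superblock is written to every member; that superblock is what the final part of the test exercises. A condensed sketch of the calls traced above:

    # spare = malloc -> delay -> passthru
    "$rpc" -s "$sock" bdev_malloc_create 32 4096 -b spare_malloc
    "$rpc" -s "$sock" bdev_delay_create -b spare_malloc -d spare_delay \
        -r 0 -t 0 -w 100000 -n 100000   # avg/p99 read 0 us, avg/p99 write 100000 us
    "$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare

    # RAID1 over the two base bdevs; -s = store a superblock on each member
    "$rpc" -s "$sock" bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1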
00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:20.523 12:40:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:20.782 12:40:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:20.782 "name": "raid_bdev1", 00:38:20.782 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:20.782 "strip_size_kb": 0, 00:38:20.782 "state": "online", 00:38:20.782 "raid_level": "raid1", 00:38:20.782 "superblock": true, 00:38:20.782 "num_base_bdevs": 2, 00:38:20.782 "num_base_bdevs_discovered": 2, 00:38:20.782 "num_base_bdevs_operational": 2, 00:38:20.782 "base_bdevs_list": [ 00:38:20.782 { 00:38:20.782 "name": "BaseBdev1", 00:38:20.782 "uuid": "45ad2dc6-a8c5-51a0-9ac5-1f8205b964e1", 00:38:20.782 "is_configured": true, 00:38:20.782 "data_offset": 256, 00:38:20.782 "data_size": 7936 00:38:20.782 }, 00:38:20.782 { 00:38:20.782 "name": "BaseBdev2", 00:38:20.782 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:20.782 "is_configured": true, 00:38:20.782 "data_offset": 256, 00:38:20.782 "data_size": 7936 00:38:20.782 } 00:38:20.782 ] 00:38:20.782 }' 00:38:20.782 12:40:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:20.782 12:40:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:21.349 12:40:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:38:21.349 12:40:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:38:21.607 [2024-06-07 12:40:45.123395] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:21.607 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:38:21.607 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:21.607 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 
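verify_raid_bdev_state, whose trace starts above, uses no dedicated RPC: it pulls the full bdev_raid_get_bdevs dump, filters out the bdev of interest with jq, and asserts on individual fields (state, raid_level, num_base_bdevs_discovered, and so on) against the expected values passed in. A trimmed-down sketch of that check for the state the JSON above reports:

    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "raid_bdev1")')

    [[ $(jq -r '.state' <<< "$info") == online ]]
    [[ $(jq -r '.raid_level' <<< "$info") == raid1 ]]
    [[ $(jq -r '.num_base_bdevs_discovered' <<< "$info") -eq 2 ]]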
00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:21.867 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:38:22.128 [2024-06-07 12:40:45.535331] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:38:22.128 /dev/nbd0 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:22.128 1+0 records in 00:38:22.128 1+0 records out 00:38:22.128 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000369056 s, 11.1 MB/s 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:38:22.128 12:40:45 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:38:22.128 12:40:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:38:22.694 7936+0 records in 00:38:22.694 7936+0 records out 00:38:22.694 32505856 bytes (33 MB, 31 MiB) copied, 0.665046 s, 48.9 MB/s 00:38:22.694 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:38:22.694 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:22.694 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:38:22.694 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:22.694 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:38:22.694 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:22.694 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:38:22.952 [2024-06-07 12:40:46.534308] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:38:22.952 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:38:23.210 [2024-06-07 12:40:46.814052] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:23.210 12:40:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:23.468 12:40:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:23.468 "name": "raid_bdev1", 00:38:23.468 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:23.468 "strip_size_kb": 0, 00:38:23.468 "state": "online", 00:38:23.468 "raid_level": "raid1", 00:38:23.468 "superblock": true, 00:38:23.468 "num_base_bdevs": 2, 00:38:23.468 "num_base_bdevs_discovered": 1, 00:38:23.468 "num_base_bdevs_operational": 1, 00:38:23.468 "base_bdevs_list": [ 00:38:23.468 { 00:38:23.468 "name": null, 00:38:23.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:23.468 "is_configured": false, 00:38:23.468 "data_offset": 256, 00:38:23.468 "data_size": 7936 00:38:23.468 }, 00:38:23.468 { 00:38:23.468 "name": "BaseBdev2", 00:38:23.468 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:23.468 "is_configured": true, 00:38:23.468 "data_offset": 256, 00:38:23.468 "data_size": 7936 00:38:23.468 } 00:38:23.468 ] 00:38:23.468 }' 00:38:23.468 12:40:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:23.468 12:40:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:24.035 12:40:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:24.293 [2024-06-07 12:40:47.834281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:24.293 [2024-06-07 12:40:47.842625] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00019c960 00:38:24.293 [2024-06-07 12:40:47.845322] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:24.293 12:40:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:38:25.227 12:40:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:25.227 12:40:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:25.227 12:40:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:25.227 12:40:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:25.227 12:40:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:25.227 12:40:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:25.227 12:40:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:25.792 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:25.792 "name": "raid_bdev1", 00:38:25.792 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:25.792 "strip_size_kb": 0, 00:38:25.792 "state": "online", 00:38:25.792 "raid_level": "raid1", 00:38:25.792 "superblock": true, 00:38:25.792 "num_base_bdevs": 2, 00:38:25.792 "num_base_bdevs_discovered": 2, 00:38:25.792 "num_base_bdevs_operational": 2, 
00:38:25.792 "process": { 00:38:25.792 "type": "rebuild", 00:38:25.792 "target": "spare", 00:38:25.792 "progress": { 00:38:25.792 "blocks": 3072, 00:38:25.792 "percent": 38 00:38:25.792 } 00:38:25.792 }, 00:38:25.792 "base_bdevs_list": [ 00:38:25.792 { 00:38:25.792 "name": "spare", 00:38:25.792 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:25.792 "is_configured": true, 00:38:25.792 "data_offset": 256, 00:38:25.792 "data_size": 7936 00:38:25.792 }, 00:38:25.792 { 00:38:25.792 "name": "BaseBdev2", 00:38:25.792 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:25.792 "is_configured": true, 00:38:25.792 "data_offset": 256, 00:38:25.792 "data_size": 7936 00:38:25.792 } 00:38:25.792 ] 00:38:25.792 }' 00:38:25.792 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:25.792 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:25.792 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:25.792 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:25.792 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:38:26.049 [2024-06-07 12:40:49.550292] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:26.049 [2024-06-07 12:40:49.565220] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:38:26.049 [2024-06-07 12:40:49.565385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:26.049 [2024-06-07 12:40:49.565404] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:26.049 [2024-06-07 12:40:49.565414] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:26.049 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:26.307 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:38:26.307 "name": "raid_bdev1", 00:38:26.307 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:26.307 "strip_size_kb": 0, 00:38:26.307 "state": "online", 00:38:26.307 "raid_level": "raid1", 00:38:26.307 "superblock": true, 00:38:26.307 "num_base_bdevs": 2, 00:38:26.307 "num_base_bdevs_discovered": 1, 00:38:26.307 "num_base_bdevs_operational": 1, 00:38:26.307 "base_bdevs_list": [ 00:38:26.307 { 00:38:26.307 "name": null, 00:38:26.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:26.307 "is_configured": false, 00:38:26.307 "data_offset": 256, 00:38:26.307 "data_size": 7936 00:38:26.307 }, 00:38:26.307 { 00:38:26.307 "name": "BaseBdev2", 00:38:26.307 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:26.307 "is_configured": true, 00:38:26.307 "data_offset": 256, 00:38:26.307 "data_size": 7936 00:38:26.307 } 00:38:26.307 ] 00:38:26.307 }' 00:38:26.307 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:26.307 12:40:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:27.240 "name": "raid_bdev1", 00:38:27.240 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:27.240 "strip_size_kb": 0, 00:38:27.240 "state": "online", 00:38:27.240 "raid_level": "raid1", 00:38:27.240 "superblock": true, 00:38:27.240 "num_base_bdevs": 2, 00:38:27.240 "num_base_bdevs_discovered": 1, 00:38:27.240 "num_base_bdevs_operational": 1, 00:38:27.240 "base_bdevs_list": [ 00:38:27.240 { 00:38:27.240 "name": null, 00:38:27.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:27.240 "is_configured": false, 00:38:27.240 "data_offset": 256, 00:38:27.240 "data_size": 7936 00:38:27.240 }, 00:38:27.240 { 00:38:27.240 "name": "BaseBdev2", 00:38:27.240 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:27.240 "is_configured": true, 00:38:27.240 "data_offset": 256, 00:38:27.240 "data_size": 7936 00:38:27.240 } 00:38:27.240 ] 00:38:27.240 }' 00:38:27.240 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:27.498 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:27.498 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:27.498 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:27.498 12:40:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev 
raid_bdev1 spare 00:38:27.755 [2024-06-07 12:40:51.152043] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:27.755 [2024-06-07 12:40:51.159878] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00019cb00 00:38:27.755 [2024-06-07 12:40:51.161985] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:27.755 12:40:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:38:28.706 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:28.706 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:28.706 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:28.706 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:28.706 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:28.706 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:28.706 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:28.963 "name": "raid_bdev1", 00:38:28.963 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:28.963 "strip_size_kb": 0, 00:38:28.963 "state": "online", 00:38:28.963 "raid_level": "raid1", 00:38:28.963 "superblock": true, 00:38:28.963 "num_base_bdevs": 2, 00:38:28.963 "num_base_bdevs_discovered": 2, 00:38:28.963 "num_base_bdevs_operational": 2, 00:38:28.963 "process": { 00:38:28.963 "type": "rebuild", 00:38:28.963 "target": "spare", 00:38:28.963 "progress": { 00:38:28.963 "blocks": 3328, 00:38:28.963 "percent": 41 00:38:28.963 } 00:38:28.963 }, 00:38:28.963 "base_bdevs_list": [ 00:38:28.963 { 00:38:28.963 "name": "spare", 00:38:28.963 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:28.963 "is_configured": true, 00:38:28.963 "data_offset": 256, 00:38:28.963 "data_size": 7936 00:38:28.963 }, 00:38:28.963 { 00:38:28.963 "name": "BaseBdev2", 00:38:28.963 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:28.963 "is_configured": true, 00:38:28.963 "data_offset": 256, 00:38:28.963 "data_size": 7936 00:38:28.963 } 00:38:28.963 ] 00:38:28.963 }' 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:38:28.963 /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:38:28.963 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 
raid1 = raid1 ']' 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1059 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:29.220 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:29.477 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:29.477 "name": "raid_bdev1", 00:38:29.477 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:29.477 "strip_size_kb": 0, 00:38:29.477 "state": "online", 00:38:29.477 "raid_level": "raid1", 00:38:29.477 "superblock": true, 00:38:29.477 "num_base_bdevs": 2, 00:38:29.477 "num_base_bdevs_discovered": 2, 00:38:29.477 "num_base_bdevs_operational": 2, 00:38:29.477 "process": { 00:38:29.477 "type": "rebuild", 00:38:29.477 "target": "spare", 00:38:29.477 "progress": { 00:38:29.477 "blocks": 4352, 00:38:29.477 "percent": 54 00:38:29.477 } 00:38:29.477 }, 00:38:29.477 "base_bdevs_list": [ 00:38:29.477 { 00:38:29.477 "name": "spare", 00:38:29.477 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:29.477 "is_configured": true, 00:38:29.477 "data_offset": 256, 00:38:29.477 "data_size": 7936 00:38:29.477 }, 00:38:29.477 { 00:38:29.477 "name": "BaseBdev2", 00:38:29.477 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:29.477 "is_configured": true, 00:38:29.477 "data_offset": 256, 00:38:29.477 "data_size": 7936 00:38:29.477 } 00:38:29.477 ] 00:38:29.477 }' 00:38:29.477 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:29.477 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:29.477 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:29.477 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:29.477 12:40:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 
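Note the genuine script defect captured just above: bdev_raid.sh line 665 ran '[' = false ']' and bash reported "[: =: unary operator expected", because the variable on the left-hand side of the test expanded to an empty string inside unquoted single brackets. The run continues only because the errored test simply evaluates as false. The failure mode and the usual fixes, with a hypothetical variable name standing in for whatever line 665 actually expands:

    flag=                       # empty, as in the failing run
    [ $flag = false ]           # expands to: [ = false ]  -> unary operator expected
    [ "$flag" = false ]         # quoted: compares the empty string, no error
    [[ $flag = false ]]         # or [[ ]], which does not word-split its operands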
00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:30.409 12:40:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:30.668 12:40:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:30.668 "name": "raid_bdev1", 00:38:30.668 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:30.668 "strip_size_kb": 0, 00:38:30.668 "state": "online", 00:38:30.669 "raid_level": "raid1", 00:38:30.669 "superblock": true, 00:38:30.669 "num_base_bdevs": 2, 00:38:30.669 "num_base_bdevs_discovered": 2, 00:38:30.669 "num_base_bdevs_operational": 2, 00:38:30.669 "process": { 00:38:30.669 "type": "rebuild", 00:38:30.669 "target": "spare", 00:38:30.669 "progress": { 00:38:30.669 "blocks": 7680, 00:38:30.669 "percent": 96 00:38:30.669 } 00:38:30.669 }, 00:38:30.669 "base_bdevs_list": [ 00:38:30.669 { 00:38:30.669 "name": "spare", 00:38:30.669 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:30.669 "is_configured": true, 00:38:30.669 "data_offset": 256, 00:38:30.669 "data_size": 7936 00:38:30.669 }, 00:38:30.669 { 00:38:30.669 "name": "BaseBdev2", 00:38:30.669 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:30.669 "is_configured": true, 00:38:30.669 "data_offset": 256, 00:38:30.669 "data_size": 7936 00:38:30.669 } 00:38:30.669 ] 00:38:30.669 }' 00:38:30.669 12:40:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:30.669 12:40:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:30.669 12:40:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:30.669 [2024-06-07 12:40:54.284840] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:38:30.669 [2024-06-07 12:40:54.285129] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:38:30.669 [2024-06-07 12:40:54.285418] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:30.927 12:40:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:30.927 12:40:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:38:31.861 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:38:31.861 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:31.861 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:31.861 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:31.861 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:31.862 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:31.862 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:31.862 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:32.119 
12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:32.119 "name": "raid_bdev1", 00:38:32.119 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:32.119 "strip_size_kb": 0, 00:38:32.119 "state": "online", 00:38:32.119 "raid_level": "raid1", 00:38:32.119 "superblock": true, 00:38:32.119 "num_base_bdevs": 2, 00:38:32.119 "num_base_bdevs_discovered": 2, 00:38:32.119 "num_base_bdevs_operational": 2, 00:38:32.119 "base_bdevs_list": [ 00:38:32.119 { 00:38:32.119 "name": "spare", 00:38:32.120 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:32.120 "is_configured": true, 00:38:32.120 "data_offset": 256, 00:38:32.120 "data_size": 7936 00:38:32.120 }, 00:38:32.120 { 00:38:32.120 "name": "BaseBdev2", 00:38:32.120 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:32.120 "is_configured": true, 00:38:32.120 "data_offset": 256, 00:38:32.120 "data_size": 7936 00:38:32.120 } 00:38:32.120 ] 00:38:32.120 }' 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:32.120 12:40:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:32.685 "name": "raid_bdev1", 00:38:32.685 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:32.685 "strip_size_kb": 0, 00:38:32.685 "state": "online", 00:38:32.685 "raid_level": "raid1", 00:38:32.685 "superblock": true, 00:38:32.685 "num_base_bdevs": 2, 00:38:32.685 "num_base_bdevs_discovered": 2, 00:38:32.685 "num_base_bdevs_operational": 2, 00:38:32.685 "base_bdevs_list": [ 00:38:32.685 { 00:38:32.685 "name": "spare", 00:38:32.685 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:32.685 "is_configured": true, 00:38:32.685 "data_offset": 256, 00:38:32.685 "data_size": 7936 00:38:32.685 }, 00:38:32.685 { 00:38:32.685 "name": "BaseBdev2", 00:38:32.685 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:32.685 "is_configured": true, 00:38:32.685 "data_offset": 256, 00:38:32.685 "data_size": 7936 00:38:32.685 } 00:38:32.685 ] 00:38:32.685 }' 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:32.685 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:32.943 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:32.943 "name": "raid_bdev1", 00:38:32.943 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:32.943 "strip_size_kb": 0, 00:38:32.943 "state": "online", 00:38:32.943 "raid_level": "raid1", 00:38:32.943 "superblock": true, 00:38:32.943 "num_base_bdevs": 2, 00:38:32.943 "num_base_bdevs_discovered": 2, 00:38:32.943 "num_base_bdevs_operational": 2, 00:38:32.943 "base_bdevs_list": [ 00:38:32.943 { 00:38:32.943 "name": "spare", 00:38:32.943 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:32.943 "is_configured": true, 00:38:32.943 "data_offset": 256, 00:38:32.943 "data_size": 7936 00:38:32.943 }, 00:38:32.943 { 00:38:32.943 "name": "BaseBdev2", 00:38:32.943 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:32.943 "is_configured": true, 00:38:32.943 "data_offset": 256, 00:38:32.943 "data_size": 7936 00:38:32.943 } 00:38:32.943 ] 00:38:32.943 }' 00:38:32.943 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:32.943 12:40:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:33.510 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:38:33.769 [2024-06-07 12:40:57.318078] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:38:33.769 [2024-06-07 12:40:57.318429] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:38:33.769 [2024-06-07 12:40:57.318649] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:33.769 [2024-06-07 12:40:57.318831] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:38:33.769 [2024-06-07 12:40:57.318929] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:38:33.769 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:38:33.769 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:34.334 12:40:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:38:34.591 /dev/nbd0 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:34.591 1+0 records in 00:38:34.591 1+0 records out 00:38:34.591 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000485177 s, 8.4 MB/s 00:38:34.591 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s 
/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:38:34.592 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:38:34.592 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:38:34.592 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:34.592 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:38:34.592 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:34.592 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:34.592 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:38:34.849 /dev/nbd1 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:34.849 1+0 records in 00:38:34.849 1+0 records out 00:38:34.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00107047 s, 3.8 MB/s 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:38:34.849 12:40:58 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:34.849 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:35.415 12:40:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:38:35.673 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:38:35.931 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:36.189 [2024-06-07 12:40:59.599687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:36.189 [2024-06-07 12:40:59.600373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:36.189 [2024-06-07 12:40:59.600682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:38:36.189 [2024-06-07 12:40:59.600919] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:36.189 [2024-06-07 12:40:59.603856] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:36.189 [2024-06-07 12:40:59.604155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:36.189 [2024-06-07 12:40:59.604485] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:38:36.189 [2024-06-07 12:40:59.604700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:36.189 [2024-06-07 12:40:59.605055] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:38:36.189 spare 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:36.189 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:36.189 [2024-06-07 12:40:59.705309] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009380 00:38:36.189 [2024-06-07 12:40:59.705698] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:38:36.189 [2024-06-07 12:40:59.705955] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001bb4f0 00:38:36.189 [2024-06-07 12:40:59.706624] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009380 00:38:36.189 [2024-06-07 12:40:59.706769] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009380 00:38:36.189 [2024-06-07 12:40:59.707054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:36.447 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:36.447 "name": "raid_bdev1", 00:38:36.447 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:36.447 "strip_size_kb": 0, 00:38:36.447 "state": "online", 00:38:36.447 "raid_level": "raid1", 00:38:36.447 "superblock": true, 00:38:36.447 "num_base_bdevs": 2, 00:38:36.447 "num_base_bdevs_discovered": 2, 00:38:36.447 "num_base_bdevs_operational": 2, 00:38:36.447 "base_bdevs_list": [ 00:38:36.447 { 00:38:36.447 "name": "spare", 00:38:36.447 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:36.447 "is_configured": true, 00:38:36.447 "data_offset": 256, 00:38:36.447 "data_size": 7936 00:38:36.447 }, 00:38:36.447 { 
00:38:36.447 "name": "BaseBdev2", 00:38:36.447 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:36.447 "is_configured": true, 00:38:36.447 "data_offset": 256, 00:38:36.447 "data_size": 7936 00:38:36.447 } 00:38:36.447 ] 00:38:36.447 }' 00:38:36.447 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:36.447 12:40:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:37.014 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:37.014 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:37.014 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:37.014 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:37.014 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:37.014 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:37.014 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:37.579 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:37.579 "name": "raid_bdev1", 00:38:37.579 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:37.579 "strip_size_kb": 0, 00:38:37.579 "state": "online", 00:38:37.579 "raid_level": "raid1", 00:38:37.579 "superblock": true, 00:38:37.579 "num_base_bdevs": 2, 00:38:37.579 "num_base_bdevs_discovered": 2, 00:38:37.579 "num_base_bdevs_operational": 2, 00:38:37.579 "base_bdevs_list": [ 00:38:37.579 { 00:38:37.579 "name": "spare", 00:38:37.579 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:37.579 "is_configured": true, 00:38:37.579 "data_offset": 256, 00:38:37.579 "data_size": 7936 00:38:37.579 }, 00:38:37.579 { 00:38:37.579 "name": "BaseBdev2", 00:38:37.579 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:37.579 "is_configured": true, 00:38:37.579 "data_offset": 256, 00:38:37.579 "data_size": 7936 00:38:37.579 } 00:38:37.579 ] 00:38:37.579 }' 00:38:37.579 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:37.579 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:37.579 12:41:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:37.579 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:37.579 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:37.579 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:38:37.837 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:38:37.837 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:38:38.095 [2024-06-07 12:41:01.673004] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:38.095 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:38.353 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:38.353 "name": "raid_bdev1", 00:38:38.353 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:38.353 "strip_size_kb": 0, 00:38:38.354 "state": "online", 00:38:38.354 "raid_level": "raid1", 00:38:38.354 "superblock": true, 00:38:38.354 "num_base_bdevs": 2, 00:38:38.354 "num_base_bdevs_discovered": 1, 00:38:38.354 "num_base_bdevs_operational": 1, 00:38:38.354 "base_bdevs_list": [ 00:38:38.354 { 00:38:38.354 "name": null, 00:38:38.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:38.354 "is_configured": false, 00:38:38.354 "data_offset": 256, 00:38:38.354 "data_size": 7936 00:38:38.354 }, 00:38:38.354 { 00:38:38.354 "name": "BaseBdev2", 00:38:38.354 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:38.354 "is_configured": true, 00:38:38.354 "data_offset": 256, 00:38:38.354 "data_size": 7936 00:38:38.354 } 00:38:38.354 ] 00:38:38.354 }' 00:38:38.354 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:38.354 12:41:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:38.918 12:41:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:38:39.235 [2024-06-07 12:41:02.817156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:39.235 [2024-06-07 12:41:02.817606] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:38:39.235 [2024-06-07 12:41:02.817750] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
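The re-add path traced above amounts to one RPC plus a poll: add the base bdev back over the test socket, then query bdev_raid_get_bdevs until the rebuild process appears. A minimal standalone sketch of that sequence, reusing the rpc.py path, socket and bdev names from this run (the polling loop is an illustrative addition, not part of bdev_raid.sh):

# Sketch: re-add the delayed passthru bdev and wait for the rebuild
# to start. Paths, socket and names are copied from the trace above;
# the retry loop is a hypothetical convenience wrapper.
RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

$RPC bdev_raid_add_base_bdev raid_bdev1 spare

# Poll up to ~20s until raid_bdev1 reports a running rebuild process.
for _ in $(seq 1 20); do
    ptype=$($RPC bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')
    [ "$ptype" = "rebuild" ] && break
    sleep 1
done
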
00:38:39.235 [2024-06-07 12:41:02.818425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:39.235 [2024-06-07 12:41:02.826572] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001bb690 00:38:39.235 [2024-06-07 12:41:02.829209] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:39.235 12:41:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:38:40.610 12:41:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:40.610 12:41:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:40.610 12:41:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:40.610 12:41:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:40.610 12:41:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:40.610 12:41:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:40.610 12:41:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:40.610 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:40.610 "name": "raid_bdev1", 00:38:40.610 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:40.610 "strip_size_kb": 0, 00:38:40.610 "state": "online", 00:38:40.610 "raid_level": "raid1", 00:38:40.610 "superblock": true, 00:38:40.610 "num_base_bdevs": 2, 00:38:40.610 "num_base_bdevs_discovered": 2, 00:38:40.610 "num_base_bdevs_operational": 2, 00:38:40.610 "process": { 00:38:40.610 "type": "rebuild", 00:38:40.610 "target": "spare", 00:38:40.610 "progress": { 00:38:40.610 "blocks": 3328, 00:38:40.610 "percent": 41 00:38:40.610 } 00:38:40.610 }, 00:38:40.610 "base_bdevs_list": [ 00:38:40.610 { 00:38:40.610 "name": "spare", 00:38:40.610 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:40.610 "is_configured": true, 00:38:40.610 "data_offset": 256, 00:38:40.610 "data_size": 7936 00:38:40.610 }, 00:38:40.610 { 00:38:40.610 "name": "BaseBdev2", 00:38:40.610 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:40.610 "is_configured": true, 00:38:40.610 "data_offset": 256, 00:38:40.610 "data_size": 7936 00:38:40.610 } 00:38:40.610 ] 00:38:40.610 }' 00:38:40.610 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:40.868 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:38:40.868 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:40.868 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:40.868 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:38:41.126 [2024-06-07 12:41:04.551064] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:41.126 [2024-06-07 12:41:04.642991] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:38:41.126 [2024-06-07 12:41:04.643898] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:38:41.126 [2024-06-07 12:41:04.644072] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:41.126 [2024-06-07 12:41:04.644186] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:41.126 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:41.383 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:41.383 "name": "raid_bdev1", 00:38:41.383 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:41.383 "strip_size_kb": 0, 00:38:41.383 "state": "online", 00:38:41.383 "raid_level": "raid1", 00:38:41.383 "superblock": true, 00:38:41.383 "num_base_bdevs": 2, 00:38:41.383 "num_base_bdevs_discovered": 1, 00:38:41.383 "num_base_bdevs_operational": 1, 00:38:41.383 "base_bdevs_list": [ 00:38:41.383 { 00:38:41.383 "name": null, 00:38:41.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:41.383 "is_configured": false, 00:38:41.383 "data_offset": 256, 00:38:41.383 "data_size": 7936 00:38:41.383 }, 00:38:41.383 { 00:38:41.383 "name": "BaseBdev2", 00:38:41.383 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:41.383 "is_configured": true, 00:38:41.384 "data_offset": 256, 00:38:41.384 "data_size": 7936 00:38:41.384 } 00:38:41.384 ] 00:38:41.384 }' 00:38:41.384 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:41.384 12:41:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:41.949 12:41:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:38:42.207 [2024-06-07 12:41:05.756970] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:38:42.207 [2024-06-07 12:41:05.757690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:42.207 [2024-06-07 12:41:05.757982] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009980 00:38:42.207 [2024-06-07 12:41:05.758208] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:42.207 [2024-06-07 12:41:05.758886] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:42.207 [2024-06-07 12:41:05.759139] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:38:42.208 [2024-06-07 12:41:05.759467] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:38:42.208 [2024-06-07 12:41:05.759617] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:38:42.208 [2024-06-07 12:41:05.759724] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:38:42.208 [2024-06-07 12:41:05.759911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:38:42.208 [2024-06-07 12:41:05.767969] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001bb9d0 00:38:42.208 spare 00:38:42.208 [2024-06-07 12:41:05.770583] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:38:42.208 12:41:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:38:43.583 12:41:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:38:43.583 12:41:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:43.583 12:41:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:38:43.583 12:41:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:38:43.583 12:41:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:43.583 12:41:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:43.583 12:41:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:43.583 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:43.583 "name": "raid_bdev1", 00:38:43.583 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:43.583 "strip_size_kb": 0, 00:38:43.583 "state": "online", 00:38:43.583 "raid_level": "raid1", 00:38:43.583 "superblock": true, 00:38:43.583 "num_base_bdevs": 2, 00:38:43.583 "num_base_bdevs_discovered": 2, 00:38:43.583 "num_base_bdevs_operational": 2, 00:38:43.583 "process": { 00:38:43.583 "type": "rebuild", 00:38:43.583 "target": "spare", 00:38:43.583 "progress": { 00:38:43.583 "blocks": 3072, 00:38:43.583 "percent": 38 00:38:43.583 } 00:38:43.583 }, 00:38:43.583 "base_bdevs_list": [ 00:38:43.583 { 00:38:43.583 "name": "spare", 00:38:43.583 "uuid": "9fdab072-c5db-542d-bb2c-95a6a3b5be14", 00:38:43.583 "is_configured": true, 00:38:43.583 "data_offset": 256, 00:38:43.583 "data_size": 7936 00:38:43.583 }, 00:38:43.583 { 00:38:43.583 "name": "BaseBdev2", 00:38:43.583 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:43.583 "is_configured": true, 00:38:43.583 "data_offset": 256, 00:38:43.583 "data_size": 7936 00:38:43.583 } 00:38:43.583 ] 00:38:43.583 }' 00:38:43.583 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:43.583 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
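The verify_raid_bdev_process checks in the trace reduce to two jq probes over a single bdev_raid_get_bdevs dump: the rebuild process type and its target, each defaulting to "none". A compact sketch of that check under the same socket and paths (the shell wrapper is illustrative, not the literal bdev_raid.sh source):

# Sketch: compare .process.type and .process.target for a raid bdev
# against expected values, defaulting both to "none" as the trace does.
verify_process() {
    local name=$1 want_type=$2 want_target=$3
    local info
    info=$(/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_get_bdevs all | jq ".[] | select(.name == \"$name\")")
    [ "$(echo "$info" | jq -r '.process.type // "none"')" = "$want_type" ] &&
    [ "$(echo "$info" | jq -r '.process.target // "none"')" = "$want_target" ]
}

verify_process raid_bdev1 rebuild spare   # mirrors the @189/@190 checks above
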
00:38:43.583 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:43.583 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:38:43.583 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:38:43.841 [2024-06-07 12:41:07.448257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:43.841 [2024-06-07 12:41:07.483518] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:38:43.841 [2024-06-07 12:41:07.484453] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:43.841 [2024-06-07 12:41:07.484628] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:38:43.841 [2024-06-07 12:41:07.484684] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:44.100 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:44.358 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:44.358 "name": "raid_bdev1", 00:38:44.358 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:44.358 "strip_size_kb": 0, 00:38:44.358 "state": "online", 00:38:44.358 "raid_level": "raid1", 00:38:44.358 "superblock": true, 00:38:44.358 "num_base_bdevs": 2, 00:38:44.358 "num_base_bdevs_discovered": 1, 00:38:44.358 "num_base_bdevs_operational": 1, 00:38:44.358 "base_bdevs_list": [ 00:38:44.358 { 00:38:44.358 "name": null, 00:38:44.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:44.358 "is_configured": false, 00:38:44.358 "data_offset": 256, 00:38:44.358 "data_size": 7936 00:38:44.358 }, 00:38:44.358 { 00:38:44.358 "name": "BaseBdev2", 00:38:44.358 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:44.358 "is_configured": true, 00:38:44.358 "data_offset": 256, 00:38:44.358 "data_size": 7936 00:38:44.358 } 00:38:44.358 ] 00:38:44.358 }' 00:38:44.358 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:38:44.358 12:41:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:44.925 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:44.925 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:44.925 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:44.925 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:44.925 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:44.925 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:44.925 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:45.184 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:45.184 "name": "raid_bdev1", 00:38:45.184 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:45.184 "strip_size_kb": 0, 00:38:45.184 "state": "online", 00:38:45.184 "raid_level": "raid1", 00:38:45.184 "superblock": true, 00:38:45.184 "num_base_bdevs": 2, 00:38:45.184 "num_base_bdevs_discovered": 1, 00:38:45.184 "num_base_bdevs_operational": 1, 00:38:45.184 "base_bdevs_list": [ 00:38:45.184 { 00:38:45.184 "name": null, 00:38:45.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:45.184 "is_configured": false, 00:38:45.184 "data_offset": 256, 00:38:45.184 "data_size": 7936 00:38:45.184 }, 00:38:45.184 { 00:38:45.184 "name": "BaseBdev2", 00:38:45.184 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:45.184 "is_configured": true, 00:38:45.184 "data_offset": 256, 00:38:45.184 "data_size": 7936 00:38:45.184 } 00:38:45.184 ] 00:38:45.184 }' 00:38:45.184 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:45.184 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:45.184 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:45.184 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:45.184 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:38:45.443 12:41:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:38:45.702 [2024-06-07 12:41:09.241559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:38:45.702 [2024-06-07 12:41:09.242385] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:45.702 [2024-06-07 12:41:09.242667] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009f80 00:38:45.702 [2024-06-07 12:41:09.242867] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:45.702 [2024-06-07 12:41:09.243418] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:45.702 [2024-06-07 12:41:09.243658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: 
BaseBdev1 00:38:45.702 [2024-06-07 12:41:09.243911] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:38:45.702 [2024-06-07 12:41:09.244025] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:38:45.702 [2024-06-07 12:41:09.244108] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:38:45.702 BaseBdev1 00:38:45.702 12:41:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:46.636 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:46.895 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:46.895 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:46.895 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:46.895 "name": "raid_bdev1", 00:38:46.895 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:46.895 "strip_size_kb": 0, 00:38:46.895 "state": "online", 00:38:46.895 "raid_level": "raid1", 00:38:46.895 "superblock": true, 00:38:46.895 "num_base_bdevs": 2, 00:38:46.895 "num_base_bdevs_discovered": 1, 00:38:46.895 "num_base_bdevs_operational": 1, 00:38:46.895 "base_bdevs_list": [ 00:38:46.895 { 00:38:46.895 "name": null, 00:38:46.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:46.895 "is_configured": false, 00:38:46.895 "data_offset": 256, 00:38:46.895 "data_size": 7936 00:38:46.895 }, 00:38:46.895 { 00:38:46.895 "name": "BaseBdev2", 00:38:46.895 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:46.895 "is_configured": true, 00:38:46.895 "data_offset": 256, 00:38:46.895 "data_size": 7936 00:38:46.895 } 00:38:46.895 ] 00:38:46.895 }' 00:38:46.895 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:46.895 12:41:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:47.828 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:47.828 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:47.828 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:38:47.828 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:47.828 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:47.828 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:47.828 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:48.086 "name": "raid_bdev1", 00:38:48.086 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:48.086 "strip_size_kb": 0, 00:38:48.086 "state": "online", 00:38:48.086 "raid_level": "raid1", 00:38:48.086 "superblock": true, 00:38:48.086 "num_base_bdevs": 2, 00:38:48.086 "num_base_bdevs_discovered": 1, 00:38:48.086 "num_base_bdevs_operational": 1, 00:38:48.086 "base_bdevs_list": [ 00:38:48.086 { 00:38:48.086 "name": null, 00:38:48.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:48.086 "is_configured": false, 00:38:48.086 "data_offset": 256, 00:38:48.086 "data_size": 7936 00:38:48.086 }, 00:38:48.086 { 00:38:48.086 "name": "BaseBdev2", 00:38:48.086 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:48.086 "is_configured": true, 00:38:48.086 "data_offset": 256, 00:38:48.086 "data_size": 7936 00:38:48.086 } 00:38:48.086 ] 00:38:48.086 }' 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@649 -- # local es=0 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # [[ -x 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:38:48.086 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:38:48.344 [2024-06-07 12:41:11.797522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:38:48.344 [2024-06-07 12:41:11.797996] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:38:48.344 [2024-06-07 12:41:11.798151] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:38:48.344 request: 00:38:48.344 { 00:38:48.344 "raid_bdev": "raid_bdev1", 00:38:48.344 "base_bdev": "BaseBdev1", 00:38:48.344 "method": "bdev_raid_add_base_bdev", 00:38:48.344 "req_id": 1 00:38:48.344 } 00:38:48.344 Got JSON-RPC error response 00:38:48.344 response: 00:38:48.344 { 00:38:48.344 "code": -22, 00:38:48.344 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:38:48.344 } 00:38:48.344 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # es=1 00:38:48.344 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:38:48.344 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:38:48.344 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:38:48.344 12:41:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:49.280 12:41:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:49.539 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:49.539 "name": "raid_bdev1", 00:38:49.539 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:49.539 "strip_size_kb": 0, 00:38:49.539 "state": "online", 00:38:49.539 "raid_level": "raid1", 00:38:49.539 "superblock": true, 00:38:49.539 "num_base_bdevs": 2, 00:38:49.539 "num_base_bdevs_discovered": 1, 00:38:49.539 "num_base_bdevs_operational": 1, 00:38:49.539 
"base_bdevs_list": [ 00:38:49.539 { 00:38:49.539 "name": null, 00:38:49.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:49.539 "is_configured": false, 00:38:49.539 "data_offset": 256, 00:38:49.539 "data_size": 7936 00:38:49.539 }, 00:38:49.539 { 00:38:49.539 "name": "BaseBdev2", 00:38:49.539 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:49.539 "is_configured": true, 00:38:49.539 "data_offset": 256, 00:38:49.539 "data_size": 7936 00:38:49.539 } 00:38:49.539 ] 00:38:49.539 }' 00:38:49.539 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:49.539 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:50.104 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:38:50.104 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:38:50.104 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:38:50.104 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:38:50.104 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:38:50.104 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:38:50.104 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:50.361 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:38:50.361 "name": "raid_bdev1", 00:38:50.361 "uuid": "39b13362-9033-45c1-a4b3-1d7f8af18097", 00:38:50.361 "strip_size_kb": 0, 00:38:50.361 "state": "online", 00:38:50.361 "raid_level": "raid1", 00:38:50.361 "superblock": true, 00:38:50.361 "num_base_bdevs": 2, 00:38:50.361 "num_base_bdevs_discovered": 1, 00:38:50.361 "num_base_bdevs_operational": 1, 00:38:50.361 "base_bdevs_list": [ 00:38:50.361 { 00:38:50.361 "name": null, 00:38:50.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:50.361 "is_configured": false, 00:38:50.362 "data_offset": 256, 00:38:50.362 "data_size": 7936 00:38:50.362 }, 00:38:50.362 { 00:38:50.362 "name": "BaseBdev2", 00:38:50.362 "uuid": "bffa9bd1-dde2-594d-b60d-f04270ba4ee9", 00:38:50.362 "is_configured": true, 00:38:50.362 "data_offset": 256, 00:38:50.362 "data_size": 7936 00:38:50.362 } 00:38:50.362 ] 00:38:50.362 }' 00:38:50.362 12:41:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 227696 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 227696 ']' 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 227696 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 
00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 227696 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 227696' 00:38:50.619 killing process with pid 227696 00:38:50.619 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # kill 227696 00:38:50.619 Received shutdown signal, test time was about 60.000000 seconds 00:38:50.619 00:38:50.620 Latency(us) 00:38:50.620 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:50.620 =================================================================================================================== 00:38:50.620 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:38:50.620 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@973 -- # wait 227696 00:38:50.620 [2024-06-07 12:41:14.097173] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:38:50.620 [2024-06-07 12:41:14.097444] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:38:50.620 [2024-06-07 12:41:14.097589] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:38:50.620 [2024-06-07 12:41:14.097684] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009380 name raid_bdev1, state offline 00:38:50.620 [2024-06-07 12:41:14.162531] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:38:51.186 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:38:51.186 00:38:51.186 real 0m33.187s 00:38:51.186 user 0m52.340s 00:38:51.186 sys 0m5.298s 00:38:51.186 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:38:51.186 12:41:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:38:51.186 ************************************ 00:38:51.186 END TEST raid_rebuild_test_sb_4k 00:38:51.186 ************************************ 00:38:51.186 12:41:14 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:38:51.186 12:41:14 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:38:51.186 12:41:14 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:38:51.186 12:41:14 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:38:51.186 12:41:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:38:51.186 ************************************ 00:38:51.186 START TEST raid_state_function_test_sb_md_separate 00:38:51.186 ************************************ 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:38:51.186 12:41:14 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=228559 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 228559' 00:38:51.186 Process raid pid: 228559 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 228559 /var/tmp/spdk-raid.sock 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 228559 ']' 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:38:51.186 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:38:51.186 12:41:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:38:51.186 [2024-06-07 12:41:14.683132] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:38:51.186 [2024-06-07 12:41:14.683700] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:38:51.444 [2024-06-07 12:41:14.831445] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:51.444 [2024-06-07 12:41:14.927394] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:38:51.444 [2024-06-07 12:41:15.012719] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:38:52.376 12:41:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:38:52.376 12:41:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:38:52.376 12:41:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:38:52.376 [2024-06-07 12:41:15.979108] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:38:52.376 [2024-06-07 12:41:15.979993] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:38:52.376 [2024-06-07 12:41:15.980179] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:38:52.376 [2024-06-07 12:41:15.980461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:52.376 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:38:52.635 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:52.635 "name": "Existed_Raid", 00:38:52.635 "uuid": "3433d12a-b954-4b66-9256-9ed9c3b5af85", 00:38:52.635 "strip_size_kb": 0, 00:38:52.635 "state": "configuring", 00:38:52.635 "raid_level": "raid1", 00:38:52.635 "superblock": true, 00:38:52.635 "num_base_bdevs": 2, 00:38:52.635 "num_base_bdevs_discovered": 0, 00:38:52.635 "num_base_bdevs_operational": 2, 00:38:52.635 "base_bdevs_list": [ 00:38:52.635 { 00:38:52.635 "name": "BaseBdev1", 00:38:52.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:52.635 "is_configured": false, 00:38:52.635 "data_offset": 0, 00:38:52.635 "data_size": 0 00:38:52.635 }, 00:38:52.635 { 00:38:52.635 "name": "BaseBdev2", 00:38:52.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:52.635 "is_configured": false, 00:38:52.635 "data_offset": 0, 00:38:52.635 "data_size": 0 00:38:52.635 } 00:38:52.635 ] 00:38:52.635 }' 00:38:52.635 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:52.635 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:38:53.200 12:41:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:38:53.457 [2024-06-07 12:41:17.059124] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:38:53.457 [2024-06-07 12:41:17.059465] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:38:53.457 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:38:53.715 [2024-06-07 12:41:17.331211] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:38:53.715 [2024-06-07 12:41:17.331654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:38:53.715 [2024-06-07 12:41:17.331804] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:38:53.715 [2024-06-07 12:41:17.331960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:38:53.715 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:38:53.973 [2024-06-07 12:41:17.593682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:38:53.973 BaseBdev1 00:38:53.973 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:38:53.973 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:38:53.973 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:38:53.973 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:38:53.973 
12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:38:53.973 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:38:53.973 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:38:54.231 12:41:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:38:54.489 [ 00:38:54.489 { 00:38:54.489 "name": "BaseBdev1", 00:38:54.489 "aliases": [ 00:38:54.489 "0abb9f56-65ee-4964-88e8-54c3fc7febfb" 00:38:54.489 ], 00:38:54.489 "product_name": "Malloc disk", 00:38:54.489 "block_size": 4096, 00:38:54.489 "num_blocks": 8192, 00:38:54.489 "uuid": "0abb9f56-65ee-4964-88e8-54c3fc7febfb", 00:38:54.489 "md_size": 32, 00:38:54.489 "md_interleave": false, 00:38:54.489 "dif_type": 0, 00:38:54.489 "assigned_rate_limits": { 00:38:54.489 "rw_ios_per_sec": 0, 00:38:54.489 "rw_mbytes_per_sec": 0, 00:38:54.489 "r_mbytes_per_sec": 0, 00:38:54.489 "w_mbytes_per_sec": 0 00:38:54.489 }, 00:38:54.489 "claimed": true, 00:38:54.489 "claim_type": "exclusive_write", 00:38:54.489 "zoned": false, 00:38:54.489 "supported_io_types": { 00:38:54.489 "read": true, 00:38:54.489 "write": true, 00:38:54.489 "unmap": true, 00:38:54.489 "write_zeroes": true, 00:38:54.489 "flush": true, 00:38:54.489 "reset": true, 00:38:54.489 "compare": false, 00:38:54.489 "compare_and_write": false, 00:38:54.489 "abort": true, 00:38:54.489 "nvme_admin": false, 00:38:54.489 "nvme_io": false 00:38:54.489 }, 00:38:54.489 "memory_domains": [ 00:38:54.489 { 00:38:54.489 "dma_device_id": "system", 00:38:54.489 "dma_device_type": 1 00:38:54.489 }, 00:38:54.489 { 00:38:54.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:54.489 "dma_device_type": 2 00:38:54.489 } 00:38:54.489 ], 00:38:54.489 "driver_specific": {} 00:38:54.489 } 00:38:54.489 ] 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:54.489 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:54.746 
12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:54.746 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:38:55.002 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:55.002 "name": "Existed_Raid", 00:38:55.002 "uuid": "8761d01a-710b-4fab-9d18-dfe96245b75f", 00:38:55.002 "strip_size_kb": 0, 00:38:55.002 "state": "configuring", 00:38:55.002 "raid_level": "raid1", 00:38:55.002 "superblock": true, 00:38:55.002 "num_base_bdevs": 2, 00:38:55.002 "num_base_bdevs_discovered": 1, 00:38:55.002 "num_base_bdevs_operational": 2, 00:38:55.002 "base_bdevs_list": [ 00:38:55.002 { 00:38:55.002 "name": "BaseBdev1", 00:38:55.002 "uuid": "0abb9f56-65ee-4964-88e8-54c3fc7febfb", 00:38:55.002 "is_configured": true, 00:38:55.002 "data_offset": 256, 00:38:55.002 "data_size": 7936 00:38:55.002 }, 00:38:55.002 { 00:38:55.002 "name": "BaseBdev2", 00:38:55.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:55.002 "is_configured": false, 00:38:55.002 "data_offset": 0, 00:38:55.002 "data_size": 0 00:38:55.002 } 00:38:55.002 ] 00:38:55.002 }' 00:38:55.002 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:55.002 12:41:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:38:55.564 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:38:55.822 [2024-06-07 12:41:19.354023] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:38:55.822 [2024-06-07 12:41:19.354387] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:38:55.822 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:38:56.085 [2024-06-07 12:41:19.610122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:38:56.085 [2024-06-07 12:41:19.612606] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:38:56.085 [2024-06-07 12:41:19.612913] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:38:56.085 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:56.650 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:56.650 "name": "Existed_Raid", 00:38:56.650 "uuid": "41f70e50-352f-4e9f-bd60-94e72d999bec", 00:38:56.650 "strip_size_kb": 0, 00:38:56.650 "state": "configuring", 00:38:56.650 "raid_level": "raid1", 00:38:56.650 "superblock": true, 00:38:56.650 "num_base_bdevs": 2, 00:38:56.650 "num_base_bdevs_discovered": 1, 00:38:56.650 "num_base_bdevs_operational": 2, 00:38:56.650 "base_bdevs_list": [ 00:38:56.650 { 00:38:56.650 "name": "BaseBdev1", 00:38:56.650 "uuid": "0abb9f56-65ee-4964-88e8-54c3fc7febfb", 00:38:56.650 "is_configured": true, 00:38:56.650 "data_offset": 256, 00:38:56.650 "data_size": 7936 00:38:56.650 }, 00:38:56.650 { 00:38:56.650 "name": "BaseBdev2", 00:38:56.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:38:56.650 "is_configured": false, 00:38:56.650 "data_offset": 0, 00:38:56.650 "data_size": 0 00:38:56.650 } 00:38:56.650 ] 00:38:56.650 }' 00:38:56.650 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:56.650 12:41:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:38:57.216 12:41:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:38:57.473 [2024-06-07 12:41:21.046882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:38:57.473 [2024-06-07 12:41:21.047408] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:38:57.473 [2024-06-07 12:41:21.047533] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:38:57.473 [2024-06-07 12:41:21.047976] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:38:57.473 [2024-06-07 12:41:21.048246] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:38:57.473 [2024-06-07 12:41:21.048364] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:38:57.473 [2024-06-07 12:41:21.048554] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:38:57.473 BaseBdev2 00:38:57.473 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:38:57.473 12:41:21 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:38:57.473 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:38:57.473 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:38:57.473 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:38:57.473 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:38:57.473 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:38:58.036 [ 00:38:58.036 { 00:38:58.036 "name": "BaseBdev2", 00:38:58.036 "aliases": [ 00:38:58.036 "68b71597-610b-44f8-911e-f0895ad059ae" 00:38:58.036 ], 00:38:58.036 "product_name": "Malloc disk", 00:38:58.036 "block_size": 4096, 00:38:58.036 "num_blocks": 8192, 00:38:58.036 "uuid": "68b71597-610b-44f8-911e-f0895ad059ae", 00:38:58.036 "md_size": 32, 00:38:58.036 "md_interleave": false, 00:38:58.036 "dif_type": 0, 00:38:58.036 "assigned_rate_limits": { 00:38:58.036 "rw_ios_per_sec": 0, 00:38:58.036 "rw_mbytes_per_sec": 0, 00:38:58.036 "r_mbytes_per_sec": 0, 00:38:58.036 "w_mbytes_per_sec": 0 00:38:58.036 }, 00:38:58.036 "claimed": true, 00:38:58.036 "claim_type": "exclusive_write", 00:38:58.036 "zoned": false, 00:38:58.036 "supported_io_types": { 00:38:58.036 "read": true, 00:38:58.036 "write": true, 00:38:58.036 "unmap": true, 00:38:58.036 "write_zeroes": true, 00:38:58.036 "flush": true, 00:38:58.036 "reset": true, 00:38:58.036 "compare": false, 00:38:58.036 "compare_and_write": false, 00:38:58.036 "abort": true, 00:38:58.036 "nvme_admin": false, 00:38:58.036 "nvme_io": false 00:38:58.036 }, 00:38:58.036 "memory_domains": [ 00:38:58.036 { 00:38:58.036 "dma_device_id": "system", 00:38:58.036 "dma_device_type": 1 00:38:58.036 }, 00:38:58.036 { 00:38:58.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:58.036 "dma_device_type": 2 00:38:58.036 } 00:38:58.036 ], 00:38:58.036 "driver_specific": {} 00:38:58.036 } 00:38:58.036 ] 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:38:58.036 12:41:21 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:38:58.036 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:38:58.292 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:38:58.292 "name": "Existed_Raid", 00:38:58.292 "uuid": "41f70e50-352f-4e9f-bd60-94e72d999bec", 00:38:58.292 "strip_size_kb": 0, 00:38:58.292 "state": "online", 00:38:58.292 "raid_level": "raid1", 00:38:58.292 "superblock": true, 00:38:58.292 "num_base_bdevs": 2, 00:38:58.292 "num_base_bdevs_discovered": 2, 00:38:58.292 "num_base_bdevs_operational": 2, 00:38:58.292 "base_bdevs_list": [ 00:38:58.292 { 00:38:58.292 "name": "BaseBdev1", 00:38:58.292 "uuid": "0abb9f56-65ee-4964-88e8-54c3fc7febfb", 00:38:58.292 "is_configured": true, 00:38:58.292 "data_offset": 256, 00:38:58.292 "data_size": 7936 00:38:58.292 }, 00:38:58.292 { 00:38:58.292 "name": "BaseBdev2", 00:38:58.292 "uuid": "68b71597-610b-44f8-911e-f0895ad059ae", 00:38:58.292 "is_configured": true, 00:38:58.292 "data_offset": 256, 00:38:58.292 "data_size": 7936 00:38:58.292 } 00:38:58.292 ] 00:38:58.292 }' 00:38:58.292 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:38:58.292 12:41:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:38:59.221 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:38:59.221 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:38:59.222 [2024-06-07 12:41:22.739489] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:38:59.222 "name": "Existed_Raid", 00:38:59.222 
"aliases": [ 00:38:59.222 "41f70e50-352f-4e9f-bd60-94e72d999bec" 00:38:59.222 ], 00:38:59.222 "product_name": "Raid Volume", 00:38:59.222 "block_size": 4096, 00:38:59.222 "num_blocks": 7936, 00:38:59.222 "uuid": "41f70e50-352f-4e9f-bd60-94e72d999bec", 00:38:59.222 "md_size": 32, 00:38:59.222 "md_interleave": false, 00:38:59.222 "dif_type": 0, 00:38:59.222 "assigned_rate_limits": { 00:38:59.222 "rw_ios_per_sec": 0, 00:38:59.222 "rw_mbytes_per_sec": 0, 00:38:59.222 "r_mbytes_per_sec": 0, 00:38:59.222 "w_mbytes_per_sec": 0 00:38:59.222 }, 00:38:59.222 "claimed": false, 00:38:59.222 "zoned": false, 00:38:59.222 "supported_io_types": { 00:38:59.222 "read": true, 00:38:59.222 "write": true, 00:38:59.222 "unmap": false, 00:38:59.222 "write_zeroes": true, 00:38:59.222 "flush": false, 00:38:59.222 "reset": true, 00:38:59.222 "compare": false, 00:38:59.222 "compare_and_write": false, 00:38:59.222 "abort": false, 00:38:59.222 "nvme_admin": false, 00:38:59.222 "nvme_io": false 00:38:59.222 }, 00:38:59.222 "memory_domains": [ 00:38:59.222 { 00:38:59.222 "dma_device_id": "system", 00:38:59.222 "dma_device_type": 1 00:38:59.222 }, 00:38:59.222 { 00:38:59.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:59.222 "dma_device_type": 2 00:38:59.222 }, 00:38:59.222 { 00:38:59.222 "dma_device_id": "system", 00:38:59.222 "dma_device_type": 1 00:38:59.222 }, 00:38:59.222 { 00:38:59.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:59.222 "dma_device_type": 2 00:38:59.222 } 00:38:59.222 ], 00:38:59.222 "driver_specific": { 00:38:59.222 "raid": { 00:38:59.222 "uuid": "41f70e50-352f-4e9f-bd60-94e72d999bec", 00:38:59.222 "strip_size_kb": 0, 00:38:59.222 "state": "online", 00:38:59.222 "raid_level": "raid1", 00:38:59.222 "superblock": true, 00:38:59.222 "num_base_bdevs": 2, 00:38:59.222 "num_base_bdevs_discovered": 2, 00:38:59.222 "num_base_bdevs_operational": 2, 00:38:59.222 "base_bdevs_list": [ 00:38:59.222 { 00:38:59.222 "name": "BaseBdev1", 00:38:59.222 "uuid": "0abb9f56-65ee-4964-88e8-54c3fc7febfb", 00:38:59.222 "is_configured": true, 00:38:59.222 "data_offset": 256, 00:38:59.222 "data_size": 7936 00:38:59.222 }, 00:38:59.222 { 00:38:59.222 "name": "BaseBdev2", 00:38:59.222 "uuid": "68b71597-610b-44f8-911e-f0895ad059ae", 00:38:59.222 "is_configured": true, 00:38:59.222 "data_offset": 256, 00:38:59.222 "data_size": 7936 00:38:59.222 } 00:38:59.222 ] 00:38:59.222 } 00:38:59.222 } 00:38:59.222 }' 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:38:59.222 BaseBdev2' 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:38:59.222 12:41:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:38:59.787 "name": "BaseBdev1", 00:38:59.787 "aliases": [ 00:38:59.787 "0abb9f56-65ee-4964-88e8-54c3fc7febfb" 00:38:59.787 ], 00:38:59.787 "product_name": "Malloc disk", 00:38:59.787 "block_size": 4096, 00:38:59.787 "num_blocks": 
8192, 00:38:59.787 "uuid": "0abb9f56-65ee-4964-88e8-54c3fc7febfb", 00:38:59.787 "md_size": 32, 00:38:59.787 "md_interleave": false, 00:38:59.787 "dif_type": 0, 00:38:59.787 "assigned_rate_limits": { 00:38:59.787 "rw_ios_per_sec": 0, 00:38:59.787 "rw_mbytes_per_sec": 0, 00:38:59.787 "r_mbytes_per_sec": 0, 00:38:59.787 "w_mbytes_per_sec": 0 00:38:59.787 }, 00:38:59.787 "claimed": true, 00:38:59.787 "claim_type": "exclusive_write", 00:38:59.787 "zoned": false, 00:38:59.787 "supported_io_types": { 00:38:59.787 "read": true, 00:38:59.787 "write": true, 00:38:59.787 "unmap": true, 00:38:59.787 "write_zeroes": true, 00:38:59.787 "flush": true, 00:38:59.787 "reset": true, 00:38:59.787 "compare": false, 00:38:59.787 "compare_and_write": false, 00:38:59.787 "abort": true, 00:38:59.787 "nvme_admin": false, 00:38:59.787 "nvme_io": false 00:38:59.787 }, 00:38:59.787 "memory_domains": [ 00:38:59.787 { 00:38:59.787 "dma_device_id": "system", 00:38:59.787 "dma_device_type": 1 00:38:59.787 }, 00:38:59.787 { 00:38:59.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:59.787 "dma_device_type": 2 00:38:59.787 } 00:38:59.787 ], 00:38:59.787 "driver_specific": {} 00:38:59.787 }' 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:38:59.787 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:00.045 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:39:00.045 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:00.045 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:00.045 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:39:00.045 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:00.045 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:39:00.045 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:39:00.303 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:39:00.303 "name": "BaseBdev2", 00:39:00.303 "aliases": [ 00:39:00.303 "68b71597-610b-44f8-911e-f0895ad059ae" 00:39:00.303 ], 00:39:00.303 "product_name": "Malloc disk", 00:39:00.303 "block_size": 4096, 00:39:00.303 "num_blocks": 8192, 00:39:00.303 "uuid": "68b71597-610b-44f8-911e-f0895ad059ae", 00:39:00.303 "md_size": 32, 00:39:00.303 "md_interleave": false, 00:39:00.303 "dif_type": 0, 00:39:00.303 
"assigned_rate_limits": { 00:39:00.303 "rw_ios_per_sec": 0, 00:39:00.303 "rw_mbytes_per_sec": 0, 00:39:00.303 "r_mbytes_per_sec": 0, 00:39:00.303 "w_mbytes_per_sec": 0 00:39:00.303 }, 00:39:00.303 "claimed": true, 00:39:00.303 "claim_type": "exclusive_write", 00:39:00.303 "zoned": false, 00:39:00.303 "supported_io_types": { 00:39:00.303 "read": true, 00:39:00.303 "write": true, 00:39:00.303 "unmap": true, 00:39:00.303 "write_zeroes": true, 00:39:00.303 "flush": true, 00:39:00.303 "reset": true, 00:39:00.303 "compare": false, 00:39:00.303 "compare_and_write": false, 00:39:00.303 "abort": true, 00:39:00.303 "nvme_admin": false, 00:39:00.303 "nvme_io": false 00:39:00.303 }, 00:39:00.303 "memory_domains": [ 00:39:00.303 { 00:39:00.303 "dma_device_id": "system", 00:39:00.303 "dma_device_type": 1 00:39:00.303 }, 00:39:00.303 { 00:39:00.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:00.303 "dma_device_type": 2 00:39:00.303 } 00:39:00.303 ], 00:39:00.303 "driver_specific": {} 00:39:00.303 }' 00:39:00.303 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:00.303 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:00.303 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:39:00.303 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:00.303 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:00.561 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:39:00.561 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:00.561 12:41:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:00.561 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:39:00.561 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:00.561 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:00.561 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:39:00.561 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:39:00.818 [2024-06-07 12:41:24.415627] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:01.076 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:01.335 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:01.335 "name": "Existed_Raid", 00:39:01.335 "uuid": "41f70e50-352f-4e9f-bd60-94e72d999bec", 00:39:01.335 "strip_size_kb": 0, 00:39:01.335 "state": "online", 00:39:01.335 "raid_level": "raid1", 00:39:01.335 "superblock": true, 00:39:01.335 "num_base_bdevs": 2, 00:39:01.335 "num_base_bdevs_discovered": 1, 00:39:01.335 "num_base_bdevs_operational": 1, 00:39:01.335 "base_bdevs_list": [ 00:39:01.335 { 00:39:01.335 "name": null, 00:39:01.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:01.335 "is_configured": false, 00:39:01.335 "data_offset": 256, 00:39:01.335 "data_size": 7936 00:39:01.335 }, 00:39:01.335 { 00:39:01.335 "name": "BaseBdev2", 00:39:01.335 "uuid": "68b71597-610b-44f8-911e-f0895ad059ae", 00:39:01.335 "is_configured": true, 00:39:01.335 "data_offset": 256, 00:39:01.335 "data_size": 7936 00:39:01.335 } 00:39:01.335 ] 00:39:01.335 }' 00:39:01.335 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:01.335 12:41:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:01.899 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:39:01.899 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:39:01.899 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:01.899 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:39:02.156 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:39:02.156 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:39:02.156 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 
-- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:39:02.414 [2024-06-07 12:41:25.928057] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:39:02.414 [2024-06-07 12:41:25.928536] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:39:02.414 [2024-06-07 12:41:25.953993] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:02.414 [2024-06-07 12:41:25.954372] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:02.414 [2024-06-07 12:41:25.954510] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:39:02.414 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:39:02.414 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:39:02.414 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:39:02.414 12:41:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 228559 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 228559 ']' 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 228559 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 228559 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:39:02.980 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 228559' 00:39:02.980 killing process with pid 228559 00:39:02.981 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 228559 00:39:02.981 [2024-06-07 12:41:26.395938] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:39:02.981 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 228559 00:39:02.981 [2024-06-07 12:41:26.396274] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:39:03.238 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:39:03.238 00:39:03.238 real 0m12.125s 00:39:03.238 user 0m21.547s 
00:39:03.238 sys 0m1.997s 00:39:03.238 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:39:03.238 12:41:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:03.238 ************************************ 00:39:03.238 END TEST raid_state_function_test_sb_md_separate 00:39:03.238 ************************************ 00:39:03.238 12:41:26 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:39:03.238 12:41:26 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:39:03.238 12:41:26 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:39:03.238 12:41:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:39:03.238 ************************************ 00:39:03.238 START TEST raid_superblock_test_md_separate 00:39:03.238 ************************************ 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=228934 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 228934 /var/tmp/spdk-raid.sock 00:39:03.238 12:41:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@830 -- # '[' -z 228934 ']' 00:39:03.239 12:41:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:39:03.239 12:41:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:39:03.239 12:41:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:39:03.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:39:03.239 12:41:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:39:03.239 12:41:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:03.239 [2024-06-07 12:41:26.878256] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:39:03.239 [2024-06-07 12:41:26.878853] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid228934 ] 00:39:03.495 [2024-06-07 12:41:27.023027] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:03.495 [2024-06-07 12:41:27.138448] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:39:03.753 [2024-06-07 12:41:27.221047] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@863 -- # return 0 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:39:03.753 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:39:04.022 malloc1 00:39:04.022 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:39:04.280 [2024-06-07 12:41:27.878919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:39:04.280 [2024-06-07 12:41:27.879462] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:04.280 [2024-06-07 12:41:27.879994] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:39:04.280 [2024-06-07 12:41:27.880401] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:04.280 [2024-06-07 12:41:27.884933] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:04.280 [2024-06-07 12:41:27.885414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:39:04.280 pt1 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:39:04.280 12:41:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:39:04.538 malloc2 00:39:04.538 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:39:04.796 [2024-06-07 12:41:28.431642] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:39:04.796 [2024-06-07 12:41:28.432015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:04.796 [2024-06-07 12:41:28.432113] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:39:04.796 [2024-06-07 12:41:28.432280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:04.796 [2024-06-07 12:41:28.434531] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:04.796 [2024-06-07 12:41:28.434694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:39:04.796 pt2 00:39:05.054 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:39:05.054 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:39:05.054 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:39:05.054 [2024-06-07 12:41:28.699783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:39:05.313 [2024-06-07 12:41:28.702072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:39:05.313 [2024-06-07 12:41:28.702381] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006c80 00:39:05.313 [2024-06-07 12:41:28.702493] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:39:05.313 [2024-06-07 12:41:28.702659] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x60d0000022c0 00:39:05.313 [2024-06-07 12:41:28.702892] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006c80 00:39:05.313 [2024-06-07 12:41:28.702990] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000006c80 00:39:05.313 [2024-06-07 12:41:28.703162] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:05.313 "name": "raid_bdev1", 00:39:05.313 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:05.313 "strip_size_kb": 0, 00:39:05.313 "state": "online", 00:39:05.313 "raid_level": "raid1", 00:39:05.313 "superblock": true, 00:39:05.313 "num_base_bdevs": 2, 00:39:05.313 "num_base_bdevs_discovered": 2, 00:39:05.313 "num_base_bdevs_operational": 2, 00:39:05.313 "base_bdevs_list": [ 00:39:05.313 { 00:39:05.313 "name": "pt1", 00:39:05.313 "uuid": "00000000-0000-0000-0000-000000000001", 00:39:05.313 "is_configured": true, 00:39:05.313 "data_offset": 256, 00:39:05.313 "data_size": 7936 00:39:05.313 }, 00:39:05.313 { 00:39:05.313 "name": "pt2", 00:39:05.313 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:05.313 "is_configured": true, 00:39:05.313 "data_offset": 256, 00:39:05.313 "data_size": 7936 00:39:05.313 } 00:39:05.313 ] 00:39:05.313 }' 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:05.313 12:41:28 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
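Illustrative sketch (editorial, not output from this run): the bare RPC sequence the raid_superblock_test_md_separate flow above exercises, assuming an SPDK target is already listening on /var/tmp/spdk-raid.sock; every command, name, and UUID below is taken verbatim from the log, only stripped of the test-harness xtrace wrapping.

RPC="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# two 32 MiB malloc bdevs, 4096-byte blocks, 32 bytes of metadata per block
$RPC bdev_malloc_create 32 4096 -m 32 -b malloc1
$RPC bdev_malloc_create 32 4096 -m 32 -b malloc2
# wrap each base in a passthru bdev with a fixed UUID
$RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
# assemble a two-way RAID1 carrying an on-disk superblock (-s)
$RPC bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
# confirm the array reached the online state
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'

As the JSON dumps above show, creating the malloc bases with -m 32 yields the separate-metadata layout the test is named for ("md_size": 32, "md_interleave": false), and the superblock reserves the first 256 blocks of each base bdev ("data_offset": 256, leaving "data_size": 7936 of the 8192 blocks).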
00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:06.246 12:41:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:39:06.504 [2024-06-07 12:41:29.984020] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:06.504 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:39:06.504 "name": "raid_bdev1", 00:39:06.504 "aliases": [ 00:39:06.504 "edfea29a-41e2-40d1-91d4-1b3d09a42b1e" 00:39:06.504 ], 00:39:06.504 "product_name": "Raid Volume", 00:39:06.504 "block_size": 4096, 00:39:06.504 "num_blocks": 7936, 00:39:06.504 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:06.504 "md_size": 32, 00:39:06.504 "md_interleave": false, 00:39:06.504 "dif_type": 0, 00:39:06.504 "assigned_rate_limits": { 00:39:06.504 "rw_ios_per_sec": 0, 00:39:06.504 "rw_mbytes_per_sec": 0, 00:39:06.504 "r_mbytes_per_sec": 0, 00:39:06.504 "w_mbytes_per_sec": 0 00:39:06.504 }, 00:39:06.504 "claimed": false, 00:39:06.504 "zoned": false, 00:39:06.504 "supported_io_types": { 00:39:06.504 "read": true, 00:39:06.504 "write": true, 00:39:06.504 "unmap": false, 00:39:06.504 "write_zeroes": true, 00:39:06.504 "flush": false, 00:39:06.504 "reset": true, 00:39:06.504 "compare": false, 00:39:06.504 "compare_and_write": false, 00:39:06.504 "abort": false, 00:39:06.504 "nvme_admin": false, 00:39:06.504 "nvme_io": false 00:39:06.504 }, 00:39:06.504 "memory_domains": [ 00:39:06.504 { 00:39:06.504 "dma_device_id": "system", 00:39:06.504 "dma_device_type": 1 00:39:06.504 }, 00:39:06.504 { 00:39:06.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:06.504 "dma_device_type": 2 00:39:06.504 }, 00:39:06.504 { 00:39:06.504 "dma_device_id": "system", 00:39:06.504 "dma_device_type": 1 00:39:06.504 }, 00:39:06.504 { 00:39:06.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:06.504 "dma_device_type": 2 00:39:06.505 } 00:39:06.505 ], 00:39:06.505 "driver_specific": { 00:39:06.505 "raid": { 00:39:06.505 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:06.505 "strip_size_kb": 0, 00:39:06.505 "state": "online", 00:39:06.505 "raid_level": "raid1", 00:39:06.505 "superblock": true, 00:39:06.505 "num_base_bdevs": 2, 00:39:06.505 "num_base_bdevs_discovered": 2, 00:39:06.505 "num_base_bdevs_operational": 2, 00:39:06.505 "base_bdevs_list": [ 00:39:06.505 { 00:39:06.505 "name": "pt1", 00:39:06.505 "uuid": "00000000-0000-0000-0000-000000000001", 00:39:06.505 "is_configured": true, 00:39:06.505 "data_offset": 256, 00:39:06.505 "data_size": 7936 00:39:06.505 }, 00:39:06.505 { 00:39:06.505 "name": "pt2", 00:39:06.505 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:06.505 "is_configured": true, 00:39:06.505 "data_offset": 256, 00:39:06.505 "data_size": 7936 00:39:06.505 } 00:39:06.505 ] 00:39:06.505 } 00:39:06.505 } 00:39:06.505 }' 00:39:06.505 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:39:06.505 12:41:30 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:39:06.505 pt2' 00:39:06.505 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:06.505 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:39:06.505 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:39:06.763 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:39:06.763 "name": "pt1", 00:39:06.763 "aliases": [ 00:39:06.763 "00000000-0000-0000-0000-000000000001" 00:39:06.763 ], 00:39:06.763 "product_name": "passthru", 00:39:06.763 "block_size": 4096, 00:39:06.763 "num_blocks": 8192, 00:39:06.763 "uuid": "00000000-0000-0000-0000-000000000001", 00:39:06.763 "md_size": 32, 00:39:06.763 "md_interleave": false, 00:39:06.763 "dif_type": 0, 00:39:06.763 "assigned_rate_limits": { 00:39:06.763 "rw_ios_per_sec": 0, 00:39:06.763 "rw_mbytes_per_sec": 0, 00:39:06.763 "r_mbytes_per_sec": 0, 00:39:06.763 "w_mbytes_per_sec": 0 00:39:06.763 }, 00:39:06.763 "claimed": true, 00:39:06.763 "claim_type": "exclusive_write", 00:39:06.763 "zoned": false, 00:39:06.763 "supported_io_types": { 00:39:06.763 "read": true, 00:39:06.763 "write": true, 00:39:06.763 "unmap": true, 00:39:06.763 "write_zeroes": true, 00:39:06.763 "flush": true, 00:39:06.763 "reset": true, 00:39:06.763 "compare": false, 00:39:06.763 "compare_and_write": false, 00:39:06.763 "abort": true, 00:39:06.763 "nvme_admin": false, 00:39:06.763 "nvme_io": false 00:39:06.763 }, 00:39:06.763 "memory_domains": [ 00:39:06.763 { 00:39:06.763 "dma_device_id": "system", 00:39:06.763 "dma_device_type": 1 00:39:06.763 }, 00:39:06.763 { 00:39:06.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:06.763 "dma_device_type": 2 00:39:06.763 } 00:39:06.763 ], 00:39:06.763 "driver_specific": { 00:39:06.763 "passthru": { 00:39:06.763 "name": "pt1", 00:39:06.763 "base_bdev_name": "malloc1" 00:39:06.763 } 00:39:06.763 } 00:39:06.763 }' 00:39:06.763 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:07.063 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:07.322 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:39:07.322 12:41:30 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:07.322 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:39:07.322 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:39:07.322 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:39:07.322 "name": "pt2", 00:39:07.322 "aliases": [ 00:39:07.322 "00000000-0000-0000-0000-000000000002" 00:39:07.322 ], 00:39:07.322 "product_name": "passthru", 00:39:07.322 "block_size": 4096, 00:39:07.322 "num_blocks": 8192, 00:39:07.322 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:07.322 "md_size": 32, 00:39:07.322 "md_interleave": false, 00:39:07.322 "dif_type": 0, 00:39:07.322 "assigned_rate_limits": { 00:39:07.322 "rw_ios_per_sec": 0, 00:39:07.322 "rw_mbytes_per_sec": 0, 00:39:07.322 "r_mbytes_per_sec": 0, 00:39:07.322 "w_mbytes_per_sec": 0 00:39:07.322 }, 00:39:07.322 "claimed": true, 00:39:07.322 "claim_type": "exclusive_write", 00:39:07.322 "zoned": false, 00:39:07.322 "supported_io_types": { 00:39:07.322 "read": true, 00:39:07.322 "write": true, 00:39:07.322 "unmap": true, 00:39:07.322 "write_zeroes": true, 00:39:07.322 "flush": true, 00:39:07.322 "reset": true, 00:39:07.322 "compare": false, 00:39:07.322 "compare_and_write": false, 00:39:07.322 "abort": true, 00:39:07.322 "nvme_admin": false, 00:39:07.322 "nvme_io": false 00:39:07.322 }, 00:39:07.322 "memory_domains": [ 00:39:07.322 { 00:39:07.322 "dma_device_id": "system", 00:39:07.322 "dma_device_type": 1 00:39:07.322 }, 00:39:07.322 { 00:39:07.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:07.322 "dma_device_type": 2 00:39:07.322 } 00:39:07.322 ], 00:39:07.322 "driver_specific": { 00:39:07.322 "passthru": { 00:39:07.322 "name": "pt2", 00:39:07.322 "base_bdev_name": "malloc2" 00:39:07.322 } 00:39:07.322 } 00:39:07.322 }' 00:39:07.322 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:07.322 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:07.579 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:39:07.579 12:41:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:07.579 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:07.579 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:39:07.579 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:07.579 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:07.579 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:39:07.579 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:07.845 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:07.845 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:39:07.845 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:07.845 12:41:31 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:39:08.103 [2024-06-07 12:41:31.580221] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:08.103 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=edfea29a-41e2-40d1-91d4-1b3d09a42b1e 00:39:08.103 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z edfea29a-41e2-40d1-91d4-1b3d09a42b1e ']' 00:39:08.103 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:39:08.361 [2024-06-07 12:41:31.880083] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:08.361 [2024-06-07 12:41:31.880383] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:39:08.361 [2024-06-07 12:41:31.880644] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:08.361 [2024-06-07 12:41:31.880806] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:08.361 [2024-06-07 12:41:31.880901] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006c80 name raid_bdev1, state offline 00:39:08.361 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:39:08.361 12:41:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:08.619 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:39:08.619 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:39:08.619 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:39:08.619 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:39:08.876 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:39:08.876 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:39:09.134 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:39:09.134 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n 
raid_bdev1 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:39:09.392 12:41:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:39:09.649 [2024-06-07 12:41:33.262885] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:39:09.649 [2024-06-07 12:41:33.265473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:39:09.649 [2024-06-07 12:41:33.265700] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:39:09.649 [2024-06-07 12:41:33.265949] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:39:09.649 [2024-06-07 12:41:33.266100] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:09.650 [2024-06-07 12:41:33.266149] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state configuring 00:39:09.650 request: 00:39:09.650 { 00:39:09.650 "name": "raid_bdev1", 00:39:09.650 "raid_level": "raid1", 00:39:09.650 "base_bdevs": [ 00:39:09.650 "malloc1", 00:39:09.650 "malloc2" 00:39:09.650 ], 00:39:09.650 "superblock": false, 00:39:09.650 "method": "bdev_raid_create", 00:39:09.650 "req_id": 1 00:39:09.650 } 00:39:09.650 Got JSON-RPC error response 00:39:09.650 response: 00:39:09.650 { 00:39:09.650 "code": -17, 00:39:09.650 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:39:09.650 } 00:39:09.650 12:41:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # es=1 00:39:09.650 12:41:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:39:09.650 12:41:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:39:09.650 12:41:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:39:09.650 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:09.650 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:39:09.907 
12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:39:09.907 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:39:09.907 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:39:10.164 [2024-06-07 12:41:33.738883] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:39:10.164 [2024-06-07 12:41:33.739279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:10.164 [2024-06-07 12:41:33.739445] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:39:10.164 [2024-06-07 12:41:33.739560] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:10.164 [2024-06-07 12:41:33.741710] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:10.164 [2024-06-07 12:41:33.741899] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:39:10.164 [2024-06-07 12:41:33.742078] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:39:10.164 [2024-06-07 12:41:33.742260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:39:10.164 pt1 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:10.164 12:41:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:10.728 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:10.728 "name": "raid_bdev1", 00:39:10.728 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:10.728 "strip_size_kb": 0, 00:39:10.728 "state": "configuring", 00:39:10.728 "raid_level": "raid1", 00:39:10.728 "superblock": true, 00:39:10.728 "num_base_bdevs": 2, 00:39:10.728 "num_base_bdevs_discovered": 1, 00:39:10.728 "num_base_bdevs_operational": 2, 00:39:10.728 "base_bdevs_list": [ 00:39:10.728 { 00:39:10.728 "name": "pt1", 
00:39:10.728 "uuid": "00000000-0000-0000-0000-000000000001", 00:39:10.728 "is_configured": true, 00:39:10.728 "data_offset": 256, 00:39:10.728 "data_size": 7936 00:39:10.728 }, 00:39:10.728 { 00:39:10.728 "name": null, 00:39:10.728 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:10.728 "is_configured": false, 00:39:10.728 "data_offset": 256, 00:39:10.728 "data_size": 7936 00:39:10.728 } 00:39:10.728 ] 00:39:10.728 }' 00:39:10.728 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:10.728 12:41:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:11.295 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:39:11.295 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:39:11.295 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:39:11.295 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:39:11.553 [2024-06-07 12:41:34.975059] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:39:11.553 [2024-06-07 12:41:34.975436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:11.553 [2024-06-07 12:41:34.975505] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008180 00:39:11.553 [2024-06-07 12:41:34.975611] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:11.553 [2024-06-07 12:41:34.975815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:11.553 [2024-06-07 12:41:34.975876] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:39:11.553 [2024-06-07 12:41:34.976034] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:39:11.553 [2024-06-07 12:41:34.976082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:39:11.553 [2024-06-07 12:41:34.976365] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:39:11.553 [2024-06-07 12:41:34.976468] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:39:11.553 [2024-06-07 12:41:34.976566] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002530 00:39:11.553 [2024-06-07 12:41:34.976716] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:39:11.553 [2024-06-07 12:41:34.976752] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:39:11.553 [2024-06-07 12:41:34.976873] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:11.553 pt2 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:11.553 12:41:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:11.553 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:11.553 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:11.812 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:11.812 "name": "raid_bdev1", 00:39:11.812 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:11.812 "strip_size_kb": 0, 00:39:11.812 "state": "online", 00:39:11.812 "raid_level": "raid1", 00:39:11.812 "superblock": true, 00:39:11.812 "num_base_bdevs": 2, 00:39:11.812 "num_base_bdevs_discovered": 2, 00:39:11.812 "num_base_bdevs_operational": 2, 00:39:11.812 "base_bdevs_list": [ 00:39:11.812 { 00:39:11.812 "name": "pt1", 00:39:11.812 "uuid": "00000000-0000-0000-0000-000000000001", 00:39:11.812 "is_configured": true, 00:39:11.812 "data_offset": 256, 00:39:11.812 "data_size": 7936 00:39:11.812 }, 00:39:11.812 { 00:39:11.812 "name": "pt2", 00:39:11.812 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:11.812 "is_configured": true, 00:39:11.812 "data_offset": 256, 00:39:11.812 "data_size": 7936 00:39:11.812 } 00:39:11.812 ] 00:39:11.812 }' 00:39:11.812 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:11.812 12:41:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:12.378 12:41:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:39:12.378 [2024-06-07 12:41:35.983295] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:12.378 
12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:39:12.378 "name": "raid_bdev1", 00:39:12.378 "aliases": [ 00:39:12.378 "edfea29a-41e2-40d1-91d4-1b3d09a42b1e" 00:39:12.378 ], 00:39:12.378 "product_name": "Raid Volume", 00:39:12.378 "block_size": 4096, 00:39:12.378 "num_blocks": 7936, 00:39:12.378 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:12.378 "md_size": 32, 00:39:12.378 "md_interleave": false, 00:39:12.378 "dif_type": 0, 00:39:12.378 "assigned_rate_limits": { 00:39:12.378 "rw_ios_per_sec": 0, 00:39:12.378 "rw_mbytes_per_sec": 0, 00:39:12.378 "r_mbytes_per_sec": 0, 00:39:12.378 "w_mbytes_per_sec": 0 00:39:12.378 }, 00:39:12.378 "claimed": false, 00:39:12.378 "zoned": false, 00:39:12.378 "supported_io_types": { 00:39:12.378 "read": true, 00:39:12.378 "write": true, 00:39:12.378 "unmap": false, 00:39:12.378 "write_zeroes": true, 00:39:12.378 "flush": false, 00:39:12.378 "reset": true, 00:39:12.378 "compare": false, 00:39:12.378 "compare_and_write": false, 00:39:12.378 "abort": false, 00:39:12.378 "nvme_admin": false, 00:39:12.378 "nvme_io": false 00:39:12.378 }, 00:39:12.378 "memory_domains": [ 00:39:12.378 { 00:39:12.378 "dma_device_id": "system", 00:39:12.378 "dma_device_type": 1 00:39:12.378 }, 00:39:12.378 { 00:39:12.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:12.378 "dma_device_type": 2 00:39:12.378 }, 00:39:12.378 { 00:39:12.378 "dma_device_id": "system", 00:39:12.378 "dma_device_type": 1 00:39:12.378 }, 00:39:12.378 { 00:39:12.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:12.378 "dma_device_type": 2 00:39:12.378 } 00:39:12.378 ], 00:39:12.378 "driver_specific": { 00:39:12.378 "raid": { 00:39:12.378 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:12.378 "strip_size_kb": 0, 00:39:12.378 "state": "online", 00:39:12.378 "raid_level": "raid1", 00:39:12.378 "superblock": true, 00:39:12.378 "num_base_bdevs": 2, 00:39:12.378 "num_base_bdevs_discovered": 2, 00:39:12.378 "num_base_bdevs_operational": 2, 00:39:12.378 "base_bdevs_list": [ 00:39:12.378 { 00:39:12.378 "name": "pt1", 00:39:12.378 "uuid": "00000000-0000-0000-0000-000000000001", 00:39:12.378 "is_configured": true, 00:39:12.378 "data_offset": 256, 00:39:12.378 "data_size": 7936 00:39:12.378 }, 00:39:12.379 { 00:39:12.379 "name": "pt2", 00:39:12.379 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:12.379 "is_configured": true, 00:39:12.379 "data_offset": 256, 00:39:12.379 "data_size": 7936 00:39:12.379 } 00:39:12.379 ] 00:39:12.379 } 00:39:12.379 } 00:39:12.379 }' 00:39:12.379 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:39:12.637 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:39:12.637 pt2' 00:39:12.637 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:12.637 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:39:12.637 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:39:12.895 "name": "pt1", 00:39:12.895 "aliases": [ 00:39:12.895 "00000000-0000-0000-0000-000000000001" 00:39:12.895 ], 00:39:12.895 "product_name": 
"passthru", 00:39:12.895 "block_size": 4096, 00:39:12.895 "num_blocks": 8192, 00:39:12.895 "uuid": "00000000-0000-0000-0000-000000000001", 00:39:12.895 "md_size": 32, 00:39:12.895 "md_interleave": false, 00:39:12.895 "dif_type": 0, 00:39:12.895 "assigned_rate_limits": { 00:39:12.895 "rw_ios_per_sec": 0, 00:39:12.895 "rw_mbytes_per_sec": 0, 00:39:12.895 "r_mbytes_per_sec": 0, 00:39:12.895 "w_mbytes_per_sec": 0 00:39:12.895 }, 00:39:12.895 "claimed": true, 00:39:12.895 "claim_type": "exclusive_write", 00:39:12.895 "zoned": false, 00:39:12.895 "supported_io_types": { 00:39:12.895 "read": true, 00:39:12.895 "write": true, 00:39:12.895 "unmap": true, 00:39:12.895 "write_zeroes": true, 00:39:12.895 "flush": true, 00:39:12.895 "reset": true, 00:39:12.895 "compare": false, 00:39:12.895 "compare_and_write": false, 00:39:12.895 "abort": true, 00:39:12.895 "nvme_admin": false, 00:39:12.895 "nvme_io": false 00:39:12.895 }, 00:39:12.895 "memory_domains": [ 00:39:12.895 { 00:39:12.895 "dma_device_id": "system", 00:39:12.895 "dma_device_type": 1 00:39:12.895 }, 00:39:12.895 { 00:39:12.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:12.895 "dma_device_type": 2 00:39:12.895 } 00:39:12.895 ], 00:39:12.895 "driver_specific": { 00:39:12.895 "passthru": { 00:39:12.895 "name": "pt1", 00:39:12.895 "base_bdev_name": "malloc1" 00:39:12.895 } 00:39:12.895 } 00:39:12.895 }' 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:39:12.895 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:39:13.153 12:41:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:39:13.411 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:39:13.411 "name": "pt2", 00:39:13.411 "aliases": [ 00:39:13.411 "00000000-0000-0000-0000-000000000002" 00:39:13.411 ], 00:39:13.411 "product_name": "passthru", 00:39:13.411 "block_size": 4096, 00:39:13.411 "num_blocks": 8192, 00:39:13.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:13.411 "md_size": 32, 
00:39:13.411 "md_interleave": false, 00:39:13.411 "dif_type": 0, 00:39:13.411 "assigned_rate_limits": { 00:39:13.411 "rw_ios_per_sec": 0, 00:39:13.411 "rw_mbytes_per_sec": 0, 00:39:13.411 "r_mbytes_per_sec": 0, 00:39:13.411 "w_mbytes_per_sec": 0 00:39:13.411 }, 00:39:13.411 "claimed": true, 00:39:13.411 "claim_type": "exclusive_write", 00:39:13.411 "zoned": false, 00:39:13.411 "supported_io_types": { 00:39:13.411 "read": true, 00:39:13.411 "write": true, 00:39:13.411 "unmap": true, 00:39:13.411 "write_zeroes": true, 00:39:13.411 "flush": true, 00:39:13.411 "reset": true, 00:39:13.411 "compare": false, 00:39:13.411 "compare_and_write": false, 00:39:13.411 "abort": true, 00:39:13.411 "nvme_admin": false, 00:39:13.411 "nvme_io": false 00:39:13.411 }, 00:39:13.411 "memory_domains": [ 00:39:13.411 { 00:39:13.411 "dma_device_id": "system", 00:39:13.411 "dma_device_type": 1 00:39:13.411 }, 00:39:13.411 { 00:39:13.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:13.411 "dma_device_type": 2 00:39:13.411 } 00:39:13.411 ], 00:39:13.411 "driver_specific": { 00:39:13.411 "passthru": { 00:39:13.411 "name": "pt2", 00:39:13.411 "base_bdev_name": "malloc2" 00:39:13.411 } 00:39:13.411 } 00:39:13.411 }' 00:39:13.411 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:13.411 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:13.670 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:39:13.960 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:39:13.960 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:13.960 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:39:14.225 [2024-06-07 12:41:37.619506] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:14.225 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' edfea29a-41e2-40d1-91d4-1b3d09a42b1e '!=' edfea29a-41e2-40d1-91d4-1b3d09a42b1e ']' 00:39:14.225 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:39:14.225 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:39:14.225 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:39:14.225 12:41:37 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@492 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:39:14.483 [2024-06-07 12:41:37.907438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:14.483 12:41:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:14.741 12:41:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:14.741 "name": "raid_bdev1", 00:39:14.741 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:14.741 "strip_size_kb": 0, 00:39:14.741 "state": "online", 00:39:14.741 "raid_level": "raid1", 00:39:14.741 "superblock": true, 00:39:14.741 "num_base_bdevs": 2, 00:39:14.741 "num_base_bdevs_discovered": 1, 00:39:14.741 "num_base_bdevs_operational": 1, 00:39:14.741 "base_bdevs_list": [ 00:39:14.741 { 00:39:14.741 "name": null, 00:39:14.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:14.741 "is_configured": false, 00:39:14.741 "data_offset": 256, 00:39:14.741 "data_size": 7936 00:39:14.741 }, 00:39:14.741 { 00:39:14.741 "name": "pt2", 00:39:14.741 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:14.741 "is_configured": true, 00:39:14.741 "data_offset": 256, 00:39:14.741 "data_size": 7936 00:39:14.741 } 00:39:14.741 ] 00:39:14.741 }' 00:39:14.741 12:41:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:14.741 12:41:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:15.308 12:41:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:39:15.308 [2024-06-07 12:41:38.855564] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:15.308 [2024-06-07 12:41:38.855903] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:39:15.308 [2024-06-07 12:41:38.856077] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:15.308 [2024-06-07 
12:41:38.856244] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:15.308 [2024-06-07 12:41:38.856349] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:39:15.308 12:41:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:15.308 12:41:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:39:15.567 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:39:15.567 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:39:15.567 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:39:15.567 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:39:15.567 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:39:15.831 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:39:15.831 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:39:15.831 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:39:15.831 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:39:15.831 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:39:15.831 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:39:16.090 [2024-06-07 12:41:39.711664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:39:16.090 [2024-06-07 12:41:39.712060] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:16.090 [2024-06-07 12:41:39.712254] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:39:16.090 [2024-06-07 12:41:39.712408] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:16.090 [2024-06-07 12:41:39.714790] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:16.090 [2024-06-07 12:41:39.714984] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:39:16.090 [2024-06-07 12:41:39.715181] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:39:16.090 [2024-06-07 12:41:39.715279] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:39:16.090 [2024-06-07 12:41:39.715417] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008a80 00:39:16.090 [2024-06-07 12:41:39.715726] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:39:16.090 [2024-06-07 12:41:39.715848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000026d0 00:39:16.090 [2024-06-07 12:41:39.716098] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008a80 00:39:16.090 [2024-06-07 12:41:39.716214] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008a80 00:39:16.090 [2024-06-07 12:41:39.716427] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:16.090 pt2 00:39:16.090 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:16.090 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:16.348 12:41:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:16.607 12:41:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:16.607 "name": "raid_bdev1", 00:39:16.607 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:16.607 "strip_size_kb": 0, 00:39:16.607 "state": "online", 00:39:16.607 "raid_level": "raid1", 00:39:16.607 "superblock": true, 00:39:16.607 "num_base_bdevs": 2, 00:39:16.607 "num_base_bdevs_discovered": 1, 00:39:16.607 "num_base_bdevs_operational": 1, 00:39:16.607 "base_bdevs_list": [ 00:39:16.607 { 00:39:16.607 "name": null, 00:39:16.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:16.607 "is_configured": false, 00:39:16.607 "data_offset": 256, 00:39:16.607 "data_size": 7936 00:39:16.607 }, 00:39:16.607 { 00:39:16.607 "name": "pt2", 00:39:16.607 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:16.607 "is_configured": true, 00:39:16.607 "data_offset": 256, 00:39:16.607 "data_size": 7936 00:39:16.607 } 00:39:16.607 ] 00:39:16.607 }' 00:39:16.607 12:41:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:16.607 12:41:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:17.175 12:41:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:39:17.464 [2024-06-07 12:41:40.868621] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:17.464 [2024-06-07 12:41:40.870176] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:39:17.464 [2024-06-07 12:41:40.870363] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
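The teardown just traced is the heart of the superblock test: after the array and its passthru bdevs are torn down, pt2 alone is recreated and raid_bdev1 reassembles from the superblock it carries, which verify_raid_bdev_state confirms as online with 1 of 2 base bdevs discovered; the bdev_raid_delete traced above then clears that degraded array for the next step. Each verify_raid_bdev_state call reduces to one RPC plus a jq filter; run by hand against the same socket it would look like the sketch below (socket path, RPC, and jq select verbatim from the trace; the .state projection is our summary of what the helper asserts):

  # dump all raid bdevs and extract raid_bdev1's state
  /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "raid_bdev1") | .state'
  # prints "online" while the degraded array exists; nothing once it has been deleted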
00:39:17.464 [2024-06-07 12:41:40.870503] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:17.464 [2024-06-07 12:41:40.870594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state offline 00:39:17.464 12:41:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:17.464 12:41:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:39:17.464 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:39:17.464 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:39:17.464 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:39:17.723 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:39:17.982 [2024-06-07 12:41:41.404689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:39:17.982 [2024-06-07 12:41:41.405087] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:17.982 [2024-06-07 12:41:41.405278] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:39:17.982 [2024-06-07 12:41:41.405405] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:17.982 [2024-06-07 12:41:41.410912] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:17.982 [2024-06-07 12:41:41.411481] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:39:17.982 [2024-06-07 12:41:41.412008] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:39:17.982 [2024-06-07 12:41:41.412344] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:39:17.982 [2024-06-07 12:41:41.413118] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:39:17.982 [2024-06-07 12:41:41.413407] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:17.982 [2024-06-07 12:41:41.413728] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009380 name raid_bdev1, state configuring 00:39:17.982 [2024-06-07 12:41:41.414109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:39:17.982 [2024-06-07 12:41:41.414506] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:39:17.982 pt1 00:39:17.982 [2024-06-07 12:41:41.414802] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:39:17.982 [2024-06-07 12:41:41.415172] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002870 00:39:17.983 [2024-06-07 12:41:41.415568] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:39:17.983 [2024-06-07 12:41:41.415848] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:39:17.983 [2024-06-07 12:41:41.416181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:17.983 12:41:41 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:17.983 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:18.241 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:18.241 "name": "raid_bdev1", 00:39:18.241 "uuid": "edfea29a-41e2-40d1-91d4-1b3d09a42b1e", 00:39:18.241 "strip_size_kb": 0, 00:39:18.241 "state": "online", 00:39:18.241 "raid_level": "raid1", 00:39:18.241 "superblock": true, 00:39:18.241 "num_base_bdevs": 2, 00:39:18.241 "num_base_bdevs_discovered": 1, 00:39:18.241 "num_base_bdevs_operational": 1, 00:39:18.241 "base_bdevs_list": [ 00:39:18.241 { 00:39:18.241 "name": null, 00:39:18.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:18.241 "is_configured": false, 00:39:18.241 "data_offset": 256, 00:39:18.241 "data_size": 7936 00:39:18.241 }, 00:39:18.241 { 00:39:18.241 "name": "pt2", 00:39:18.241 "uuid": "00000000-0000-0000-0000-000000000002", 00:39:18.241 "is_configured": true, 00:39:18.241 "data_offset": 256, 00:39:18.241 "data_size": 7936 00:39:18.241 } 00:39:18.241 ] 00:39:18.241 }' 00:39:18.241 12:41:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:18.241 12:41:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:18.808 12:41:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:39:18.808 12:41:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:39:19.066 12:41:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:39:19.066 12:41:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:39:19.066 12:41:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:19.324 [2024-06-07 12:41:42.832587] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:19.324 12:41:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' edfea29a-41e2-40d1-91d4-1b3d09a42b1e '!=' edfea29a-41e2-40d1-91d4-1b3d09a42b1e ']' 00:39:19.324 12:41:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 228934 00:39:19.324 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@949 -- # '[' -z 228934 ']' 00:39:19.324 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # kill -0 228934 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # uname 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 228934 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 228934' 00:39:19.325 killing process with pid 228934 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # kill 228934 00:39:19.325 [2024-06-07 12:41:42.894959] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:39:19.325 12:41:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@973 -- # wait 228934 00:39:19.325 [2024-06-07 12:41:42.895193] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:19.325 [2024-06-07 12:41:42.895253] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:19.325 [2024-06-07 12:41:42.895265] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:39:19.325 [2024-06-07 12:41:42.947410] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:39:19.970 12:41:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:39:19.970 00:39:19.970 real 0m16.477s 00:39:19.971 user 0m30.092s 00:39:19.971 sys 0m2.861s 00:39:19.971 12:41:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:39:19.971 12:41:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:19.971 ************************************ 00:39:19.971 END TEST raid_superblock_test_md_separate 00:39:19.971 ************************************ 00:39:19.971 12:41:43 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:39:19.971 12:41:43 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:39:19.971 12:41:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:39:19.971 12:41:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:39:19.971 12:41:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:39:19.971 ************************************ 00:39:19.971 START TEST raid_rebuild_test_sb_md_separate 
00:39:19.971 ************************************ 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=229460 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 229460 /var/tmp/spdk-raid.sock 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@830 
-- # '[' -z 229460 ']' 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:39:19.971 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:39:19.971 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:19.971 [2024-06-07 12:41:43.440613] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:39:19.971 [2024-06-07 12:41:43.441018] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid229460 ] 00:39:19.971 I/O size of 3145728 is greater than zero copy threshold (65536). 00:39:19.971 Zero copy mechanism will not be used. 00:39:19.971 [2024-06-07 12:41:43.578440] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:20.229 [2024-06-07 12:41:43.671917] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:39:20.229 [2024-06-07 12:41:43.752809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:20.229 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:39:20.229 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:39:20.229 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:20.229 12:41:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:39:20.796 BaseBdev1_malloc 00:39:20.796 12:41:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:39:20.796 [2024-06-07 12:41:44.432740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:39:20.796 [2024-06-07 12:41:44.433071] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:20.796 [2024-06-07 12:41:44.433294] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:39:20.796 [2024-06-07 12:41:44.433463] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:20.796 [2024-06-07 12:41:44.435853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:20.796 [2024-06-07 12:41:44.436041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:39:20.796 BaseBdev1 00:39:21.054 12:41:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:39:21.054 12:41:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:39:21.312 BaseBdev2_malloc 00:39:21.312 12:41:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:39:21.312 [2024-06-07 12:41:44.948858] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:39:21.312 [2024-06-07 12:41:44.949133] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:21.312 [2024-06-07 12:41:44.949213] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:39:21.312 [2024-06-07 12:41:44.949384] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:21.312 [2024-06-07 12:41:44.951403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:21.312 [2024-06-07 12:41:44.951553] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:39:21.312 BaseBdev2 00:39:21.569 12:41:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:39:21.569 spare_malloc 00:39:21.569 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:39:21.828 spare_delay 00:39:21.828 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:22.086 [2024-06-07 12:41:45.634157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:22.086 [2024-06-07 12:41:45.634481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:22.086 [2024-06-07 12:41:45.634618] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:39:22.086 [2024-06-07 12:41:45.634786] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:22.086 [2024-06-07 12:41:45.637148] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:22.086 [2024-06-07 12:41:45.637338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:22.086 spare 00:39:22.086 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:39:22.344 [2024-06-07 12:41:45.854288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:22.344 [2024-06-07 12:41:45.856559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:39:22.344 [2024-06-07 12:41:45.856889] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:39:22.344 [2024-06-07 12:41:45.857064] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:39:22.344 [2024-06-07 12:41:45.857300] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:39:22.345 [2024-06-07 12:41:45.857588] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 
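BaseBdev1 and BaseBdev2 above are the fixtures that give this suite its name: each malloc bdev carries 4096-byte data blocks plus 32 bytes of per-block metadata held in a separate buffer rather than interleaved, which is what later shows up in the bdev dumps as "md_size": 32 with "md_interleave": false. The spare is built the same way but sits behind a delay bdev, presumably so rebuild I/O has a window in which it can be observed. Replayed by hand, the three spare RPCs are the following (commands verbatim from the trace; the flag reading is ours: the delay arguments set average and p99 read/write latencies in microseconds):

  # 32 MB malloc bdev, 4 KiB blocks, 32 B of separate (non-interleaved) metadata per block
  /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc
  # delay bdev in front of it: reads pass straight through, writes take ~100 ms
  /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  # passthru on top gives the rebuild target its stable name "spare"
  /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare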
00:39:22.345 [2024-06-07 12:41:45.857696] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:39:22.345 [2024-06-07 12:41:45.857885] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:22.345 12:41:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:22.603 12:41:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:22.603 "name": "raid_bdev1", 00:39:22.603 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:22.603 "strip_size_kb": 0, 00:39:22.603 "state": "online", 00:39:22.603 "raid_level": "raid1", 00:39:22.603 "superblock": true, 00:39:22.603 "num_base_bdevs": 2, 00:39:22.603 "num_base_bdevs_discovered": 2, 00:39:22.603 "num_base_bdevs_operational": 2, 00:39:22.603 "base_bdevs_list": [ 00:39:22.603 { 00:39:22.603 "name": "BaseBdev1", 00:39:22.603 "uuid": "1a5eeb3a-0711-50d3-ad80-7ed5ffe2ac1e", 00:39:22.603 "is_configured": true, 00:39:22.603 "data_offset": 256, 00:39:22.603 "data_size": 7936 00:39:22.603 }, 00:39:22.603 { 00:39:22.603 "name": "BaseBdev2", 00:39:22.603 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:22.604 "is_configured": true, 00:39:22.604 "data_offset": 256, 00:39:22.604 "data_size": 7936 00:39:22.604 } 00:39:22.604 ] 00:39:22.604 }' 00:39:22.604 12:41:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:22.604 12:41:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:23.170 12:41:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:39:23.170 12:41:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:39:23.428 [2024-06-07 12:41:47.010525] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:23.428 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- 
# raid_bdev_size=7936 00:39:23.428 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:23.428 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:39:23.993 [2024-06-07 12:41:47.570795] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002600 00:39:23.993 /dev/nbd0 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 
of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:23.993 1+0 records in 00:39:23.993 1+0 records out 00:39:23.993 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000428397 s, 9.6 MB/s 00:39:23.993 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:39:24.251 12:41:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:39:24.817 7936+0 records in 00:39:24.817 7936+0 records out 00:39:24.817 32505856 bytes (33 MB, 31 MiB) copied, 0.554374 s, 58.6 MB/s 00:39:24.817 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:39:24.817 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:24.817 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:39:24.817 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:24.817 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:39:24.817 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:24.817 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:39:25.076 [2024-06-07 12:41:48.499396] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:39:25.076 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:39:25.076 
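[editor's annotation — not part of the captured trace] At this point the run has filled the raid1 bdev with 7936 x 4 KiB blocks of random data through /dev/nbd0 (32505856 bytes copied) and detached the nbd device; the next traced step degrades the array by removing BaseBdev1 over the RPC socket, then re-checks that raid_bdev1 stays online with a single discovered/operational base bdev. A minimal consolidated sketch of that remove-and-verify pattern, assuming only the rpc.py and jq invocations visible in the trace (the socket path and script paths are the test target's):

    #!/usr/bin/env bash
    set -euo pipefail
    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Degrade the mirror: drop one leg, as the trace does next.
    "$rpc" -s "$sock" bdev_raid_remove_base_bdev BaseBdev1

    # Fetch the raid bdev's JSON and assert the degraded-but-online state
    # the test expects (state online, 1 of 2 base bdevs remaining).
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r '.state' <<<"$info") == online ]]
    [[ $(jq -r '.num_base_bdevs_discovered' <<<"$info") -eq 1 ]]

One genuine defect is also recorded a few entries further on: "/home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected", produced by the traced test '[' = false ']'. The left-hand operand has evidently expanded to an empty string, which single-bracket test cannot parse; quoting the variable (or using the [[ ... ]] builtin) is the usual fix. The run continues because the failed test is simply treated as false.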
12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:39:25.334 [2024-06-07 12:41:48.799166] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:25.334 12:41:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:25.592 12:41:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:25.592 "name": "raid_bdev1", 00:39:25.592 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:25.592 "strip_size_kb": 0, 00:39:25.592 "state": "online", 00:39:25.592 "raid_level": "raid1", 00:39:25.592 "superblock": true, 00:39:25.592 "num_base_bdevs": 2, 00:39:25.592 "num_base_bdevs_discovered": 1, 00:39:25.592 "num_base_bdevs_operational": 1, 00:39:25.592 "base_bdevs_list": [ 00:39:25.592 { 00:39:25.592 "name": null, 00:39:25.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:25.592 "is_configured": false, 00:39:25.592 "data_offset": 256, 00:39:25.592 "data_size": 7936 00:39:25.592 }, 00:39:25.592 { 00:39:25.592 "name": "BaseBdev2", 00:39:25.592 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:25.592 "is_configured": true, 00:39:25.592 "data_offset": 256, 00:39:25.592 "data_size": 7936 00:39:25.592 } 00:39:25.592 ] 00:39:25.592 }' 00:39:25.592 12:41:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:25.592 12:41:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:26.157 12:41:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:26.414 [2024-06-07 12:41:50.007382] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:26.414 [2024-06-07 12:41:50.010773] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00019c960 00:39:26.414 [2024-06-07 12:41:50.013501] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:26.414 12:41:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:39:27.412 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:27.412 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:27.412 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:27.412 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:27.412 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:27.412 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:27.412 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:27.979 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:27.979 "name": "raid_bdev1", 00:39:27.979 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:27.979 "strip_size_kb": 0, 00:39:27.979 "state": "online", 00:39:27.979 "raid_level": "raid1", 00:39:27.979 "superblock": true, 00:39:27.979 "num_base_bdevs": 2, 00:39:27.979 "num_base_bdevs_discovered": 2, 00:39:27.979 "num_base_bdevs_operational": 2, 00:39:27.979 "process": { 00:39:27.979 "type": "rebuild", 00:39:27.979 "target": "spare", 00:39:27.979 "progress": { 00:39:27.979 "blocks": 3072, 00:39:27.979 "percent": 38 00:39:27.979 } 00:39:27.979 }, 00:39:27.979 "base_bdevs_list": [ 00:39:27.979 { 00:39:27.979 "name": "spare", 00:39:27.979 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:27.979 "is_configured": true, 00:39:27.979 "data_offset": 256, 00:39:27.979 "data_size": 7936 00:39:27.979 }, 00:39:27.979 { 00:39:27.979 "name": "BaseBdev2", 00:39:27.979 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:27.979 "is_configured": true, 00:39:27.979 "data_offset": 256, 00:39:27.979 "data_size": 7936 00:39:27.979 } 00:39:27.979 ] 00:39:27.979 }' 00:39:27.979 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:27.979 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:27.979 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:27.979 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:27.979 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:39:28.237 [2024-06-07 12:41:51.667508] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:28.237 [2024-06-07 12:41:51.727267] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:28.237 [2024-06-07 12:41:51.727662] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:28.237 [2024-06-07 12:41:51.727722] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:28.237 [2024-06-07 
12:41:51.727802] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:28.237 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:28.495 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:28.495 "name": "raid_bdev1", 00:39:28.495 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:28.495 "strip_size_kb": 0, 00:39:28.495 "state": "online", 00:39:28.495 "raid_level": "raid1", 00:39:28.495 "superblock": true, 00:39:28.495 "num_base_bdevs": 2, 00:39:28.495 "num_base_bdevs_discovered": 1, 00:39:28.495 "num_base_bdevs_operational": 1, 00:39:28.495 "base_bdevs_list": [ 00:39:28.495 { 00:39:28.495 "name": null, 00:39:28.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:28.495 "is_configured": false, 00:39:28.495 "data_offset": 256, 00:39:28.495 "data_size": 7936 00:39:28.495 }, 00:39:28.495 { 00:39:28.495 "name": "BaseBdev2", 00:39:28.495 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:28.495 "is_configured": true, 00:39:28.495 "data_offset": 256, 00:39:28.495 "data_size": 7936 00:39:28.495 } 00:39:28.495 ] 00:39:28.495 }' 00:39:28.495 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:28.495 12:41:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:29.060 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:29.060 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:29.060 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:29.060 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:29.060 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:29.060 12:41:52 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:29.060 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:29.318 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:29.318 "name": "raid_bdev1", 00:39:29.318 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:29.318 "strip_size_kb": 0, 00:39:29.318 "state": "online", 00:39:29.318 "raid_level": "raid1", 00:39:29.318 "superblock": true, 00:39:29.318 "num_base_bdevs": 2, 00:39:29.318 "num_base_bdevs_discovered": 1, 00:39:29.318 "num_base_bdevs_operational": 1, 00:39:29.318 "base_bdevs_list": [ 00:39:29.318 { 00:39:29.318 "name": null, 00:39:29.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:29.318 "is_configured": false, 00:39:29.318 "data_offset": 256, 00:39:29.318 "data_size": 7936 00:39:29.318 }, 00:39:29.318 { 00:39:29.318 "name": "BaseBdev2", 00:39:29.318 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:29.318 "is_configured": true, 00:39:29.318 "data_offset": 256, 00:39:29.318 "data_size": 7936 00:39:29.318 } 00:39:29.318 ] 00:39:29.318 }' 00:39:29.318 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:29.318 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:29.318 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:29.577 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:29.577 12:41:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:29.577 [2024-06-07 12:41:53.193890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:29.577 [2024-06-07 12:41:53.196996] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d00019cb00 00:39:29.577 [2024-06-07 12:41:53.199094] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:29.577 12:41:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:31.010 
"name": "raid_bdev1", 00:39:31.010 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:31.010 "strip_size_kb": 0, 00:39:31.010 "state": "online", 00:39:31.010 "raid_level": "raid1", 00:39:31.010 "superblock": true, 00:39:31.010 "num_base_bdevs": 2, 00:39:31.010 "num_base_bdevs_discovered": 2, 00:39:31.010 "num_base_bdevs_operational": 2, 00:39:31.010 "process": { 00:39:31.010 "type": "rebuild", 00:39:31.010 "target": "spare", 00:39:31.010 "progress": { 00:39:31.010 "blocks": 3072, 00:39:31.010 "percent": 38 00:39:31.010 } 00:39:31.010 }, 00:39:31.010 "base_bdevs_list": [ 00:39:31.010 { 00:39:31.010 "name": "spare", 00:39:31.010 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:31.010 "is_configured": true, 00:39:31.010 "data_offset": 256, 00:39:31.010 "data_size": 7936 00:39:31.010 }, 00:39:31.010 { 00:39:31.010 "name": "BaseBdev2", 00:39:31.010 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:31.010 "is_configured": true, 00:39:31.010 "data_offset": 256, 00:39:31.010 "data_size": 7936 00:39:31.010 } 00:39:31.010 ] 00:39:31.010 }' 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:39:31.010 /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1121 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:31.010 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:31.011 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:31.576 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:39:31.576 "name": "raid_bdev1", 00:39:31.576 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:31.576 "strip_size_kb": 0, 00:39:31.576 "state": "online", 00:39:31.576 "raid_level": "raid1", 00:39:31.576 "superblock": true, 00:39:31.576 "num_base_bdevs": 2, 00:39:31.576 "num_base_bdevs_discovered": 2, 00:39:31.576 "num_base_bdevs_operational": 2, 00:39:31.576 "process": { 00:39:31.576 "type": "rebuild", 00:39:31.576 "target": "spare", 00:39:31.576 "progress": { 00:39:31.576 "blocks": 4352, 00:39:31.576 "percent": 54 00:39:31.576 } 00:39:31.576 }, 00:39:31.576 "base_bdevs_list": [ 00:39:31.576 { 00:39:31.576 "name": "spare", 00:39:31.576 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:31.576 "is_configured": true, 00:39:31.576 "data_offset": 256, 00:39:31.576 "data_size": 7936 00:39:31.576 }, 00:39:31.576 { 00:39:31.576 "name": "BaseBdev2", 00:39:31.576 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:31.576 "is_configured": true, 00:39:31.576 "data_offset": 256, 00:39:31.576 "data_size": 7936 00:39:31.576 } 00:39:31.576 ] 00:39:31.576 }' 00:39:31.576 12:41:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:31.576 12:41:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:31.576 12:41:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:31.576 12:41:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:31.576 12:41:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:32.511 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:32.770 [2024-06-07 12:41:56.321925] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:39:32.770 [2024-06-07 12:41:56.322217] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:39:32.770 [2024-06-07 12:41:56.322496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:33.028 "name": "raid_bdev1", 00:39:33.028 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:33.028 "strip_size_kb": 0, 00:39:33.028 "state": "online", 00:39:33.028 "raid_level": "raid1", 00:39:33.028 "superblock": true, 00:39:33.028 "num_base_bdevs": 2, 00:39:33.028 "num_base_bdevs_discovered": 2, 
00:39:33.028 "num_base_bdevs_operational": 2, 00:39:33.028 "base_bdevs_list": [ 00:39:33.028 { 00:39:33.028 "name": "spare", 00:39:33.028 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:33.028 "is_configured": true, 00:39:33.028 "data_offset": 256, 00:39:33.028 "data_size": 7936 00:39:33.028 }, 00:39:33.028 { 00:39:33.028 "name": "BaseBdev2", 00:39:33.028 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:33.028 "is_configured": true, 00:39:33.028 "data_offset": 256, 00:39:33.028 "data_size": 7936 00:39:33.028 } 00:39:33.028 ] 00:39:33.028 }' 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:33.028 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:33.343 "name": "raid_bdev1", 00:39:33.343 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:33.343 "strip_size_kb": 0, 00:39:33.343 "state": "online", 00:39:33.343 "raid_level": "raid1", 00:39:33.343 "superblock": true, 00:39:33.343 "num_base_bdevs": 2, 00:39:33.343 "num_base_bdevs_discovered": 2, 00:39:33.343 "num_base_bdevs_operational": 2, 00:39:33.343 "base_bdevs_list": [ 00:39:33.343 { 00:39:33.343 "name": "spare", 00:39:33.343 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:33.343 "is_configured": true, 00:39:33.343 "data_offset": 256, 00:39:33.343 "data_size": 7936 00:39:33.343 }, 00:39:33.343 { 00:39:33.343 "name": "BaseBdev2", 00:39:33.343 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:33.343 "is_configured": true, 00:39:33.343 "data_offset": 256, 00:39:33.343 "data_size": 7936 00:39:33.343 } 00:39:33.343 ] 00:39:33.343 }' 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # 
[[ none == \n\o\n\e ]] 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:33.343 12:41:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:33.600 12:41:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:33.600 "name": "raid_bdev1", 00:39:33.601 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:33.601 "strip_size_kb": 0, 00:39:33.601 "state": "online", 00:39:33.601 "raid_level": "raid1", 00:39:33.601 "superblock": true, 00:39:33.601 "num_base_bdevs": 2, 00:39:33.601 "num_base_bdevs_discovered": 2, 00:39:33.601 "num_base_bdevs_operational": 2, 00:39:33.601 "base_bdevs_list": [ 00:39:33.601 { 00:39:33.601 "name": "spare", 00:39:33.601 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:33.601 "is_configured": true, 00:39:33.601 "data_offset": 256, 00:39:33.601 "data_size": 7936 00:39:33.601 }, 00:39:33.601 { 00:39:33.601 "name": "BaseBdev2", 00:39:33.601 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:33.601 "is_configured": true, 00:39:33.601 "data_offset": 256, 00:39:33.601 "data_size": 7936 00:39:33.601 } 00:39:33.601 ] 00:39:33.601 }' 00:39:33.601 12:41:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:33.601 12:41:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:34.534 12:41:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:39:34.534 [2024-06-07 12:41:58.081683] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:39:34.534 [2024-06-07 12:41:58.081940] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:39:34.534 [2024-06-07 12:41:58.082152] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:34.534 [2024-06-07 12:41:58.082385] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:34.534 [2024-06-07 12:41:58.082505] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:39:34.534 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:34.534 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:39:34.792 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:39:35.050 /dev/nbd0 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:35.050 1+0 records in 00:39:35.050 1+0 records out 00:39:35.050 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312932 s, 13.1 
MB/s 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:39:35.050 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:39:35.308 /dev/nbd1 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:35.566 1+0 records in 00:39:35.566 1+0 records out 00:39:35.566 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000696947 s, 5.9 MB/s 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:39:35.566 12:41:58 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:39:35.566 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:39:35.566 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:39:35.566 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:39:35.566 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:35.566 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:39:35.566 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:35.566 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:35.824 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:39:36.086 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:36.345 12:41:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:36.603 [2024-06-07 12:42:00.196478] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:36.603 [2024-06-07 12:42:00.196886] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:36.603 [2024-06-07 12:42:00.196995] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:39:36.603 [2024-06-07 12:42:00.197240] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:36.603 [2024-06-07 12:42:00.199769] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:36.604 [2024-06-07 12:42:00.199897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:36.604 [2024-06-07 12:42:00.200183] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:39:36.604 [2024-06-07 12:42:00.200406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:36.604 [2024-06-07 12:42:00.200706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:39:36.604 spare 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:36.604 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:36.862 [2024-06-07 12:42:00.301008] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009380 00:39:36.862 [2024-06-07 12:42:00.301308] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:39:36.862 [2024-06-07 12:42:00.301613] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001bb4f0 00:39:36.862 [2024-06-07 12:42:00.301868] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009380 00:39:36.862 [2024-06-07 12:42:00.301985] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009380 00:39:36.862 [2024-06-07 12:42:00.302203] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:36.862 12:42:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:36.862 "name": "raid_bdev1", 00:39:36.862 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:36.862 "strip_size_kb": 0, 00:39:36.862 "state": "online", 00:39:36.862 "raid_level": "raid1", 00:39:36.862 "superblock": true, 00:39:36.862 "num_base_bdevs": 2, 00:39:36.862 "num_base_bdevs_discovered": 2, 00:39:36.862 "num_base_bdevs_operational": 2, 00:39:36.862 "base_bdevs_list": [ 00:39:36.862 { 00:39:36.862 "name": "spare", 00:39:36.862 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:36.862 "is_configured": true, 00:39:36.862 "data_offset": 256, 00:39:36.862 "data_size": 7936 00:39:36.862 }, 00:39:36.862 { 00:39:36.862 "name": "BaseBdev2", 00:39:36.862 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:36.862 "is_configured": true, 00:39:36.862 "data_offset": 256, 00:39:36.862 "data_size": 7936 00:39:36.862 } 00:39:36.862 ] 00:39:36.862 }' 00:39:36.863 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:36.863 12:42:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:37.808 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:37.808 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:37.808 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:37.808 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:37.808 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:37.808 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:37.808 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:38.075 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:38.075 "name": "raid_bdev1", 00:39:38.075 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:38.075 "strip_size_kb": 0, 00:39:38.075 "state": "online", 00:39:38.075 "raid_level": "raid1", 00:39:38.075 "superblock": true, 00:39:38.075 "num_base_bdevs": 2, 00:39:38.075 "num_base_bdevs_discovered": 2, 00:39:38.075 "num_base_bdevs_operational": 2, 00:39:38.075 "base_bdevs_list": [ 00:39:38.075 { 00:39:38.075 "name": "spare", 00:39:38.075 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:38.075 "is_configured": true, 00:39:38.075 "data_offset": 256, 00:39:38.075 "data_size": 7936 00:39:38.075 }, 00:39:38.075 { 00:39:38.075 "name": "BaseBdev2", 00:39:38.075 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:38.075 "is_configured": true, 00:39:38.075 "data_offset": 256, 00:39:38.075 "data_size": 7936 00:39:38.075 } 00:39:38.075 ] 00:39:38.075 }' 00:39:38.075 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:38.075 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:38.075 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:38.075 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:38.075 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:38.075 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:39:38.346 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:39:38.346 12:42:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:39:38.618 [2024-06-07 12:42:02.209168] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:38.618 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:38.892 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:38.892 "name": "raid_bdev1", 00:39:38.892 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:38.892 "strip_size_kb": 0, 00:39:38.892 "state": "online", 00:39:38.892 "raid_level": "raid1", 00:39:38.892 "superblock": true, 00:39:38.892 "num_base_bdevs": 2, 00:39:38.892 "num_base_bdevs_discovered": 1, 00:39:38.892 "num_base_bdevs_operational": 1, 00:39:38.892 "base_bdevs_list": [ 00:39:38.892 { 00:39:38.892 "name": null, 00:39:38.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:38.892 "is_configured": false, 00:39:38.892 "data_offset": 256, 00:39:38.892 "data_size": 7936 00:39:38.892 }, 00:39:38.892 { 00:39:38.892 "name": "BaseBdev2", 00:39:38.892 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:38.892 "is_configured": true, 00:39:38.892 "data_offset": 256, 00:39:38.892 "data_size": 7936 00:39:38.892 } 00:39:38.892 ] 00:39:38.892 }' 00:39:38.892 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:38.892 12:42:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:39.829 
12:42:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:39:39.829 [2024-06-07 12:42:03.465352] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:39.829 [2024-06-07 12:42:03.465768] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:39:39.829 [2024-06-07 12:42:03.465907] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:39:39.829 [2024-06-07 12:42:03.466037] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:39.829 [2024-06-07 12:42:03.468847] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001bb690 00:39:39.829 [2024-06-07 12:42:03.470905] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:40.086 12:42:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:39:41.018 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:41.018 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:41.018 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:41.018 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:41.018 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:41.018 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:41.018 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:41.277 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:41.277 "name": "raid_bdev1", 00:39:41.277 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:41.277 "strip_size_kb": 0, 00:39:41.277 "state": "online", 00:39:41.277 "raid_level": "raid1", 00:39:41.277 "superblock": true, 00:39:41.277 "num_base_bdevs": 2, 00:39:41.277 "num_base_bdevs_discovered": 2, 00:39:41.277 "num_base_bdevs_operational": 2, 00:39:41.277 "process": { 00:39:41.277 "type": "rebuild", 00:39:41.277 "target": "spare", 00:39:41.277 "progress": { 00:39:41.277 "blocks": 3072, 00:39:41.277 "percent": 38 00:39:41.277 } 00:39:41.277 }, 00:39:41.277 "base_bdevs_list": [ 00:39:41.277 { 00:39:41.277 "name": "spare", 00:39:41.277 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:41.277 "is_configured": true, 00:39:41.277 "data_offset": 256, 00:39:41.277 "data_size": 7936 00:39:41.277 }, 00:39:41.277 { 00:39:41.277 "name": "BaseBdev2", 00:39:41.277 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:41.277 "is_configured": true, 00:39:41.277 "data_offset": 256, 00:39:41.277 "data_size": 7936 00:39:41.277 } 00:39:41.277 ] 00:39:41.277 }' 00:39:41.277 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:41.277 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:41.277 12:42:04 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:41.277 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:41.277 12:42:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:41.843 [2024-06-07 12:42:05.184312] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:41.843 [2024-06-07 12:42:05.284390] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:41.843 [2024-06-07 12:42:05.284772] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:41.843 [2024-06-07 12:42:05.284829] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:41.843 [2024-06-07 12:42:05.284933] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:41.843 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:42.100 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:42.100 "name": "raid_bdev1", 00:39:42.100 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:42.100 "strip_size_kb": 0, 00:39:42.100 "state": "online", 00:39:42.100 "raid_level": "raid1", 00:39:42.100 "superblock": true, 00:39:42.100 "num_base_bdevs": 2, 00:39:42.100 "num_base_bdevs_discovered": 1, 00:39:42.100 "num_base_bdevs_operational": 1, 00:39:42.100 "base_bdevs_list": [ 00:39:42.100 { 00:39:42.100 "name": null, 00:39:42.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:42.100 "is_configured": false, 00:39:42.100 "data_offset": 256, 00:39:42.100 "data_size": 7936 00:39:42.100 }, 00:39:42.100 { 00:39:42.100 "name": "BaseBdev2", 00:39:42.100 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:42.100 "is_configured": true, 00:39:42.100 "data_offset": 256, 00:39:42.100 "data_size": 7936 00:39:42.100 } 
00:39:42.100 ] 00:39:42.100 }' 00:39:42.100 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:42.100 12:42:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:42.666 12:42:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:39:42.924 [2024-06-07 12:42:06.498894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:39:42.924 [2024-06-07 12:42:06.499294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:42.924 [2024-06-07 12:42:06.499468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009980 00:39:42.924 [2024-06-07 12:42:06.499632] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:42.924 [2024-06-07 12:42:06.499965] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:42.924 [2024-06-07 12:42:06.500122] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:39:42.924 [2024-06-07 12:42:06.500355] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:39:42.924 [2024-06-07 12:42:06.500474] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:39:42.924 [2024-06-07 12:42:06.500576] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:39:42.924 [2024-06-07 12:42:06.500781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:39:42.924 [2024-06-07 12:42:06.503695] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0001bb9d0 00:39:42.924 spare 00:39:42.924 [2024-06-07 12:42:06.505734] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:39:42.924 12:42:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:44.298 "name": "raid_bdev1", 00:39:44.298 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:44.298 "strip_size_kb": 0, 00:39:44.298 "state": "online", 00:39:44.298 "raid_level": "raid1", 00:39:44.298 "superblock": true, 00:39:44.298 "num_base_bdevs": 2, 00:39:44.298 "num_base_bdevs_discovered": 2, 00:39:44.298 
"num_base_bdevs_operational": 2, 00:39:44.298 "process": { 00:39:44.298 "type": "rebuild", 00:39:44.298 "target": "spare", 00:39:44.298 "progress": { 00:39:44.298 "blocks": 3072, 00:39:44.298 "percent": 38 00:39:44.298 } 00:39:44.298 }, 00:39:44.298 "base_bdevs_list": [ 00:39:44.298 { 00:39:44.298 "name": "spare", 00:39:44.298 "uuid": "4a9ac90b-c746-58b1-be7c-269bb200f79f", 00:39:44.298 "is_configured": true, 00:39:44.298 "data_offset": 256, 00:39:44.298 "data_size": 7936 00:39:44.298 }, 00:39:44.298 { 00:39:44.298 "name": "BaseBdev2", 00:39:44.298 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:44.298 "is_configured": true, 00:39:44.298 "data_offset": 256, 00:39:44.298 "data_size": 7936 00:39:44.298 } 00:39:44.298 ] 00:39:44.298 }' 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:39:44.298 12:42:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:39:44.556 [2024-06-07 12:42:08.180635] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:44.814 [2024-06-07 12:42:08.217853] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:39:44.814 [2024-06-07 12:42:08.218213] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:44.814 [2024-06-07 12:42:08.218345] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:39:44.814 [2024-06-07 12:42:08.218414] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:44.814 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:45.072 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:45.072 "name": "raid_bdev1", 00:39:45.072 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:45.072 "strip_size_kb": 0, 00:39:45.072 "state": "online", 00:39:45.072 "raid_level": "raid1", 00:39:45.072 "superblock": true, 00:39:45.072 "num_base_bdevs": 2, 00:39:45.072 "num_base_bdevs_discovered": 1, 00:39:45.072 "num_base_bdevs_operational": 1, 00:39:45.072 "base_bdevs_list": [ 00:39:45.072 { 00:39:45.072 "name": null, 00:39:45.072 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:45.072 "is_configured": false, 00:39:45.072 "data_offset": 256, 00:39:45.072 "data_size": 7936 00:39:45.072 }, 00:39:45.072 { 00:39:45.072 "name": "BaseBdev2", 00:39:45.072 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:45.072 "is_configured": true, 00:39:45.072 "data_offset": 256, 00:39:45.072 "data_size": 7936 00:39:45.072 } 00:39:45.072 ] 00:39:45.072 }' 00:39:45.072 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:45.072 12:42:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:45.638 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:45.638 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:45.638 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:45.638 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:45.638 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:45.638 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:45.638 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:45.985 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:45.985 "name": "raid_bdev1", 00:39:45.985 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:45.985 "strip_size_kb": 0, 00:39:45.985 "state": "online", 00:39:45.985 "raid_level": "raid1", 00:39:45.985 "superblock": true, 00:39:45.985 "num_base_bdevs": 2, 00:39:45.985 "num_base_bdevs_discovered": 1, 00:39:45.985 "num_base_bdevs_operational": 1, 00:39:45.985 "base_bdevs_list": [ 00:39:45.985 { 00:39:45.985 "name": null, 00:39:45.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:45.985 "is_configured": false, 00:39:45.985 "data_offset": 256, 00:39:45.985 "data_size": 7936 00:39:45.985 }, 00:39:45.985 { 00:39:45.985 "name": "BaseBdev2", 00:39:45.985 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:45.985 "is_configured": true, 00:39:45.985 "data_offset": 256, 00:39:45.985 "data_size": 7936 00:39:45.985 } 00:39:45.985 ] 00:39:45.985 }' 00:39:45.985 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:45.985 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:45.985 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # 
jq -r '.process.target // "none"' 00:39:45.986 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:45.986 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:39:46.289 12:42:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:39:46.547 [2024-06-07 12:42:10.116565] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:39:46.547 [2024-06-07 12:42:10.116719] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:46.547 [2024-06-07 12:42:10.116785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009f80 00:39:46.548 [2024-06-07 12:42:10.116812] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:46.548 [2024-06-07 12:42:10.117028] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:46.548 [2024-06-07 12:42:10.117062] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:39:46.548 [2024-06-07 12:42:10.117138] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:39:46.548 [2024-06-07 12:42:10.117152] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:39:46.548 [2024-06-07 12:42:10.117161] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:39:46.548 BaseBdev1 00:39:46.548 12:42:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:39:47.921 "name": "raid_bdev1", 00:39:47.921 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:47.921 "strip_size_kb": 0, 00:39:47.921 "state": "online", 00:39:47.921 "raid_level": "raid1", 00:39:47.921 "superblock": true, 00:39:47.921 "num_base_bdevs": 2, 00:39:47.921 "num_base_bdevs_discovered": 1, 00:39:47.921 "num_base_bdevs_operational": 1, 00:39:47.921 "base_bdevs_list": [ 00:39:47.921 { 00:39:47.921 "name": null, 00:39:47.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:47.921 "is_configured": false, 00:39:47.921 "data_offset": 256, 00:39:47.921 "data_size": 7936 00:39:47.921 }, 00:39:47.921 { 00:39:47.921 "name": "BaseBdev2", 00:39:47.921 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:47.921 "is_configured": true, 00:39:47.921 "data_offset": 256, 00:39:47.921 "data_size": 7936 00:39:47.921 } 00:39:47.921 ] 00:39:47.921 }' 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:47.921 12:42:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:48.854 "name": "raid_bdev1", 00:39:48.854 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:48.854 "strip_size_kb": 0, 00:39:48.854 "state": "online", 00:39:48.854 "raid_level": "raid1", 00:39:48.854 "superblock": true, 00:39:48.854 "num_base_bdevs": 2, 00:39:48.854 "num_base_bdevs_discovered": 1, 00:39:48.854 "num_base_bdevs_operational": 1, 00:39:48.854 "base_bdevs_list": [ 00:39:48.854 { 00:39:48.854 "name": null, 00:39:48.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:48.854 "is_configured": false, 00:39:48.854 "data_offset": 256, 00:39:48.854 "data_size": 7936 00:39:48.854 }, 00:39:48.854 { 00:39:48.854 "name": "BaseBdev2", 00:39:48.854 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:48.854 "is_configured": true, 00:39:48.854 "data_offset": 256, 00:39:48.854 "data_size": 7936 00:39:48.854 } 00:39:48.854 ] 00:39:48.854 }' 00:39:48.854 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:49.112 12:42:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:39:49.112 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:39:49.373 [2024-06-07 12:42:12.760988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:49.373 [2024-06-07 12:42:12.761180] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:39:49.373 [2024-06-07 12:42:12.761194] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:39:49.373 request: 00:39:49.373 { 00:39:49.373 "raid_bdev": "raid_bdev1", 00:39:49.373 "base_bdev": "BaseBdev1", 00:39:49.373 "method": "bdev_raid_add_base_bdev", 00:39:49.373 "req_id": 1 00:39:49.373 } 00:39:49.373 Got JSON-RPC error response 00:39:49.373 response: 00:39:49.373 { 00:39:49.373 "code": -22, 00:39:49.373 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:39:49.373 } 00:39:49.373 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # es=1 00:39:49.373 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:39:49.373 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:39:49.373 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:39:49.373 12:42:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:39:50.335 12:42:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:50.335 12:42:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:50.594 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:50.594 "name": "raid_bdev1", 00:39:50.594 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:50.594 "strip_size_kb": 0, 00:39:50.594 "state": "online", 00:39:50.594 "raid_level": "raid1", 00:39:50.594 "superblock": true, 00:39:50.594 "num_base_bdevs": 2, 00:39:50.594 "num_base_bdevs_discovered": 1, 00:39:50.594 "num_base_bdevs_operational": 1, 00:39:50.594 "base_bdevs_list": [ 00:39:50.594 { 00:39:50.594 "name": null, 00:39:50.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:50.594 "is_configured": false, 00:39:50.594 "data_offset": 256, 00:39:50.594 "data_size": 7936 00:39:50.594 }, 00:39:50.594 { 00:39:50.594 "name": "BaseBdev2", 00:39:50.594 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:50.594 "is_configured": true, 00:39:50.594 "data_offset": 256, 00:39:50.594 "data_size": 7936 00:39:50.594 } 00:39:50.594 ] 00:39:50.594 }' 00:39:50.594 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:50.594 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:51.170 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:39:51.170 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:39:51.170 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:39:51.170 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:39:51.170 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:39:51.170 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:39:51.170 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:51.442 12:42:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:39:51.442 "name": "raid_bdev1", 00:39:51.442 "uuid": "989f7586-8078-4c72-8d3c-404f353e5541", 00:39:51.442 "strip_size_kb": 0, 00:39:51.442 "state": "online", 00:39:51.442 "raid_level": "raid1", 00:39:51.442 "superblock": true, 00:39:51.442 "num_base_bdevs": 2, 00:39:51.442 "num_base_bdevs_discovered": 1, 00:39:51.442 "num_base_bdevs_operational": 1, 00:39:51.442 "base_bdevs_list": [ 00:39:51.442 { 00:39:51.442 "name": null, 00:39:51.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:51.442 "is_configured": false, 00:39:51.442 "data_offset": 256, 00:39:51.442 "data_size": 7936 00:39:51.442 }, 00:39:51.442 { 00:39:51.442 "name": "BaseBdev2", 00:39:51.442 "uuid": "17ced8ad-0b0c-52c9-b98a-87f793ac648b", 00:39:51.442 "is_configured": true, 00:39:51.442 "data_offset": 256, 00:39:51.442 "data_size": 7936 00:39:51.442 } 00:39:51.442 ] 00:39:51.442 }' 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 229460 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 229460 ']' 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 229460 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:39:51.442 12:42:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 229460 00:39:51.442 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:39:51.442 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:39:51.442 killing process with pid 229460 00:39:51.442 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 229460' 00:39:51.442 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 229460 00:39:51.442 Received shutdown signal, test time was about 60.000000 seconds 00:39:51.442 00:39:51.442 Latency(us) 00:39:51.442 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:51.442 =================================================================================================================== 00:39:51.442 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:39:51.442 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 229460 00:39:51.442 [2024-06-07 12:42:15.011138] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:39:51.442 [2024-06-07 12:42:15.011293] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:39:51.442 [2024-06-07 12:42:15.011345] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:39:51.442 [2024-06-07 12:42:15.011355] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009380 name raid_bdev1, state offline 00:39:51.442 [2024-06-07 12:42:15.079061] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:39:52.030 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:39:52.030 00:39:52.030 real 0m32.100s 00:39:52.030 user 0m51.331s 00:39:52.030 sys 0m5.169s 00:39:52.030 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:39:52.030 ************************************ 00:39:52.030 END TEST raid_rebuild_test_sb_md_separate 00:39:52.031 12:42:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:39:52.031 ************************************ 00:39:52.031 12:42:15 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:39:52.031 12:42:15 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:39:52.031 12:42:15 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:39:52.031 12:42:15 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:39:52.031 12:42:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:39:52.031 ************************************ 00:39:52.031 START TEST raid_state_function_test_sb_md_interleaved 00:39:52.031 ************************************ 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # echo BaseBdev1 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # echo BaseBdev2 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:39:52.031 12:42:15 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=230358 00:39:52.031 Process raid pid: 230358 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 230358' 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 230358 /var/tmp/spdk-raid.sock 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 230358 ']' 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:39:52.031 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:39:52.031 12:42:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:39:52.031 [2024-06-07 12:42:15.607899] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
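The flow being traced here is the standard SPDK test pattern: launch a bdev_svc application on a private RPC socket, wait for it to listen, then drive it with rpc.py. Below is a minimal standalone sketch of that pattern using only paths and RPCs that appear in this log; the readiness poll via rpc_get_methods is a simplified stand-in for the harness's waitforlisten helper, and error handling is omitted.

SPDK="/home/vagrant/spdk_repo/spdk"
SOCK="/var/tmp/spdk-raid.sock"

# Start the minimal bdev application on a dedicated RPC socket with
# bdev_raid debug logging enabled, as the harness does above.
"$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
svc_pid=$!

# Poll until the UNIX-domain RPC socket answers. rpc_get_methods is a
# standard SPDK RPC, so success means the app is ready for commands.
while ! "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
done

# Two 32 MiB malloc base bdevs with 4096-byte blocks (8192 blocks each)
# and 32-byte interleaved metadata (-m 32 -i), matching this test's calls.
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2

# raid1 with an on-disk superblock (-s), as in the traced bdev_raid_create.
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_create -s -r raid1 \
    -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# Read back the array state the same way the verify_* helpers do.
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid").state'

kill "$svc_pid"

Unlike the traced test, which creates the raid first and leaves it in the configuring state until its base bdevs appear, the sketch creates the base bdevs first, so the array should come up online directly.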
00:39:52.031 [2024-06-07 12:42:15.608122] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:39:52.318 [2024-06-07 12:42:15.749798] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:52.318 [2024-06-07 12:42:15.844314] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:39:52.318 [2024-06-07 12:42:15.927455] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:39:53.273 [2024-06-07 12:42:16.802952] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:39:53.273 [2024-06-07 12:42:16.803068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:39:53.273 [2024-06-07 12:42:16.803083] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:39:53.273 [2024-06-07 12:42:16.803115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:53.273 12:42:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:53.530 12:42:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:53.530 "name": "Existed_Raid", 00:39:53.530 "uuid": "ade9f750-62cd-49d4-9bc3-304f01f8a6e5", 00:39:53.530 "strip_size_kb": 0, 
00:39:53.530 "state": "configuring", 00:39:53.530 "raid_level": "raid1", 00:39:53.530 "superblock": true, 00:39:53.530 "num_base_bdevs": 2, 00:39:53.530 "num_base_bdevs_discovered": 0, 00:39:53.530 "num_base_bdevs_operational": 2, 00:39:53.530 "base_bdevs_list": [ 00:39:53.530 { 00:39:53.530 "name": "BaseBdev1", 00:39:53.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:53.530 "is_configured": false, 00:39:53.530 "data_offset": 0, 00:39:53.530 "data_size": 0 00:39:53.530 }, 00:39:53.530 { 00:39:53.530 "name": "BaseBdev2", 00:39:53.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:53.530 "is_configured": false, 00:39:53.530 "data_offset": 0, 00:39:53.530 "data_size": 0 00:39:53.530 } 00:39:53.530 ] 00:39:53.530 }' 00:39:53.530 12:42:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:53.530 12:42:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:39:54.094 12:42:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:39:54.351 [2024-06-07 12:42:17.894963] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:39:54.351 [2024-06-07 12:42:17.895033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005480 name Existed_Raid, state configuring 00:39:54.352 12:42:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:39:54.609 [2024-06-07 12:42:18.127078] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:39:54.609 [2024-06-07 12:42:18.127239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:39:54.609 [2024-06-07 12:42:18.127256] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:39:54.609 [2024-06-07 12:42:18.127287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:39:54.609 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:39:54.867 [2024-06-07 12:42:18.387663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:54.867 BaseBdev1 00:39:54.867 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:39:54.867 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:39:54.867 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:39:54.867 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:39:54.867 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:39:54.867 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:39:54.867 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:39:55.126 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:39:55.464 [ 00:39:55.464 { 00:39:55.464 "name": "BaseBdev1", 00:39:55.464 "aliases": [ 00:39:55.464 "c737d37d-9922-4391-b2c9-a80af4a2d522" 00:39:55.464 ], 00:39:55.464 "product_name": "Malloc disk", 00:39:55.464 "block_size": 4128, 00:39:55.464 "num_blocks": 8192, 00:39:55.464 "uuid": "c737d37d-9922-4391-b2c9-a80af4a2d522", 00:39:55.464 "md_size": 32, 00:39:55.464 "md_interleave": true, 00:39:55.464 "dif_type": 0, 00:39:55.464 "assigned_rate_limits": { 00:39:55.464 "rw_ios_per_sec": 0, 00:39:55.464 "rw_mbytes_per_sec": 0, 00:39:55.464 "r_mbytes_per_sec": 0, 00:39:55.464 "w_mbytes_per_sec": 0 00:39:55.464 }, 00:39:55.464 "claimed": true, 00:39:55.464 "claim_type": "exclusive_write", 00:39:55.464 "zoned": false, 00:39:55.464 "supported_io_types": { 00:39:55.464 "read": true, 00:39:55.464 "write": true, 00:39:55.464 "unmap": true, 00:39:55.464 "write_zeroes": true, 00:39:55.464 "flush": true, 00:39:55.464 "reset": true, 00:39:55.464 "compare": false, 00:39:55.464 "compare_and_write": false, 00:39:55.464 "abort": true, 00:39:55.464 "nvme_admin": false, 00:39:55.464 "nvme_io": false 00:39:55.464 }, 00:39:55.464 "memory_domains": [ 00:39:55.464 { 00:39:55.464 "dma_device_id": "system", 00:39:55.464 "dma_device_type": 1 00:39:55.464 }, 00:39:55.464 { 00:39:55.464 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:55.464 "dma_device_type": 2 00:39:55.464 } 00:39:55.464 ], 00:39:55.464 "driver_specific": {} 00:39:55.464 } 00:39:55.464 ] 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:55.464 12:42:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:55.722 12:42:19 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:55.722 "name": "Existed_Raid", 00:39:55.722 "uuid": "3f859dfa-39bb-4224-a83f-d839877530ec", 00:39:55.722 "strip_size_kb": 0, 00:39:55.722 "state": "configuring", 00:39:55.722 "raid_level": "raid1", 00:39:55.722 "superblock": true, 00:39:55.722 "num_base_bdevs": 2, 00:39:55.722 "num_base_bdevs_discovered": 1, 00:39:55.722 "num_base_bdevs_operational": 2, 00:39:55.722 "base_bdevs_list": [ 00:39:55.722 { 00:39:55.722 "name": "BaseBdev1", 00:39:55.722 "uuid": "c737d37d-9922-4391-b2c9-a80af4a2d522", 00:39:55.722 "is_configured": true, 00:39:55.722 "data_offset": 256, 00:39:55.722 "data_size": 7936 00:39:55.722 }, 00:39:55.722 { 00:39:55.722 "name": "BaseBdev2", 00:39:55.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:55.722 "is_configured": false, 00:39:55.722 "data_offset": 0, 00:39:55.722 "data_size": 0 00:39:55.722 } 00:39:55.722 ] 00:39:55.722 }' 00:39:55.722 12:42:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:55.722 12:42:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:39:56.288 12:42:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:39:56.545 [2024-06-07 12:42:20.011994] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:39:56.545 [2024-06-07 12:42:20.012089] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000005780 name Existed_Raid, state configuring 00:39:56.546 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:39:56.804 [2024-06-07 12:42:20.248122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:39:56.804 [2024-06-07 12:42:20.250263] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:39:56.804 [2024-06-07 12:42:20.250354] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:39:56.804 12:42:20 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:56.804 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:57.061 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:57.061 "name": "Existed_Raid", 00:39:57.061 "uuid": "3b335a4c-05fc-402f-bbe6-83a555f12e8e", 00:39:57.061 "strip_size_kb": 0, 00:39:57.061 "state": "configuring", 00:39:57.061 "raid_level": "raid1", 00:39:57.061 "superblock": true, 00:39:57.061 "num_base_bdevs": 2, 00:39:57.061 "num_base_bdevs_discovered": 1, 00:39:57.061 "num_base_bdevs_operational": 2, 00:39:57.061 "base_bdevs_list": [ 00:39:57.061 { 00:39:57.061 "name": "BaseBdev1", 00:39:57.061 "uuid": "c737d37d-9922-4391-b2c9-a80af4a2d522", 00:39:57.061 "is_configured": true, 00:39:57.061 "data_offset": 256, 00:39:57.061 "data_size": 7936 00:39:57.061 }, 00:39:57.061 { 00:39:57.061 "name": "BaseBdev2", 00:39:57.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:39:57.061 "is_configured": false, 00:39:57.061 "data_offset": 0, 00:39:57.061 "data_size": 0 00:39:57.061 } 00:39:57.061 ] 00:39:57.061 }' 00:39:57.061 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:57.061 12:42:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:39:57.629 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:39:57.887 [2024-06-07 12:42:21.438892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:39:57.887 [2024-06-07 12:42:21.439135] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006080 00:39:57.887 [2024-06-07 12:42:21.439155] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:39:57.887 [2024-06-07 12:42:21.439308] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002120 00:39:57.887 [2024-06-07 12:42:21.439417] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006080 00:39:57.887 [2024-06-07 12:42:21.439433] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x616000006080 00:39:57.887 [2024-06-07 12:42:21.439497] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:39:57.887 BaseBdev2 00:39:57.887 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:39:57.887 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:39:57.887 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:39:57.887 12:42:21 
bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:39:57.887 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:39:57.887 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:39:57.887 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:39:58.145 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:39:58.403 [ 00:39:58.403 { 00:39:58.403 "name": "BaseBdev2", 00:39:58.403 "aliases": [ 00:39:58.403 "ebb1269f-b521-4ce2-9aa0-2f203125665b" 00:39:58.403 ], 00:39:58.403 "product_name": "Malloc disk", 00:39:58.403 "block_size": 4128, 00:39:58.403 "num_blocks": 8192, 00:39:58.403 "uuid": "ebb1269f-b521-4ce2-9aa0-2f203125665b", 00:39:58.403 "md_size": 32, 00:39:58.403 "md_interleave": true, 00:39:58.403 "dif_type": 0, 00:39:58.403 "assigned_rate_limits": { 00:39:58.403 "rw_ios_per_sec": 0, 00:39:58.403 "rw_mbytes_per_sec": 0, 00:39:58.403 "r_mbytes_per_sec": 0, 00:39:58.403 "w_mbytes_per_sec": 0 00:39:58.403 }, 00:39:58.403 "claimed": true, 00:39:58.403 "claim_type": "exclusive_write", 00:39:58.403 "zoned": false, 00:39:58.403 "supported_io_types": { 00:39:58.403 "read": true, 00:39:58.403 "write": true, 00:39:58.403 "unmap": true, 00:39:58.403 "write_zeroes": true, 00:39:58.403 "flush": true, 00:39:58.403 "reset": true, 00:39:58.403 "compare": false, 00:39:58.403 "compare_and_write": false, 00:39:58.403 "abort": true, 00:39:58.403 "nvme_admin": false, 00:39:58.403 "nvme_io": false 00:39:58.403 }, 00:39:58.403 "memory_domains": [ 00:39:58.403 { 00:39:58.403 "dma_device_id": "system", 00:39:58.403 "dma_device_type": 1 00:39:58.403 }, 00:39:58.403 { 00:39:58.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:58.403 "dma_device_type": 2 00:39:58.403 } 00:39:58.403 ], 00:39:58.403 "driver_specific": {} 00:39:58.403 } 00:39:58.403 ] 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
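Every verify_raid_bdev_state call in this log expands to the same mechanics: fetch the full bdev_raid_get_bdevs dump, select one entry by name with jq, and compare individual fields against the expected values passed as arguments. Below is a hedged approximation of that check as a self-contained function; check_raid_state is a hypothetical name, and the two fields compared (.state and .raid_level) are inferred from the locals and JSON visible in the trace, not copied from bdev_raid.sh.

SPDK="/home/vagrant/spdk_repo/spdk"
SOCK="/var/tmp/spdk-raid.sock"

# Hypothetical stand-in for the harness's verify_raid_bdev_state helper.
check_raid_state() {
    local name="$1" want_state="$2" want_level="$3" info
    # Same filter the trace shows: pick one raid bdev out of the full dump.
    info=$("$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_get_bdevs all |
           jq -r ".[] | select(.name == \"$name\")") || return 1
    [[ "$(jq -r '.state' <<<"$info")" == "$want_state" ]] || return 1
    [[ "$(jq -r '.raid_level' <<<"$info")" == "$want_level" ]]
}

# Mirrors the traced call: verify_raid_bdev_state Existed_Raid online raid1 0 2
check_raid_state Existed_Raid online raid1 && echo "Existed_Raid is online"

The real helper also tracks num_base_bdevs_discovered and num_base_bdevs_operational, as its local declarations in this trace show.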
00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:39:58.403 12:42:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:39:58.662 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:39:58.662 "name": "Existed_Raid", 00:39:58.662 "uuid": "3b335a4c-05fc-402f-bbe6-83a555f12e8e", 00:39:58.662 "strip_size_kb": 0, 00:39:58.662 "state": "online", 00:39:58.662 "raid_level": "raid1", 00:39:58.662 "superblock": true, 00:39:58.662 "num_base_bdevs": 2, 00:39:58.662 "num_base_bdevs_discovered": 2, 00:39:58.662 "num_base_bdevs_operational": 2, 00:39:58.662 "base_bdevs_list": [ 00:39:58.662 { 00:39:58.662 "name": "BaseBdev1", 00:39:58.662 "uuid": "c737d37d-9922-4391-b2c9-a80af4a2d522", 00:39:58.662 "is_configured": true, 00:39:58.662 "data_offset": 256, 00:39:58.662 "data_size": 7936 00:39:58.662 }, 00:39:58.662 { 00:39:58.662 "name": "BaseBdev2", 00:39:58.662 "uuid": "ebb1269f-b521-4ce2-9aa0-2f203125665b", 00:39:58.662 "is_configured": true, 00:39:58.662 "data_offset": 256, 00:39:58.662 "data_size": 7936 00:39:58.662 } 00:39:58.662 ] 00:39:58.662 }' 00:39:58.662 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:39:58.662 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:39:59.228 12:42:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:39:59.794 [2024-06-07 12:42:23.155559] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:39:59.794 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:39:59.794 "name": "Existed_Raid", 00:39:59.794 "aliases": [ 00:39:59.794 "3b335a4c-05fc-402f-bbe6-83a555f12e8e" 00:39:59.794 ], 00:39:59.794 "product_name": "Raid Volume", 00:39:59.794 "block_size": 4128, 00:39:59.794 "num_blocks": 7936, 
00:39:59.794 "uuid": "3b335a4c-05fc-402f-bbe6-83a555f12e8e", 00:39:59.794 "md_size": 32, 00:39:59.794 "md_interleave": true, 00:39:59.794 "dif_type": 0, 00:39:59.794 "assigned_rate_limits": { 00:39:59.794 "rw_ios_per_sec": 0, 00:39:59.794 "rw_mbytes_per_sec": 0, 00:39:59.794 "r_mbytes_per_sec": 0, 00:39:59.794 "w_mbytes_per_sec": 0 00:39:59.794 }, 00:39:59.794 "claimed": false, 00:39:59.794 "zoned": false, 00:39:59.794 "supported_io_types": { 00:39:59.794 "read": true, 00:39:59.794 "write": true, 00:39:59.794 "unmap": false, 00:39:59.794 "write_zeroes": true, 00:39:59.794 "flush": false, 00:39:59.794 "reset": true, 00:39:59.794 "compare": false, 00:39:59.794 "compare_and_write": false, 00:39:59.794 "abort": false, 00:39:59.794 "nvme_admin": false, 00:39:59.794 "nvme_io": false 00:39:59.794 }, 00:39:59.794 "memory_domains": [ 00:39:59.794 { 00:39:59.794 "dma_device_id": "system", 00:39:59.794 "dma_device_type": 1 00:39:59.794 }, 00:39:59.794 { 00:39:59.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:59.794 "dma_device_type": 2 00:39:59.794 }, 00:39:59.794 { 00:39:59.794 "dma_device_id": "system", 00:39:59.794 "dma_device_type": 1 00:39:59.794 }, 00:39:59.794 { 00:39:59.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:39:59.794 "dma_device_type": 2 00:39:59.794 } 00:39:59.794 ], 00:39:59.794 "driver_specific": { 00:39:59.794 "raid": { 00:39:59.794 "uuid": "3b335a4c-05fc-402f-bbe6-83a555f12e8e", 00:39:59.794 "strip_size_kb": 0, 00:39:59.794 "state": "online", 00:39:59.794 "raid_level": "raid1", 00:39:59.794 "superblock": true, 00:39:59.794 "num_base_bdevs": 2, 00:39:59.794 "num_base_bdevs_discovered": 2, 00:39:59.794 "num_base_bdevs_operational": 2, 00:39:59.794 "base_bdevs_list": [ 00:39:59.794 { 00:39:59.794 "name": "BaseBdev1", 00:39:59.794 "uuid": "c737d37d-9922-4391-b2c9-a80af4a2d522", 00:39:59.794 "is_configured": true, 00:39:59.794 "data_offset": 256, 00:39:59.794 "data_size": 7936 00:39:59.794 }, 00:39:59.794 { 00:39:59.794 "name": "BaseBdev2", 00:39:59.794 "uuid": "ebb1269f-b521-4ce2-9aa0-2f203125665b", 00:39:59.794 "is_configured": true, 00:39:59.794 "data_offset": 256, 00:39:59.794 "data_size": 7936 00:39:59.794 } 00:39:59.794 ] 00:39:59.794 } 00:39:59.794 } 00:39:59.794 }' 00:39:59.794 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:39:59.794 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:39:59.794 BaseBdev2' 00:39:59.794 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:39:59.794 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:39:59.794 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:00.051 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:00.051 "name": "BaseBdev1", 00:40:00.051 "aliases": [ 00:40:00.051 "c737d37d-9922-4391-b2c9-a80af4a2d522" 00:40:00.051 ], 00:40:00.051 "product_name": "Malloc disk", 00:40:00.051 "block_size": 4128, 00:40:00.051 "num_blocks": 8192, 00:40:00.051 "uuid": "c737d37d-9922-4391-b2c9-a80af4a2d522", 00:40:00.051 "md_size": 32, 00:40:00.051 "md_interleave": true, 00:40:00.051 "dif_type": 0, 00:40:00.051 
"assigned_rate_limits": { 00:40:00.051 "rw_ios_per_sec": 0, 00:40:00.051 "rw_mbytes_per_sec": 0, 00:40:00.051 "r_mbytes_per_sec": 0, 00:40:00.051 "w_mbytes_per_sec": 0 00:40:00.051 }, 00:40:00.051 "claimed": true, 00:40:00.051 "claim_type": "exclusive_write", 00:40:00.051 "zoned": false, 00:40:00.051 "supported_io_types": { 00:40:00.051 "read": true, 00:40:00.052 "write": true, 00:40:00.052 "unmap": true, 00:40:00.052 "write_zeroes": true, 00:40:00.052 "flush": true, 00:40:00.052 "reset": true, 00:40:00.052 "compare": false, 00:40:00.052 "compare_and_write": false, 00:40:00.052 "abort": true, 00:40:00.052 "nvme_admin": false, 00:40:00.052 "nvme_io": false 00:40:00.052 }, 00:40:00.052 "memory_domains": [ 00:40:00.052 { 00:40:00.052 "dma_device_id": "system", 00:40:00.052 "dma_device_type": 1 00:40:00.052 }, 00:40:00.052 { 00:40:00.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:00.052 "dma_device_type": 2 00:40:00.052 } 00:40:00.052 ], 00:40:00.052 "driver_specific": {} 00:40:00.052 }' 00:40:00.052 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:00.052 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:00.052 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:40:00.052 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:00.052 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:00.052 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:00.052 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:40:00.309 12:42:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:00.567 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:00.567 "name": "BaseBdev2", 00:40:00.567 "aliases": [ 00:40:00.567 "ebb1269f-b521-4ce2-9aa0-2f203125665b" 00:40:00.567 ], 00:40:00.567 "product_name": "Malloc disk", 00:40:00.567 "block_size": 4128, 00:40:00.567 "num_blocks": 8192, 00:40:00.567 "uuid": "ebb1269f-b521-4ce2-9aa0-2f203125665b", 00:40:00.567 "md_size": 32, 00:40:00.567 "md_interleave": true, 00:40:00.567 "dif_type": 0, 00:40:00.567 "assigned_rate_limits": { 00:40:00.567 "rw_ios_per_sec": 0, 00:40:00.567 "rw_mbytes_per_sec": 0, 00:40:00.567 "r_mbytes_per_sec": 
0, 00:40:00.567 "w_mbytes_per_sec": 0 00:40:00.567 }, 00:40:00.567 "claimed": true, 00:40:00.567 "claim_type": "exclusive_write", 00:40:00.567 "zoned": false, 00:40:00.567 "supported_io_types": { 00:40:00.567 "read": true, 00:40:00.567 "write": true, 00:40:00.567 "unmap": true, 00:40:00.567 "write_zeroes": true, 00:40:00.567 "flush": true, 00:40:00.567 "reset": true, 00:40:00.567 "compare": false, 00:40:00.567 "compare_and_write": false, 00:40:00.567 "abort": true, 00:40:00.567 "nvme_admin": false, 00:40:00.567 "nvme_io": false 00:40:00.567 }, 00:40:00.567 "memory_domains": [ 00:40:00.567 { 00:40:00.567 "dma_device_id": "system", 00:40:00.567 "dma_device_type": 1 00:40:00.567 }, 00:40:00.567 { 00:40:00.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:00.567 "dma_device_type": 2 00:40:00.567 } 00:40:00.567 ], 00:40:00.567 "driver_specific": {} 00:40:00.567 }' 00:40:00.567 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:00.567 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:00.567 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:40:00.567 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:00.825 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:40:01.083 [2024-06-07 12:42:24.651589] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:40:01.083 
12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:01.083 12:42:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:40:01.649 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:01.649 "name": "Existed_Raid", 00:40:01.649 "uuid": "3b335a4c-05fc-402f-bbe6-83a555f12e8e", 00:40:01.649 "strip_size_kb": 0, 00:40:01.649 "state": "online", 00:40:01.649 "raid_level": "raid1", 00:40:01.649 "superblock": true, 00:40:01.649 "num_base_bdevs": 2, 00:40:01.649 "num_base_bdevs_discovered": 1, 00:40:01.649 "num_base_bdevs_operational": 1, 00:40:01.649 "base_bdevs_list": [ 00:40:01.649 { 00:40:01.649 "name": null, 00:40:01.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:01.649 "is_configured": false, 00:40:01.649 "data_offset": 256, 00:40:01.649 "data_size": 7936 00:40:01.649 }, 00:40:01.649 { 00:40:01.649 "name": "BaseBdev2", 00:40:01.649 "uuid": "ebb1269f-b521-4ce2-9aa0-2f203125665b", 00:40:01.649 "is_configured": true, 00:40:01.649 "data_offset": 256, 00:40:01.649 "data_size": 7936 00:40:01.649 } 00:40:01.649 ] 00:40:01.649 }' 00:40:01.649 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:01.649 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:02.214 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:40:02.214 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:40:02.214 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:02.214 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:40:02.471 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:40:02.471 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:40:02.471 12:42:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:40:02.729 [2024-06-07 12:42:26.163574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:40:02.729 [2024-06-07 12:42:26.163748] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:02.729 [2024-06-07 12:42:26.187302] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:02.729 [2024-06-07 12:42:26.187362] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:02.729 [2024-06-07 12:42:26.187376] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006080 name Existed_Raid, state offline 00:40:02.729 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:40:02.729 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:40:02.729 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:02.729 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 230358 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 230358 ']' 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 230358 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 230358 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 230358' 00:40:02.987 killing process with pid 230358 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 230358 00:40:02.987 [2024-06-07 12:42:26.481364] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:02.987 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 230358 00:40:02.987 [2024-06-07 12:42:26.481459] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:03.245 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:40:03.245 
00:40:03.245 real 0m11.276s 00:40:03.245 user 0m19.900s 00:40:03.245 sys 0m1.934s 00:40:03.245 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:03.245 12:42:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:03.245 ************************************ 00:40:03.245 END TEST raid_state_function_test_sb_md_interleaved 00:40:03.245 ************************************ 00:40:03.504 12:42:26 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:40:03.504 12:42:26 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:40:03.504 12:42:26 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:03.504 12:42:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:03.504 ************************************ 00:40:03.504 START TEST raid_superblock_test_md_interleaved 00:40:03.504 ************************************ 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:40:03.504 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=230727 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 230727 /var/tmp/spdk-raid.sock 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 230727 ']' 
00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:40:03.505 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:40:03.505 12:42:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:03.505 [2024-06-07 12:42:26.955211] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:40:03.505 [2024-06-07 12:42:26.955485] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid230727 ] 00:40:03.505 [2024-06-07 12:42:27.094465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:03.763 [2024-06-07 12:42:27.195857] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:03.763 [2024-06-07 12:42:27.286576] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:40:03.763 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:40:04.021 malloc1 00:40:04.021 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:04.279 [2024-06-07 12:42:27.903523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:04.279 [2024-06-07 12:42:27.903710] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:04.279 [2024-06-07 
12:42:27.903768] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:40:04.279 [2024-06-07 12:42:27.903848] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:04.279 [2024-06-07 12:42:27.906200] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:04.279 [2024-06-07 12:42:27.906285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:04.279 pt1 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:40:04.538 12:42:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:40:04.796 malloc2 00:40:04.796 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:05.054 [2024-06-07 12:42:28.471865] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:05.054 [2024-06-07 12:42:28.472002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:05.054 [2024-06-07 12:42:28.472056] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:40:05.054 [2024-06-07 12:42:28.472106] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:05.054 [2024-06-07 12:42:28.474316] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:05.054 [2024-06-07 12:42:28.474390] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:05.054 pt2 00:40:05.054 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:40:05.054 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:40:05.054 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:40:05.312 [2024-06-07 12:42:28.727978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:05.312 [2024-06-07 12:42:28.730291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:05.312 [2024-06-07 12:42:28.730523] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000006c80 
00:40:05.312 [2024-06-07 12:42:28.730550] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:40:05.312 [2024-06-07 12:42:28.730743] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000022c0 00:40:05.312 [2024-06-07 12:42:28.730841] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000006c80 00:40:05.312 [2024-06-07 12:42:28.730852] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000006c80 00:40:05.312 [2024-06-07 12:42:28.730915] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:05.312 12:42:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:05.570 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:05.570 "name": "raid_bdev1", 00:40:05.570 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:05.570 "strip_size_kb": 0, 00:40:05.570 "state": "online", 00:40:05.570 "raid_level": "raid1", 00:40:05.570 "superblock": true, 00:40:05.570 "num_base_bdevs": 2, 00:40:05.570 "num_base_bdevs_discovered": 2, 00:40:05.570 "num_base_bdevs_operational": 2, 00:40:05.570 "base_bdevs_list": [ 00:40:05.570 { 00:40:05.570 "name": "pt1", 00:40:05.570 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:05.570 "is_configured": true, 00:40:05.570 "data_offset": 256, 00:40:05.570 "data_size": 7936 00:40:05.570 }, 00:40:05.570 { 00:40:05.570 "name": "pt2", 00:40:05.570 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:05.570 "is_configured": true, 00:40:05.570 "data_offset": 256, 00:40:05.570 "data_size": 7936 00:40:05.570 } 00:40:05.570 ] 00:40:05.570 }' 00:40:05.570 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:05.570 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:06.520 12:42:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:40:06.520 [2024-06-07 12:42:30.064394] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:06.520 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:40:06.520 "name": "raid_bdev1", 00:40:06.520 "aliases": [ 00:40:06.520 "76f68417-6b73-4784-bc52-cbef6098215f" 00:40:06.520 ], 00:40:06.520 "product_name": "Raid Volume", 00:40:06.520 "block_size": 4128, 00:40:06.520 "num_blocks": 7936, 00:40:06.520 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:06.520 "md_size": 32, 00:40:06.520 "md_interleave": true, 00:40:06.520 "dif_type": 0, 00:40:06.520 "assigned_rate_limits": { 00:40:06.520 "rw_ios_per_sec": 0, 00:40:06.520 "rw_mbytes_per_sec": 0, 00:40:06.520 "r_mbytes_per_sec": 0, 00:40:06.520 "w_mbytes_per_sec": 0 00:40:06.520 }, 00:40:06.520 "claimed": false, 00:40:06.520 "zoned": false, 00:40:06.520 "supported_io_types": { 00:40:06.520 "read": true, 00:40:06.520 "write": true, 00:40:06.520 "unmap": false, 00:40:06.520 "write_zeroes": true, 00:40:06.520 "flush": false, 00:40:06.520 "reset": true, 00:40:06.520 "compare": false, 00:40:06.520 "compare_and_write": false, 00:40:06.520 "abort": false, 00:40:06.520 "nvme_admin": false, 00:40:06.520 "nvme_io": false 00:40:06.520 }, 00:40:06.520 "memory_domains": [ 00:40:06.520 { 00:40:06.520 "dma_device_id": "system", 00:40:06.520 "dma_device_type": 1 00:40:06.520 }, 00:40:06.520 { 00:40:06.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:06.520 "dma_device_type": 2 00:40:06.520 }, 00:40:06.520 { 00:40:06.520 "dma_device_id": "system", 00:40:06.520 "dma_device_type": 1 00:40:06.521 }, 00:40:06.521 { 00:40:06.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:06.521 "dma_device_type": 2 00:40:06.521 } 00:40:06.521 ], 00:40:06.521 "driver_specific": { 00:40:06.521 "raid": { 00:40:06.521 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:06.521 "strip_size_kb": 0, 00:40:06.521 "state": "online", 00:40:06.521 "raid_level": "raid1", 00:40:06.521 "superblock": true, 00:40:06.521 "num_base_bdevs": 2, 00:40:06.521 "num_base_bdevs_discovered": 2, 00:40:06.521 "num_base_bdevs_operational": 2, 00:40:06.521 "base_bdevs_list": [ 00:40:06.521 { 00:40:06.521 "name": "pt1", 00:40:06.521 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:06.521 "is_configured": true, 00:40:06.521 "data_offset": 256, 00:40:06.521 "data_size": 7936 00:40:06.521 }, 00:40:06.521 { 00:40:06.521 "name": "pt2", 00:40:06.521 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:06.521 "is_configured": true, 00:40:06.521 "data_offset": 256, 00:40:06.521 "data_size": 7936 
00:40:06.521 } 00:40:06.521 ] 00:40:06.521 } 00:40:06.521 } 00:40:06.521 }' 00:40:06.521 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:40:06.521 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:40:06.521 pt2' 00:40:06.521 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:06.521 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:40:06.521 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:06.800 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:06.800 "name": "pt1", 00:40:06.800 "aliases": [ 00:40:06.800 "00000000-0000-0000-0000-000000000001" 00:40:06.800 ], 00:40:06.800 "product_name": "passthru", 00:40:06.800 "block_size": 4128, 00:40:06.800 "num_blocks": 8192, 00:40:06.800 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:06.800 "md_size": 32, 00:40:06.800 "md_interleave": true, 00:40:06.800 "dif_type": 0, 00:40:06.800 "assigned_rate_limits": { 00:40:06.800 "rw_ios_per_sec": 0, 00:40:06.800 "rw_mbytes_per_sec": 0, 00:40:06.800 "r_mbytes_per_sec": 0, 00:40:06.800 "w_mbytes_per_sec": 0 00:40:06.800 }, 00:40:06.800 "claimed": true, 00:40:06.800 "claim_type": "exclusive_write", 00:40:06.800 "zoned": false, 00:40:06.800 "supported_io_types": { 00:40:06.800 "read": true, 00:40:06.800 "write": true, 00:40:06.800 "unmap": true, 00:40:06.800 "write_zeroes": true, 00:40:06.800 "flush": true, 00:40:06.800 "reset": true, 00:40:06.800 "compare": false, 00:40:06.800 "compare_and_write": false, 00:40:06.800 "abort": true, 00:40:06.800 "nvme_admin": false, 00:40:06.800 "nvme_io": false 00:40:06.800 }, 00:40:06.800 "memory_domains": [ 00:40:06.800 { 00:40:06.800 "dma_device_id": "system", 00:40:06.800 "dma_device_type": 1 00:40:06.800 }, 00:40:06.800 { 00:40:06.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:06.800 "dma_device_type": 2 00:40:06.800 } 00:40:06.800 ], 00:40:06.800 "driver_specific": { 00:40:06.800 "passthru": { 00:40:06.800 "name": "pt1", 00:40:06.800 "base_bdev_name": "malloc1" 00:40:06.800 } 00:40:06.800 } 00:40:06.800 }' 00:40:06.800 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:40:07.058 12:42:30 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:07.316 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:07.316 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:07.316 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:07.316 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:40:07.316 12:42:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:07.574 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:07.574 "name": "pt2", 00:40:07.574 "aliases": [ 00:40:07.574 "00000000-0000-0000-0000-000000000002" 00:40:07.574 ], 00:40:07.574 "product_name": "passthru", 00:40:07.574 "block_size": 4128, 00:40:07.574 "num_blocks": 8192, 00:40:07.574 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:07.574 "md_size": 32, 00:40:07.574 "md_interleave": true, 00:40:07.574 "dif_type": 0, 00:40:07.574 "assigned_rate_limits": { 00:40:07.574 "rw_ios_per_sec": 0, 00:40:07.574 "rw_mbytes_per_sec": 0, 00:40:07.574 "r_mbytes_per_sec": 0, 00:40:07.574 "w_mbytes_per_sec": 0 00:40:07.574 }, 00:40:07.574 "claimed": true, 00:40:07.574 "claim_type": "exclusive_write", 00:40:07.574 "zoned": false, 00:40:07.574 "supported_io_types": { 00:40:07.574 "read": true, 00:40:07.574 "write": true, 00:40:07.574 "unmap": true, 00:40:07.574 "write_zeroes": true, 00:40:07.574 "flush": true, 00:40:07.574 "reset": true, 00:40:07.574 "compare": false, 00:40:07.574 "compare_and_write": false, 00:40:07.574 "abort": true, 00:40:07.574 "nvme_admin": false, 00:40:07.574 "nvme_io": false 00:40:07.574 }, 00:40:07.574 "memory_domains": [ 00:40:07.574 { 00:40:07.574 "dma_device_id": "system", 00:40:07.574 "dma_device_type": 1 00:40:07.574 }, 00:40:07.574 { 00:40:07.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:07.574 "dma_device_type": 2 00:40:07.574 } 00:40:07.574 ], 00:40:07.574 "driver_specific": { 00:40:07.574 "passthru": { 00:40:07.574 "name": "pt2", 00:40:07.574 "base_bdev_name": "malloc2" 00:40:07.574 } 00:40:07.574 } 00:40:07.574 }' 00:40:07.574 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:07.574 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:07.574 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:40:07.574 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:07.832 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:40:08.090 [2024-06-07 12:42:31.720519] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:08.349 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=76f68417-6b73-4784-bc52-cbef6098215f 00:40:08.350 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 76f68417-6b73-4784-bc52-cbef6098215f ']' 00:40:08.350 12:42:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:08.350 [2024-06-07 12:42:31.992394] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:08.350 [2024-06-07 12:42:31.992458] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:08.350 [2024-06-07 12:42:31.992586] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:08.350 [2024-06-07 12:42:31.992674] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:08.350 [2024-06-07 12:42:31.992687] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000006c80 name raid_bdev1, state offline 00:40:08.608 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:08.608 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:40:08.866 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:40:08.866 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:40:08.866 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:40:08.866 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:40:09.126 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:40:09.126 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:40:09.424 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:40:09.424 12:42:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:40:09.682 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:40:09.939 [2024-06-07 12:42:33.520595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:40:09.939 [2024-06-07 12:42:33.522777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:40:09.939 [2024-06-07 12:42:33.522852] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:40:09.939 [2024-06-07 12:42:33.522948] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:40:09.939 [2024-06-07 12:42:33.522987] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:09.939 [2024-06-07 12:42:33.523000] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007280 name raid_bdev1, state configuring 00:40:09.939 request: 00:40:09.939 { 00:40:09.939 "name": "raid_bdev1", 00:40:09.939 "raid_level": "raid1", 00:40:09.939 "base_bdevs": [ 00:40:09.939 "malloc1", 00:40:09.939 "malloc2" 00:40:09.939 ], 00:40:09.939 "superblock": false, 00:40:09.940 "method": "bdev_raid_create", 00:40:09.940 "req_id": 1 00:40:09.940 } 00:40:09.940 Got JSON-RPC error response 00:40:09.940 response: 00:40:09.940 { 00:40:09.940 "code": -17, 00:40:09.940 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:40:09.940 } 00:40:09.940 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:40:09.940 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:40:09.940 12:42:33 bdev_raid.raid_superblock_test_md_interleaved 
-- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:40:09.940 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:40:09.940 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:09.940 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:40:10.198 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:40:10.198 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:40:10.198 12:42:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:10.455 [2024-06-07 12:42:34.080605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:10.455 [2024-06-07 12:42:34.080778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:10.455 [2024-06-07 12:42:34.080815] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:40:10.455 [2024-06-07 12:42:34.080849] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:10.455 [2024-06-07 12:42:34.083039] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:10.455 [2024-06-07 12:42:34.083114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:10.455 [2024-06-07 12:42:34.083183] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:40:10.455 [2024-06-07 12:42:34.083250] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:10.455 pt1 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:10.712 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:10.970 12:42:34 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:10.970 "name": "raid_bdev1", 00:40:10.970 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:10.970 "strip_size_kb": 0, 00:40:10.970 "state": "configuring", 00:40:10.970 "raid_level": "raid1", 00:40:10.970 "superblock": true, 00:40:10.970 "num_base_bdevs": 2, 00:40:10.970 "num_base_bdevs_discovered": 1, 00:40:10.970 "num_base_bdevs_operational": 2, 00:40:10.970 "base_bdevs_list": [ 00:40:10.970 { 00:40:10.970 "name": "pt1", 00:40:10.970 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:10.970 "is_configured": true, 00:40:10.970 "data_offset": 256, 00:40:10.970 "data_size": 7936 00:40:10.970 }, 00:40:10.970 { 00:40:10.970 "name": null, 00:40:10.970 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:10.970 "is_configured": false, 00:40:10.970 "data_offset": 256, 00:40:10.970 "data_size": 7936 00:40:10.970 } 00:40:10.970 ] 00:40:10.970 }' 00:40:10.970 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:10.970 12:42:34 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:11.904 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:40:11.904 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:40:11.904 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:40:11.904 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:12.162 [2024-06-07 12:42:35.644861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:12.162 [2024-06-07 12:42:35.645092] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:12.162 [2024-06-07 12:42:35.645159] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008180 00:40:12.162 [2024-06-07 12:42:35.645206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:12.162 [2024-06-07 12:42:35.645473] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:12.162 [2024-06-07 12:42:35.645550] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:40:12.162 [2024-06-07 12:42:35.645660] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:40:12.162 [2024-06-07 12:42:35.645701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:12.162 [2024-06-07 12:42:35.645841] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:40:12.162 [2024-06-07 12:42:35.645868] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:40:12.162 [2024-06-07 12:42:35.645954] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002530 00:40:12.162 [2024-06-07 12:42:35.646026] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:40:12.162 [2024-06-07 12:42:35.646054] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:40:12.162 [2024-06-07 12:42:35.646110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:12.162 pt2 00:40:12.162 
12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:12.162 12:42:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:12.420 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:12.420 "name": "raid_bdev1", 00:40:12.420 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:12.420 "strip_size_kb": 0, 00:40:12.420 "state": "online", 00:40:12.420 "raid_level": "raid1", 00:40:12.420 "superblock": true, 00:40:12.420 "num_base_bdevs": 2, 00:40:12.420 "num_base_bdevs_discovered": 2, 00:40:12.420 "num_base_bdevs_operational": 2, 00:40:12.420 "base_bdevs_list": [ 00:40:12.420 { 00:40:12.420 "name": "pt1", 00:40:12.420 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:12.420 "is_configured": true, 00:40:12.420 "data_offset": 256, 00:40:12.420 "data_size": 7936 00:40:12.420 }, 00:40:12.420 { 00:40:12.420 "name": "pt2", 00:40:12.420 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:12.420 "is_configured": true, 00:40:12.420 "data_offset": 256, 00:40:12.420 "data_size": 7936 00:40:12.420 } 00:40:12.420 ] 00:40:12.420 }' 00:40:12.420 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:12.420 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:13.354 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:40:13.354 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:40:13.354 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:40:13.354 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:40:13.354 12:42:36 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:40:13.354 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:40:13.354 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:13.354 12:42:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:40:13.612 [2024-06-07 12:42:37.145187] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:13.612 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:40:13.612 "name": "raid_bdev1", 00:40:13.612 "aliases": [ 00:40:13.612 "76f68417-6b73-4784-bc52-cbef6098215f" 00:40:13.612 ], 00:40:13.612 "product_name": "Raid Volume", 00:40:13.612 "block_size": 4128, 00:40:13.612 "num_blocks": 7936, 00:40:13.612 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:13.612 "md_size": 32, 00:40:13.612 "md_interleave": true, 00:40:13.612 "dif_type": 0, 00:40:13.612 "assigned_rate_limits": { 00:40:13.612 "rw_ios_per_sec": 0, 00:40:13.612 "rw_mbytes_per_sec": 0, 00:40:13.612 "r_mbytes_per_sec": 0, 00:40:13.612 "w_mbytes_per_sec": 0 00:40:13.612 }, 00:40:13.612 "claimed": false, 00:40:13.612 "zoned": false, 00:40:13.612 "supported_io_types": { 00:40:13.612 "read": true, 00:40:13.612 "write": true, 00:40:13.612 "unmap": false, 00:40:13.612 "write_zeroes": true, 00:40:13.612 "flush": false, 00:40:13.612 "reset": true, 00:40:13.612 "compare": false, 00:40:13.612 "compare_and_write": false, 00:40:13.612 "abort": false, 00:40:13.612 "nvme_admin": false, 00:40:13.612 "nvme_io": false 00:40:13.612 }, 00:40:13.612 "memory_domains": [ 00:40:13.612 { 00:40:13.612 "dma_device_id": "system", 00:40:13.612 "dma_device_type": 1 00:40:13.612 }, 00:40:13.612 { 00:40:13.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:13.612 "dma_device_type": 2 00:40:13.612 }, 00:40:13.612 { 00:40:13.612 "dma_device_id": "system", 00:40:13.612 "dma_device_type": 1 00:40:13.612 }, 00:40:13.612 { 00:40:13.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:13.612 "dma_device_type": 2 00:40:13.612 } 00:40:13.612 ], 00:40:13.612 "driver_specific": { 00:40:13.612 "raid": { 00:40:13.612 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:13.612 "strip_size_kb": 0, 00:40:13.612 "state": "online", 00:40:13.612 "raid_level": "raid1", 00:40:13.612 "superblock": true, 00:40:13.612 "num_base_bdevs": 2, 00:40:13.612 "num_base_bdevs_discovered": 2, 00:40:13.612 "num_base_bdevs_operational": 2, 00:40:13.612 "base_bdevs_list": [ 00:40:13.612 { 00:40:13.612 "name": "pt1", 00:40:13.613 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:13.613 "is_configured": true, 00:40:13.613 "data_offset": 256, 00:40:13.613 "data_size": 7936 00:40:13.613 }, 00:40:13.613 { 00:40:13.613 "name": "pt2", 00:40:13.613 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:13.613 "is_configured": true, 00:40:13.613 "data_offset": 256, 00:40:13.613 "data_size": 7936 00:40:13.613 } 00:40:13.613 ] 00:40:13.613 } 00:40:13.613 } 00:40:13.613 }' 00:40:13.613 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:40:13.613 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:40:13.613 pt2' 00:40:13.613 12:42:37 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:13.613 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:40:13.613 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:14.179 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:14.179 "name": "pt1", 00:40:14.179 "aliases": [ 00:40:14.179 "00000000-0000-0000-0000-000000000001" 00:40:14.179 ], 00:40:14.179 "product_name": "passthru", 00:40:14.179 "block_size": 4128, 00:40:14.179 "num_blocks": 8192, 00:40:14.179 "uuid": "00000000-0000-0000-0000-000000000001", 00:40:14.179 "md_size": 32, 00:40:14.179 "md_interleave": true, 00:40:14.179 "dif_type": 0, 00:40:14.179 "assigned_rate_limits": { 00:40:14.179 "rw_ios_per_sec": 0, 00:40:14.179 "rw_mbytes_per_sec": 0, 00:40:14.179 "r_mbytes_per_sec": 0, 00:40:14.179 "w_mbytes_per_sec": 0 00:40:14.179 }, 00:40:14.179 "claimed": true, 00:40:14.179 "claim_type": "exclusive_write", 00:40:14.179 "zoned": false, 00:40:14.179 "supported_io_types": { 00:40:14.179 "read": true, 00:40:14.179 "write": true, 00:40:14.179 "unmap": true, 00:40:14.179 "write_zeroes": true, 00:40:14.179 "flush": true, 00:40:14.179 "reset": true, 00:40:14.179 "compare": false, 00:40:14.179 "compare_and_write": false, 00:40:14.179 "abort": true, 00:40:14.179 "nvme_admin": false, 00:40:14.179 "nvme_io": false 00:40:14.179 }, 00:40:14.179 "memory_domains": [ 00:40:14.179 { 00:40:14.179 "dma_device_id": "system", 00:40:14.179 "dma_device_type": 1 00:40:14.179 }, 00:40:14.179 { 00:40:14.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:14.179 "dma_device_type": 2 00:40:14.180 } 00:40:14.180 ], 00:40:14.180 "driver_specific": { 00:40:14.180 "passthru": { 00:40:14.180 "name": "pt1", 00:40:14.180 "base_bdev_name": "malloc1" 00:40:14.180 } 00:40:14.180 } 00:40:14.180 }' 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:14.180 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:14.438 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:40:14.438 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:14.438 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:14.438 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:14.438 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:40:14.439 12:42:37 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:40:14.439 12:42:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:40:14.697 "name": "pt2", 00:40:14.697 "aliases": [ 00:40:14.697 "00000000-0000-0000-0000-000000000002" 00:40:14.697 ], 00:40:14.697 "product_name": "passthru", 00:40:14.697 "block_size": 4128, 00:40:14.697 "num_blocks": 8192, 00:40:14.697 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:14.697 "md_size": 32, 00:40:14.697 "md_interleave": true, 00:40:14.697 "dif_type": 0, 00:40:14.697 "assigned_rate_limits": { 00:40:14.697 "rw_ios_per_sec": 0, 00:40:14.697 "rw_mbytes_per_sec": 0, 00:40:14.697 "r_mbytes_per_sec": 0, 00:40:14.697 "w_mbytes_per_sec": 0 00:40:14.697 }, 00:40:14.697 "claimed": true, 00:40:14.697 "claim_type": "exclusive_write", 00:40:14.697 "zoned": false, 00:40:14.697 "supported_io_types": { 00:40:14.697 "read": true, 00:40:14.697 "write": true, 00:40:14.697 "unmap": true, 00:40:14.697 "write_zeroes": true, 00:40:14.697 "flush": true, 00:40:14.697 "reset": true, 00:40:14.697 "compare": false, 00:40:14.697 "compare_and_write": false, 00:40:14.697 "abort": true, 00:40:14.697 "nvme_admin": false, 00:40:14.697 "nvme_io": false 00:40:14.697 }, 00:40:14.697 "memory_domains": [ 00:40:14.697 { 00:40:14.697 "dma_device_id": "system", 00:40:14.697 "dma_device_type": 1 00:40:14.697 }, 00:40:14.697 { 00:40:14.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:14.697 "dma_device_type": 2 00:40:14.697 } 00:40:14.697 ], 00:40:14.697 "driver_specific": { 00:40:14.697 "passthru": { 00:40:14.697 "name": "pt2", 00:40:14.697 "base_bdev_name": "malloc2" 00:40:14.697 } 00:40:14.697 } 00:40:14.697 }' 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:40:14.697 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:14.955 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:40:14.955 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:40:14.955 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:14.955 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:40:14.955 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:40:14.955 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:14.955 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:40:15.213 [2024-06-07 12:42:38.765418] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:15.214 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 76f68417-6b73-4784-bc52-cbef6098215f '!=' 76f68417-6b73-4784-bc52-cbef6098215f ']' 00:40:15.214 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:40:15.214 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:40:15.214 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:40:15.214 12:42:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:40:15.472 [2024-06-07 12:42:39.017306] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:15.472 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:15.730 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:15.730 "name": "raid_bdev1", 00:40:15.730 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:15.730 "strip_size_kb": 0, 00:40:15.730 "state": "online", 00:40:15.730 "raid_level": "raid1", 00:40:15.730 "superblock": true, 00:40:15.730 "num_base_bdevs": 2, 00:40:15.730 "num_base_bdevs_discovered": 1, 00:40:15.730 "num_base_bdevs_operational": 1, 00:40:15.730 "base_bdevs_list": [ 00:40:15.730 { 00:40:15.730 "name": null, 00:40:15.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:15.730 "is_configured": false, 00:40:15.730 "data_offset": 256, 00:40:15.730 "data_size": 7936 00:40:15.730 }, 00:40:15.730 { 00:40:15.730 "name": "pt2", 00:40:15.730 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:15.730 "is_configured": true, 00:40:15.730 "data_offset": 256, 00:40:15.730 "data_size": 7936 00:40:15.730 } 00:40:15.730 ] 00:40:15.730 }' 
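The trace keeps returning to one verification idiom: dump every raid bdev over the test RPC socket, isolate raid_bdev1 with jq, then compare fields such as "state", "raid_level" and "num_base_bdevs_discovered" against expected values. A standalone sketch of that idiom, built only from the rpc.py and jq invocations visible in this log (the inline assertions are illustrative stand-ins, not the harness's own helpers):

    # Sketch: query raid_bdev1 over the test RPC socket and check its state.
    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "raid_bdev1")')

    state=$(jq -r '.state' <<< "$info")
    discovered=$(jq -r '.num_base_bdevs_discovered' <<< "$info")

    # Expected values mirror a "verify_raid_bdev_state raid_bdev1 online raid1 0 1" call.
    [ "$state" = online ] || echo "unexpected state: $state" >&2
    [ "$discovered" -eq 1 ] || echo "unexpected discovered count: $discovered" >&2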
00:40:15.730 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:15.730 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:16.294 12:42:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:16.551 [2024-06-07 12:42:40.181417] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:16.551 [2024-06-07 12:42:40.181480] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:16.551 [2024-06-07 12:42:40.181559] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:16.551 [2024-06-07 12:42:40.181606] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:16.551 [2024-06-07 12:42:40.181616] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:40:16.857 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:16.857 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:40:16.857 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:40:16.857 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:40:16.857 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:40:16.857 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:40:16.857 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:40:17.115 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:40:17.115 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:40:17.115 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:40:17.115 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:40:17.115 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:40:17.115 12:42:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:40:17.373 [2024-06-07 12:42:40.997641] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:40:17.373 [2024-06-07 12:42:40.997822] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:17.373 [2024-06-07 12:42:40.997868] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:40:17.373 [2024-06-07 12:42:40.997902] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:17.373 [2024-06-07 12:42:41.000150] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:17.373 [2024-06-07 12:42:41.000239] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:40:17.373 [2024-06-07 12:42:41.000308] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:40:17.373 [2024-06-07 12:42:41.000345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:17.373 [2024-06-07 12:42:41.000407] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000008a80 00:40:17.373 [2024-06-07 12:42:41.000417] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:40:17.373 [2024-06-07 12:42:41.000490] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000026d0 00:40:17.373 [2024-06-07 12:42:41.000536] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000008a80 00:40:17.373 [2024-06-07 12:42:41.000545] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000008a80 00:40:17.373 [2024-06-07 12:42:41.000587] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:17.373 pt2 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:17.631 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:17.890 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:17.890 "name": "raid_bdev1", 00:40:17.890 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:17.890 "strip_size_kb": 0, 00:40:17.890 "state": "online", 00:40:17.890 "raid_level": "raid1", 00:40:17.890 "superblock": true, 00:40:17.890 "num_base_bdevs": 2, 00:40:17.890 "num_base_bdevs_discovered": 1, 00:40:17.890 "num_base_bdevs_operational": 1, 00:40:17.890 "base_bdevs_list": [ 00:40:17.890 { 00:40:17.890 "name": null, 00:40:17.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:17.890 "is_configured": false, 00:40:17.890 "data_offset": 256, 00:40:17.890 "data_size": 7936 00:40:17.890 }, 00:40:17.890 { 00:40:17.890 "name": "pt2", 00:40:17.890 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:17.890 "is_configured": true, 00:40:17.890 "data_offset": 
256, 00:40:17.890 "data_size": 7936 00:40:17.890 } 00:40:17.890 ] 00:40:17.890 }' 00:40:17.890 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:17.890 12:42:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:18.456 12:42:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:18.715 [2024-06-07 12:42:42.313964] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:18.715 [2024-06-07 12:42:42.314025] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:18.715 [2024-06-07 12:42:42.314103] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:18.715 [2024-06-07 12:42:42.314148] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:18.715 [2024-06-07 12:42:42.314159] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000008a80 name raid_bdev1, state offline 00:40:18.715 12:42:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:40:18.715 12:42:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:19.296 12:42:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:40:19.296 12:42:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:40:19.296 12:42:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:40:19.296 12:42:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:40:19.564 [2024-06-07 12:42:42.977986] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:40:19.564 [2024-06-07 12:42:42.978133] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:19.564 [2024-06-07 12:42:42.978181] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008d80 00:40:19.564 [2024-06-07 12:42:42.978206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:19.564 [2024-06-07 12:42:42.980365] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:19.564 [2024-06-07 12:42:42.980427] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:40:19.564 [2024-06-07 12:42:42.980502] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:40:19.564 [2024-06-07 12:42:42.980542] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:40:19.564 [2024-06-07 12:42:42.980677] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:40:19.564 [2024-06-07 12:42:42.980707] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:19.564 [2024-06-07 12:42:42.980733] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009380 name raid_bdev1, state configuring 00:40:19.564 [2024-06-07 12:42:42.980813] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:40:19.564 [2024-06-07 12:42:42.980876] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009680 00:40:19.564 [2024-06-07 12:42:42.980898] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:40:19.564 [2024-06-07 12:42:42.980986] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002870 00:40:19.564 [2024-06-07 12:42:42.981049] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009680 00:40:19.564 [2024-06-07 12:42:42.981061] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009680 00:40:19.564 [2024-06-07 12:42:42.981098] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:19.564 pt1 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:19.564 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:19.822 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:19.822 "name": "raid_bdev1", 00:40:19.822 "uuid": "76f68417-6b73-4784-bc52-cbef6098215f", 00:40:19.822 "strip_size_kb": 0, 00:40:19.822 "state": "online", 00:40:19.822 "raid_level": "raid1", 00:40:19.822 "superblock": true, 00:40:19.822 "num_base_bdevs": 2, 00:40:19.822 "num_base_bdevs_discovered": 1, 00:40:19.822 "num_base_bdevs_operational": 1, 00:40:19.822 "base_bdevs_list": [ 00:40:19.822 { 00:40:19.822 "name": null, 00:40:19.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:19.822 "is_configured": false, 00:40:19.822 "data_offset": 256, 00:40:19.822 "data_size": 7936 00:40:19.822 }, 00:40:19.822 { 00:40:19.822 "name": "pt2", 00:40:19.822 "uuid": "00000000-0000-0000-0000-000000000002", 00:40:19.822 "is_configured": true, 00:40:19.822 "data_offset": 256, 00:40:19.822 "data_size": 7936 00:40:19.822 } 00:40:19.822 ] 00:40:19.822 }' 
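Worth unpacking the step above: recreating pt1 lets the examine path find its stale superblock (sequence 2) and start a "configuring" raid bdev from it, but pt2 carries a newer superblock (sequence 4), so that half-built raid bdev is deleted and raid_bdev1 is reassembled from pt2's metadata, leaving pt1 as the null, unconfigured slot in the state dump. A minimal reproduction using the same RPCs, assuming malloc1 still exists:

    # Sketch: recreate pt1, then confirm the raid bdev kept pt2's newer
    # superblock and left base bdev slot 0 unconfigured.
    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    "$rpc" -s "$sock" bdev_passthru_create -b malloc1 -p pt1 \
        -u 00000000-0000-0000-0000-000000000001

    "$rpc" -s "$sock" bdev_raid_get_bdevs online \
        | jq -r '.[].base_bdevs_list[0].is_configured'   # expect: false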
00:40:19.822 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:19.822 12:42:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:20.754 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:40:20.754 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:40:20.754 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:40:20.754 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:40:20.754 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:21.012 [2024-06-07 12:42:44.578361] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 76f68417-6b73-4784-bc52-cbef6098215f '!=' 76f68417-6b73-4784-bc52-cbef6098215f ']' 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 230727 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 230727 ']' 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 230727 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 230727 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:21.012 killing process with pid 230727 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 230727' 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # kill 230727 00:40:21.012 [2024-06-07 12:42:44.627541] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:21.012 [2024-06-07 12:42:44.627633] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:21.012 [2024-06-07 12:42:44.627687] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:21.012 [2024-06-07 12:42:44.627698] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009680 name raid_bdev1, state offline 00:40:21.012 12:42:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@973 -- # wait 230727 00:40:21.270 [2024-06-07 12:42:44.675713] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:21.529 12:42:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:40:21.529 00:40:21.529 real 0m18.128s 00:40:21.529 user 0m33.548s 00:40:21.529 
sys 0m2.853s 00:40:21.529 12:42:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:21.529 12:42:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:21.529 ************************************ 00:40:21.529 END TEST raid_superblock_test_md_interleaved 00:40:21.529 ************************************ 00:40:21.529 12:42:45 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:40:21.529 12:42:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:40:21.529 12:42:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:21.529 12:42:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:21.529 ************************************ 00:40:21.530 START TEST raid_rebuild_test_sb_md_interleaved 00:40:21.530 ************************************ 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false false 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # echo BaseBdev1 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # echo BaseBdev2 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=231256 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 231256 /var/tmp/spdk-raid.sock 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 231256 ']' 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:40:21.530 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:40:21.530 12:42:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:21.530 [2024-06-07 12:42:45.153872] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:40:21.530 I/O size of 3145728 is greater than zero copy threshold (65536). 00:40:21.530 Zero copy mechanism will not be used. 
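The rebuild test drives I/O through the bundled bdevperf example on a private RPC socket: random read/write at a 50% mix, 3 MiB I/Os at queue depth 2 against raid_bdev1, which is exactly why the 3145728-byte size trips the 65536-byte zero-copy threshold notice above. A rough sketch of that launch-and-wait pattern; the harness wraps the waiting in its waitforlisten helper, so the polling loop below is only an illustrative stand-in:

    # Sketch: start bdevperf on its own RPC socket and wait until it listens.
    bdevperf=/home/vagrant/spdk_repo/spdk/build/examples/bdevperf
    sock=/var/tmp/spdk-raid.sock

    "$bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # Poll until the socket answers a basic RPC (stand-in for waitforlisten).
    until /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s "$sock" \
            rpc_get_methods > /dev/null 2>&1; do
        sleep 0.2
    done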
00:40:21.530 [2024-06-07 12:42:45.154131] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid231256 ] 00:40:21.829 [2024-06-07 12:42:45.298828] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:21.829 [2024-06-07 12:42:45.397659] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:22.089 [2024-06-07 12:42:45.480877] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:40:22.654 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:40:22.654 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:40:22.654 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:40:22.654 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:40:22.911 BaseBdev1_malloc 00:40:22.911 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:40:23.168 [2024-06-07 12:42:46.710614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:40:23.168 [2024-06-07 12:42:46.710882] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:23.168 [2024-06-07 12:42:46.710992] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000005a80 00:40:23.168 [2024-06-07 12:42:46.711105] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:23.168 [2024-06-07 12:42:46.714105] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:23.168 [2024-06-07 12:42:46.714196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:40:23.168 BaseBdev1 00:40:23.168 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:40:23.168 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:40:23.426 BaseBdev2_malloc 00:40:23.426 12:42:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:40:23.684 [2024-06-07 12:42:47.270215] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:40:23.684 [2024-06-07 12:42:47.270345] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:23.684 [2024-06-07 12:42:47.270403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000006680 00:40:23.684 [2024-06-07 12:42:47.270452] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:23.684 [2024-06-07 12:42:47.272695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:23.684 [2024-06-07 12:42:47.272769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: 
BaseBdev2 00:40:23.684 BaseBdev2 00:40:23.684 12:42:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:40:23.942 spare_malloc 00:40:23.942 12:42:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:40:24.199 spare_delay 00:40:24.199 12:42:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:40:24.457 [2024-06-07 12:42:47.974809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:40:24.457 [2024-06-07 12:42:47.974953] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:24.457 [2024-06-07 12:42:47.975004] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000007880 00:40:24.457 [2024-06-07 12:42:47.975066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:24.457 [2024-06-07 12:42:47.977312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:24.457 [2024-06-07 12:42:47.977378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:40:24.457 spare 00:40:24.457 12:42:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:40:24.715 [2024-06-07 12:42:48.199002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:40:24.715 [2024-06-07 12:42:48.201680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:40:24.715 [2024-06-07 12:42:48.201923] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000007e80 00:40:24.715 [2024-06-07 12:42:48.201942] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:40:24.715 [2024-06-07 12:42:48.202142] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002460 00:40:24.715 [2024-06-07 12:42:48.202295] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000007e80 00:40:24.715 [2024-06-07 12:42:48.202318] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000007e80 00:40:24.715 [2024-06-07 12:42:48.202399] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
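Both tests in this run sit on the same fixture stack: malloc bdevs with 4096-byte data blocks plus 32 bytes of interleaved metadata (hence the 4128-byte blocklen reported throughout), each wrapped in a passthru bdev, then assembled into a two-disk raid1 carrying an on-disk superblock (-s). Condensed from the RPCs in this trace into one hedged sketch:

    # Sketch: build the md-interleaved raid1 fixture used by these tests.
    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    for i in 1 2; do
        # malloc bdev: size 32 (MiB), 4096-byte blocks, 32-byte interleaved md.
        "$rpc" -s "$sock" bdev_malloc_create 32 4096 -m 32 -i \
            -b "BaseBdev${i}_malloc"
        "$rpc" -s "$sock" bdev_passthru_create -b "BaseBdev${i}_malloc" \
            -p "BaseBdev$i"
    done

    # raid1 across both passthru bdevs, with an on-disk superblock (-s).
    "$rpc" -s "$sock" bdev_raid_create -s -r raid1 \
        -b 'BaseBdev1 BaseBdev2' -n raid_bdev1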
00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:24.715 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:24.973 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:24.973 "name": "raid_bdev1", 00:40:24.973 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:24.973 "strip_size_kb": 0, 00:40:24.973 "state": "online", 00:40:24.973 "raid_level": "raid1", 00:40:24.973 "superblock": true, 00:40:24.973 "num_base_bdevs": 2, 00:40:24.973 "num_base_bdevs_discovered": 2, 00:40:24.973 "num_base_bdevs_operational": 2, 00:40:24.973 "base_bdevs_list": [ 00:40:24.973 { 00:40:24.973 "name": "BaseBdev1", 00:40:24.973 "uuid": "6fb2ef6e-ab3f-5808-8a45-4ff772db1d73", 00:40:24.973 "is_configured": true, 00:40:24.973 "data_offset": 256, 00:40:24.973 "data_size": 7936 00:40:24.973 }, 00:40:24.973 { 00:40:24.973 "name": "BaseBdev2", 00:40:24.973 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:24.973 "is_configured": true, 00:40:24.973 "data_offset": 256, 00:40:24.973 "data_size": 7936 00:40:24.973 } 00:40:24.973 ] 00:40:24.973 }' 00:40:24.973 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:24.973 12:42:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:25.540 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:40:25.540 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:40:25.797 [2024-06-07 12:42:49.411235] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:40:25.797 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:40:25.797 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:40:25.797 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:26.361 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:40:26.361 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:40:26.361 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:40:26.361 12:42:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:40:26.619 [2024-06-07 12:42:50.111179] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:26.619 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:26.876 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:26.876 "name": "raid_bdev1", 00:40:26.876 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:26.876 "strip_size_kb": 0, 00:40:26.876 "state": "online", 00:40:26.876 "raid_level": "raid1", 00:40:26.876 "superblock": true, 00:40:26.876 "num_base_bdevs": 2, 00:40:26.876 "num_base_bdevs_discovered": 1, 00:40:26.876 "num_base_bdevs_operational": 1, 00:40:26.876 "base_bdevs_list": [ 00:40:26.876 { 00:40:26.876 "name": null, 00:40:26.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:26.876 "is_configured": false, 00:40:26.876 "data_offset": 256, 00:40:26.876 "data_size": 7936 00:40:26.876 }, 00:40:26.876 { 00:40:26.876 "name": "BaseBdev2", 00:40:26.876 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:26.876 "is_configured": true, 00:40:26.876 "data_offset": 256, 00:40:26.876 "data_size": 7936 00:40:26.876 } 00:40:26.876 ] 00:40:26.876 }' 00:40:26.876 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:26.876 12:42:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:27.808 12:42:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:40:28.065 [2024-06-07 12:42:51.531335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:28.065 [2024-06-07 12:42:51.537391] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002530 00:40:28.065 [2024-06-07 12:42:51.540102] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:28.065 12:42:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # 
sleep 1 00:40:29.036 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:29.036 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:29.036 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:29.036 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:29.036 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:29.036 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:29.036 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:29.294 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:29.294 "name": "raid_bdev1", 00:40:29.294 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:29.294 "strip_size_kb": 0, 00:40:29.294 "state": "online", 00:40:29.294 "raid_level": "raid1", 00:40:29.294 "superblock": true, 00:40:29.294 "num_base_bdevs": 2, 00:40:29.294 "num_base_bdevs_discovered": 2, 00:40:29.294 "num_base_bdevs_operational": 2, 00:40:29.294 "process": { 00:40:29.294 "type": "rebuild", 00:40:29.294 "target": "spare", 00:40:29.294 "progress": { 00:40:29.294 "blocks": 3072, 00:40:29.294 "percent": 38 00:40:29.294 } 00:40:29.294 }, 00:40:29.294 "base_bdevs_list": [ 00:40:29.294 { 00:40:29.294 "name": "spare", 00:40:29.294 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:29.294 "is_configured": true, 00:40:29.294 "data_offset": 256, 00:40:29.294 "data_size": 7936 00:40:29.294 }, 00:40:29.294 { 00:40:29.294 "name": "BaseBdev2", 00:40:29.294 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:29.294 "is_configured": true, 00:40:29.294 "data_offset": 256, 00:40:29.294 "data_size": 7936 00:40:29.294 } 00:40:29.294 ] 00:40:29.294 }' 00:40:29.294 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:29.294 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:29.294 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:29.294 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:29.294 12:42:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:40:29.859 [2024-06-07 12:42:53.233706] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:29.859 [2024-06-07 12:42:53.255218] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:40:29.859 [2024-06-07 12:42:53.255557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:29.859 [2024-06-07 12:42:53.255683] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:29.859 [2024-06-07 12:42:53.255727] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:40:29.859 12:42:53 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:29.859 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:30.118 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:30.118 "name": "raid_bdev1", 00:40:30.118 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:30.118 "strip_size_kb": 0, 00:40:30.118 "state": "online", 00:40:30.118 "raid_level": "raid1", 00:40:30.118 "superblock": true, 00:40:30.118 "num_base_bdevs": 2, 00:40:30.118 "num_base_bdevs_discovered": 1, 00:40:30.118 "num_base_bdevs_operational": 1, 00:40:30.118 "base_bdevs_list": [ 00:40:30.118 { 00:40:30.118 "name": null, 00:40:30.118 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:30.118 "is_configured": false, 00:40:30.118 "data_offset": 256, 00:40:30.118 "data_size": 7936 00:40:30.118 }, 00:40:30.118 { 00:40:30.118 "name": "BaseBdev2", 00:40:30.118 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:30.118 "is_configured": true, 00:40:30.118 "data_offset": 256, 00:40:30.118 "data_size": 7936 00:40:30.118 } 00:40:30.118 ] 00:40:30.118 }' 00:40:30.118 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:30.118 12:42:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:30.683 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:30.683 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:30.683 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:30.683 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:30.683 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:30.683 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
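While a rebuild is in flight, bdev_raid_get_bdevs exposes a "process" object with its type, target and block-level progress; verify_raid_bdev_process reduces it with jq's // "none" alternative so a finished rebuild, whose process key is absent, compares equal to the literal "none". A sketch of polling that field until the rebuild ends (the loop shape and sleep interval are illustrative, not the harness's):

    # Sketch: poll raid_bdev1 until its rebuild process disappears.
    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    while :; do
        info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
                | jq -r '.[] | select(.name == "raid_bdev1")')
        [ "$(jq -r '.process.type // "none"' <<< "$info")" = none ] && break
        jq -r '"\(.process.target): \(.process.progress.percent)%"' <<< "$info"
        sleep 1
    done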
00:40:30.683 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:30.941 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:30.941 "name": "raid_bdev1", 00:40:30.941 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:30.941 "strip_size_kb": 0, 00:40:30.941 "state": "online", 00:40:30.941 "raid_level": "raid1", 00:40:30.941 "superblock": true, 00:40:30.941 "num_base_bdevs": 2, 00:40:30.941 "num_base_bdevs_discovered": 1, 00:40:30.941 "num_base_bdevs_operational": 1, 00:40:30.941 "base_bdevs_list": [ 00:40:30.941 { 00:40:30.941 "name": null, 00:40:30.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:30.941 "is_configured": false, 00:40:30.941 "data_offset": 256, 00:40:30.941 "data_size": 7936 00:40:30.941 }, 00:40:30.941 { 00:40:30.941 "name": "BaseBdev2", 00:40:30.941 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:30.941 "is_configured": true, 00:40:30.941 "data_offset": 256, 00:40:30.941 "data_size": 7936 00:40:30.941 } 00:40:30.941 ] 00:40:30.941 }' 00:40:30.941 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:30.941 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:30.941 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:30.941 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:30.941 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:40:31.198 [2024-06-07 12:42:54.775288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:31.198 [2024-06-07 12:42:54.781063] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d0000026d0 00:40:31.198 [2024-06-07 12:42:54.783673] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:31.198 12:42:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:40:32.658 12:42:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:32.658 12:42:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:32.658 12:42:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:32.658 12:42:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:32.658 12:42:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:32.658 12:42:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:32.658 12:42:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:32.658 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:32.658 "name": "raid_bdev1", 00:40:32.658 "uuid": 
"f448160f-dd79-4b20-b251-bff27b931272", 00:40:32.658 "strip_size_kb": 0, 00:40:32.658 "state": "online", 00:40:32.658 "raid_level": "raid1", 00:40:32.658 "superblock": true, 00:40:32.658 "num_base_bdevs": 2, 00:40:32.658 "num_base_bdevs_discovered": 2, 00:40:32.658 "num_base_bdevs_operational": 2, 00:40:32.658 "process": { 00:40:32.658 "type": "rebuild", 00:40:32.658 "target": "spare", 00:40:32.658 "progress": { 00:40:32.658 "blocks": 3328, 00:40:32.658 "percent": 41 00:40:32.658 } 00:40:32.658 }, 00:40:32.658 "base_bdevs_list": [ 00:40:32.658 { 00:40:32.658 "name": "spare", 00:40:32.658 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:32.658 "is_configured": true, 00:40:32.658 "data_offset": 256, 00:40:32.658 "data_size": 7936 00:40:32.658 }, 00:40:32.658 { 00:40:32.658 "name": "BaseBdev2", 00:40:32.658 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:32.658 "is_configured": true, 00:40:32.658 "data_offset": 256, 00:40:32.658 "data_size": 7936 00:40:32.658 } 00:40:32.658 ] 00:40:32.658 }' 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:40:32.659 /home/vagrant/spdk_repo/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1183 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:32.659 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:33.227 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:40:33.227 "name": "raid_bdev1", 00:40:33.227 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:33.227 "strip_size_kb": 0, 00:40:33.227 "state": "online", 00:40:33.227 "raid_level": "raid1", 00:40:33.227 "superblock": true, 00:40:33.227 "num_base_bdevs": 2, 00:40:33.227 "num_base_bdevs_discovered": 2, 00:40:33.227 "num_base_bdevs_operational": 2, 00:40:33.227 "process": { 00:40:33.227 "type": "rebuild", 00:40:33.227 "target": "spare", 00:40:33.227 "progress": { 00:40:33.227 "blocks": 4352, 00:40:33.227 "percent": 54 00:40:33.227 } 00:40:33.227 }, 00:40:33.227 "base_bdevs_list": [ 00:40:33.227 { 00:40:33.227 "name": "spare", 00:40:33.227 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:33.227 "is_configured": true, 00:40:33.227 "data_offset": 256, 00:40:33.227 "data_size": 7936 00:40:33.227 }, 00:40:33.227 { 00:40:33.227 "name": "BaseBdev2", 00:40:33.227 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:33.227 "is_configured": true, 00:40:33.227 "data_offset": 256, 00:40:33.227 "data_size": 7936 00:40:33.227 } 00:40:33.227 ] 00:40:33.227 }' 00:40:33.227 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:33.227 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:33.227 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:33.227 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:33.227 12:42:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:40:34.160 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:40:34.161 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:34.161 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:34.161 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:34.161 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:34.161 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:34.161 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:34.161 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:34.418 [2024-06-07 12:42:57.907836] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:40:34.418 [2024-06-07 12:42:57.908193] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:40:34.418 [2024-06-07 12:42:57.908546] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:34.418 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:34.418 "name": "raid_bdev1", 00:40:34.418 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:34.418 "strip_size_kb": 0, 00:40:34.418 "state": "online", 00:40:34.418 "raid_level": "raid1", 00:40:34.418 "superblock": true, 00:40:34.418 
"num_base_bdevs": 2, 00:40:34.418 "num_base_bdevs_discovered": 2, 00:40:34.418 "num_base_bdevs_operational": 2, 00:40:34.418 "process": { 00:40:34.418 "type": "rebuild", 00:40:34.418 "target": "spare", 00:40:34.418 "progress": { 00:40:34.418 "blocks": 7680, 00:40:34.418 "percent": 96 00:40:34.418 } 00:40:34.418 }, 00:40:34.418 "base_bdevs_list": [ 00:40:34.418 { 00:40:34.418 "name": "spare", 00:40:34.418 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:34.418 "is_configured": true, 00:40:34.418 "data_offset": 256, 00:40:34.418 "data_size": 7936 00:40:34.418 }, 00:40:34.418 { 00:40:34.418 "name": "BaseBdev2", 00:40:34.418 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:34.418 "is_configured": true, 00:40:34.418 "data_offset": 256, 00:40:34.418 "data_size": 7936 00:40:34.418 } 00:40:34.418 ] 00:40:34.418 }' 00:40:34.418 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:34.418 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:34.418 12:42:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:34.418 12:42:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:34.418 12:42:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:35.793 "name": "raid_bdev1", 00:40:35.793 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:35.793 "strip_size_kb": 0, 00:40:35.793 "state": "online", 00:40:35.793 "raid_level": "raid1", 00:40:35.793 "superblock": true, 00:40:35.793 "num_base_bdevs": 2, 00:40:35.793 "num_base_bdevs_discovered": 2, 00:40:35.793 "num_base_bdevs_operational": 2, 00:40:35.793 "base_bdevs_list": [ 00:40:35.793 { 00:40:35.793 "name": "spare", 00:40:35.793 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:35.793 "is_configured": true, 00:40:35.793 "data_offset": 256, 00:40:35.793 "data_size": 7936 00:40:35.793 }, 00:40:35.793 { 00:40:35.793 "name": "BaseBdev2", 00:40:35.793 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:35.793 "is_configured": true, 00:40:35.793 "data_offset": 256, 00:40:35.793 "data_size": 7936 00:40:35.793 } 00:40:35.793 ] 00:40:35.793 }' 00:40:35.793 12:42:59 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:35.793 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:36.052 "name": "raid_bdev1", 00:40:36.052 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:36.052 "strip_size_kb": 0, 00:40:36.052 "state": "online", 00:40:36.052 "raid_level": "raid1", 00:40:36.052 "superblock": true, 00:40:36.052 "num_base_bdevs": 2, 00:40:36.052 "num_base_bdevs_discovered": 2, 00:40:36.052 "num_base_bdevs_operational": 2, 00:40:36.052 "base_bdevs_list": [ 00:40:36.052 { 00:40:36.052 "name": "spare", 00:40:36.052 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:36.052 "is_configured": true, 00:40:36.052 "data_offset": 256, 00:40:36.052 "data_size": 7936 00:40:36.052 }, 00:40:36.052 { 00:40:36.052 "name": "BaseBdev2", 00:40:36.052 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:36.052 "is_configured": true, 00:40:36.052 "data_offset": 256, 00:40:36.052 "data_size": 7936 00:40:36.052 } 00:40:36.052 ] 00:40:36.052 }' 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:36.052 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:36.311 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:36.311 "name": "raid_bdev1", 00:40:36.311 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:36.311 "strip_size_kb": 0, 00:40:36.311 "state": "online", 00:40:36.311 "raid_level": "raid1", 00:40:36.311 "superblock": true, 00:40:36.311 "num_base_bdevs": 2, 00:40:36.311 "num_base_bdevs_discovered": 2, 00:40:36.311 "num_base_bdevs_operational": 2, 00:40:36.311 "base_bdevs_list": [ 00:40:36.311 { 00:40:36.311 "name": "spare", 00:40:36.311 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:36.311 "is_configured": true, 00:40:36.311 "data_offset": 256, 00:40:36.311 "data_size": 7936 00:40:36.311 }, 00:40:36.311 { 00:40:36.311 "name": "BaseBdev2", 00:40:36.311 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:36.311 "is_configured": true, 00:40:36.311 "data_offset": 256, 00:40:36.311 "data_size": 7936 00:40:36.311 } 00:40:36.311 ] 00:40:36.311 }' 00:40:36.311 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:36.311 12:42:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:37.246 12:43:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:40:37.246 [2024-06-07 12:43:00.868189] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:40:37.246 [2024-06-07 12:43:00.868494] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:40:37.246 [2024-06-07 12:43:00.868727] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:37.246 [2024-06-07 12:43:00.868953] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:37.246 [2024-06-07 12:43:00.869063] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000007e80 name raid_bdev1, state offline 00:40:37.504 12:43:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:37.504 12:43:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:40:37.761 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:40:37.761 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:40:37.761 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:40:37.761 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:40:38.018 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:40:38.018 [2024-06-07 12:43:01.636290] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:40:38.018 [2024-06-07 12:43:01.636701] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:38.018 [2024-06-07 12:43:01.636883] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008a80 00:40:38.018 [2024-06-07 12:43:01.637018] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:38.018 [2024-06-07 12:43:01.639625] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:38.018 [2024-06-07 12:43:01.639858] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:40:38.018 [2024-06-07 12:43:01.640077] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:40:38.018 [2024-06-07 12:43:01.640280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:38.018 [2024-06-07 12:43:01.640548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:40:38.018 spare 00:40:38.018 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:40:38.018 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:38.018 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:38.276 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:38.276 [2024-06-07 12:43:01.740826] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x616000009080 00:40:38.276 [2024-06-07 
12:43:01.741060] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:40:38.276 [2024-06-07 12:43:01.741257] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002a10 00:40:38.276 [2024-06-07 12:43:01.741586] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x616000009080 00:40:38.276 [2024-06-07 12:43:01.741705] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x616000009080 00:40:38.276 [2024-06-07 12:43:01.741836] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:38.533 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:38.533 "name": "raid_bdev1", 00:40:38.533 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:38.533 "strip_size_kb": 0, 00:40:38.533 "state": "online", 00:40:38.533 "raid_level": "raid1", 00:40:38.533 "superblock": true, 00:40:38.533 "num_base_bdevs": 2, 00:40:38.533 "num_base_bdevs_discovered": 2, 00:40:38.533 "num_base_bdevs_operational": 2, 00:40:38.533 "base_bdevs_list": [ 00:40:38.533 { 00:40:38.533 "name": "spare", 00:40:38.533 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:38.533 "is_configured": true, 00:40:38.533 "data_offset": 256, 00:40:38.533 "data_size": 7936 00:40:38.533 }, 00:40:38.533 { 00:40:38.533 "name": "BaseBdev2", 00:40:38.533 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:38.533 "is_configured": true, 00:40:38.533 "data_offset": 256, 00:40:38.533 "data_size": 7936 00:40:38.533 } 00:40:38.533 ] 00:40:38.533 }' 00:40:38.533 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:38.533 12:43:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:39.098 12:43:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:39.098 12:43:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:39.098 12:43:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:39.098 12:43:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:39.098 12:43:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:39.098 12:43:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:39.098 12:43:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:39.663 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:39.663 "name": "raid_bdev1", 00:40:39.663 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:39.663 "strip_size_kb": 0, 00:40:39.663 "state": "online", 00:40:39.663 "raid_level": "raid1", 00:40:39.663 "superblock": true, 00:40:39.663 "num_base_bdevs": 2, 00:40:39.663 "num_base_bdevs_discovered": 2, 00:40:39.663 "num_base_bdevs_operational": 2, 00:40:39.663 "base_bdevs_list": [ 00:40:39.663 { 00:40:39.663 "name": "spare", 00:40:39.663 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:39.663 "is_configured": true, 00:40:39.663 "data_offset": 256, 00:40:39.663 "data_size": 7936 00:40:39.663 }, 00:40:39.663 { 00:40:39.663 
"name": "BaseBdev2", 00:40:39.663 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:39.663 "is_configured": true, 00:40:39.663 "data_offset": 256, 00:40:39.663 "data_size": 7936 00:40:39.663 } 00:40:39.663 ] 00:40:39.663 }' 00:40:39.663 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:39.663 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:39.663 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:39.663 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:39.663 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:39.663 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:40:39.920 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:40:39.920 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:40:40.178 [2024-06-07 12:43:03.701020] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:40.178 12:43:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:40.744 12:43:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:40.744 "name": "raid_bdev1", 00:40:40.744 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:40.744 "strip_size_kb": 0, 00:40:40.744 "state": "online", 00:40:40.744 "raid_level": "raid1", 00:40:40.744 "superblock": true, 00:40:40.744 "num_base_bdevs": 2, 00:40:40.744 "num_base_bdevs_discovered": 1, 00:40:40.744 "num_base_bdevs_operational": 1, 
00:40:40.744 "base_bdevs_list": [ 00:40:40.744 { 00:40:40.744 "name": null, 00:40:40.744 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:40.744 "is_configured": false, 00:40:40.744 "data_offset": 256, 00:40:40.744 "data_size": 7936 00:40:40.744 }, 00:40:40.744 { 00:40:40.744 "name": "BaseBdev2", 00:40:40.744 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:40.744 "is_configured": true, 00:40:40.744 "data_offset": 256, 00:40:40.744 "data_size": 7936 00:40:40.744 } 00:40:40.744 ] 00:40:40.744 }' 00:40:40.744 12:43:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:40.744 12:43:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:41.003 12:43:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:40:41.302 [2024-06-07 12:43:04.845209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:41.302 [2024-06-07 12:43:04.845662] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:40:41.302 [2024-06-07 12:43:04.845800] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:40:41.302 [2024-06-07 12:43:04.845930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:41.302 [2024-06-07 12:43:04.851732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002bb0 00:40:41.302 [2024-06-07 12:43:04.854093] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:41.302 12:43:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:40:42.271 12:43:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:42.271 12:43:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:42.271 12:43:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:42.271 12:43:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:42.271 12:43:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:42.271 12:43:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:42.271 12:43:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:42.529 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:42.529 "name": "raid_bdev1", 00:40:42.529 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:42.529 "strip_size_kb": 0, 00:40:42.529 "state": "online", 00:40:42.529 "raid_level": "raid1", 00:40:42.529 "superblock": true, 00:40:42.529 "num_base_bdevs": 2, 00:40:42.529 "num_base_bdevs_discovered": 2, 00:40:42.529 "num_base_bdevs_operational": 2, 00:40:42.529 "process": { 00:40:42.529 "type": "rebuild", 00:40:42.529 "target": "spare", 00:40:42.529 "progress": { 00:40:42.529 "blocks": 3072, 00:40:42.529 "percent": 38 00:40:42.529 } 00:40:42.529 }, 00:40:42.529 
"base_bdevs_list": [ 00:40:42.529 { 00:40:42.529 "name": "spare", 00:40:42.529 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:42.529 "is_configured": true, 00:40:42.529 "data_offset": 256, 00:40:42.529 "data_size": 7936 00:40:42.529 }, 00:40:42.529 { 00:40:42.529 "name": "BaseBdev2", 00:40:42.529 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:42.529 "is_configured": true, 00:40:42.529 "data_offset": 256, 00:40:42.529 "data_size": 7936 00:40:42.529 } 00:40:42.529 ] 00:40:42.529 }' 00:40:42.529 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:42.529 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:42.529 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:42.786 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:42.786 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:40:43.045 [2024-06-07 12:43:06.500156] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:43.045 [2024-06-07 12:43:06.566708] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:40:43.045 [2024-06-07 12:43:06.567146] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:43.045 [2024-06-07 12:43:06.567217] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:43.045 [2024-06-07 12:43:06.567355] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:43.045 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:43.303 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:40:43.303 "name": "raid_bdev1", 00:40:43.303 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:43.303 "strip_size_kb": 0, 00:40:43.303 "state": "online", 00:40:43.303 "raid_level": "raid1", 00:40:43.303 "superblock": true, 00:40:43.303 "num_base_bdevs": 2, 00:40:43.303 "num_base_bdevs_discovered": 1, 00:40:43.303 "num_base_bdevs_operational": 1, 00:40:43.303 "base_bdevs_list": [ 00:40:43.303 { 00:40:43.303 "name": null, 00:40:43.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:43.303 "is_configured": false, 00:40:43.303 "data_offset": 256, 00:40:43.303 "data_size": 7936 00:40:43.303 }, 00:40:43.303 { 00:40:43.303 "name": "BaseBdev2", 00:40:43.303 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:43.303 "is_configured": true, 00:40:43.303 "data_offset": 256, 00:40:43.303 "data_size": 7936 00:40:43.303 } 00:40:43.303 ] 00:40:43.303 }' 00:40:43.303 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:43.303 12:43:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:44.240 12:43:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:40:44.498 [2024-06-07 12:43:07.978690] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:40:44.498 [2024-06-07 12:43:07.979086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:44.498 [2024-06-07 12:43:07.979265] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009680 00:40:44.498 [2024-06-07 12:43:07.979389] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:44.498 [2024-06-07 12:43:07.979644] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:44.498 [2024-06-07 12:43:07.979813] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:40:44.498 [2024-06-07 12:43:07.980027] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:40:44.498 [2024-06-07 12:43:07.980156] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:40:44.498 [2024-06-07 12:43:07.980273] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:40:44.498 [2024-06-07 12:43:07.980469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:40:44.498 [2024-06-07 12:43:07.985765] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x60d000002ef0 00:40:44.498 spare 00:40:44.498 [2024-06-07 12:43:07.988026] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:40:44.498 12:43:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:40:45.431 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:40:45.431 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:45.431 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:40:45.431 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:40:45.431 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:45.431 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:45.431 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:45.997 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:45.997 "name": "raid_bdev1", 00:40:45.997 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:45.997 "strip_size_kb": 0, 00:40:45.997 "state": "online", 00:40:45.997 "raid_level": "raid1", 00:40:45.997 "superblock": true, 00:40:45.997 "num_base_bdevs": 2, 00:40:45.997 "num_base_bdevs_discovered": 2, 00:40:45.997 "num_base_bdevs_operational": 2, 00:40:45.997 "process": { 00:40:45.997 "type": "rebuild", 00:40:45.997 "target": "spare", 00:40:45.997 "progress": { 00:40:45.997 "blocks": 3584, 00:40:45.997 "percent": 45 00:40:45.997 } 00:40:45.997 }, 00:40:45.997 "base_bdevs_list": [ 00:40:45.997 { 00:40:45.997 "name": "spare", 00:40:45.997 "uuid": "85c9e39c-834b-5249-a0bf-701e1e49cc13", 00:40:45.997 "is_configured": true, 00:40:45.997 "data_offset": 256, 00:40:45.997 "data_size": 7936 00:40:45.997 }, 00:40:45.997 { 00:40:45.997 "name": "BaseBdev2", 00:40:45.997 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:45.997 "is_configured": true, 00:40:45.997 "data_offset": 256, 00:40:45.997 "data_size": 7936 00:40:45.997 } 00:40:45.997 ] 00:40:45.997 }' 00:40:45.997 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:45.997 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:40:45.997 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:45.997 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:40:45.997 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:40:46.254 [2024-06-07 12:43:09.831049] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:46.511 [2024-06-07 12:43:09.902892] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:40:46.511 [2024-06-07 12:43:09.903333] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:40:46.511 [2024-06-07 12:43:09.903466] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:40:46.511 [2024-06-07 12:43:09.903512] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:46.511 12:43:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:46.769 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:46.769 "name": "raid_bdev1", 00:40:46.769 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:46.769 "strip_size_kb": 0, 00:40:46.769 "state": "online", 00:40:46.769 "raid_level": "raid1", 00:40:46.769 "superblock": true, 00:40:46.769 "num_base_bdevs": 2, 00:40:46.769 "num_base_bdevs_discovered": 1, 00:40:46.769 "num_base_bdevs_operational": 1, 00:40:46.769 "base_bdevs_list": [ 00:40:46.769 { 00:40:46.769 "name": null, 00:40:46.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:46.769 "is_configured": false, 00:40:46.769 "data_offset": 256, 00:40:46.769 "data_size": 7936 00:40:46.769 }, 00:40:46.769 { 00:40:46.769 "name": "BaseBdev2", 00:40:46.769 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:46.769 "is_configured": true, 00:40:46.769 "data_offset": 256, 00:40:46.769 "data_size": 7936 00:40:46.769 } 00:40:46.769 ] 00:40:46.769 }' 00:40:46.769 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:46.769 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:47.334 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:47.334 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
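Because the spare was deleted while the rebuild was still running, the process-finish path races with the removal and logs the "Failed to remove target bdev: No such device" error seen above; the test then confirms the array survives in a degraded state. That state check reduces to a single jq predicate over the bdev_raid_get_bdevs output, roughly as sketched here (a sketch only, reusing the rpc.py and socket paths from this run):

rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Degraded-but-online check: raid1 keeps serving I/O from the surviving
# base bdev, and no rebuild process is left behind. jq -e sets the exit
# status from the boolean, so this line can gate a test directly.
"$rpc" -s "$sock" bdev_raid_get_bdevs all |
    jq -e '.[] | select(.name == "raid_bdev1")
           | .state == "online"
             and .raid_level == "raid1"
             and .strip_size_kb == 0
             and .num_base_bdevs_discovered == 1
             and .num_base_bdevs_operational == 1
             and ((.process // "none") == "none")'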
00:40:47.334 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:47.334 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:47.334 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:47.334 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:47.334 12:43:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:47.592 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:47.592 "name": "raid_bdev1", 00:40:47.592 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:47.592 "strip_size_kb": 0, 00:40:47.592 "state": "online", 00:40:47.592 "raid_level": "raid1", 00:40:47.592 "superblock": true, 00:40:47.592 "num_base_bdevs": 2, 00:40:47.592 "num_base_bdevs_discovered": 1, 00:40:47.592 "num_base_bdevs_operational": 1, 00:40:47.592 "base_bdevs_list": [ 00:40:47.592 { 00:40:47.592 "name": null, 00:40:47.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:47.592 "is_configured": false, 00:40:47.592 "data_offset": 256, 00:40:47.592 "data_size": 7936 00:40:47.592 }, 00:40:47.592 { 00:40:47.592 "name": "BaseBdev2", 00:40:47.592 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:47.592 "is_configured": true, 00:40:47.592 "data_offset": 256, 00:40:47.592 "data_size": 7936 00:40:47.592 } 00:40:47.592 ] 00:40:47.592 }' 00:40:47.592 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:47.592 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:47.592 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:47.850 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:47.850 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:40:48.108 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:40:48.377 [2024-06-07 12:43:11.754844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:40:48.377 [2024-06-07 12:43:11.755300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:40:48.377 [2024-06-07 12:43:11.755509] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009c80 00:40:48.377 [2024-06-07 12:43:11.755655] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:40:48.377 [2024-06-07 12:43:11.755935] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:40:48.377 [2024-06-07 12:43:11.756067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:40:48.377 [2024-06-07 12:43:11.756269] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:40:48.377 [2024-06-07 12:43:11.756381] bdev_raid.c:3562:raid_bdev_examine_sb: 
*DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:40:48.377 [2024-06-07 12:43:11.756469] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:40:48.377 BaseBdev1 00:40:48.377 12:43:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:49.346 12:43:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:49.602 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:49.602 "name": "raid_bdev1", 00:40:49.602 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:49.602 "strip_size_kb": 0, 00:40:49.602 "state": "online", 00:40:49.602 "raid_level": "raid1", 00:40:49.602 "superblock": true, 00:40:49.602 "num_base_bdevs": 2, 00:40:49.602 "num_base_bdevs_discovered": 1, 00:40:49.602 "num_base_bdevs_operational": 1, 00:40:49.602 "base_bdevs_list": [ 00:40:49.602 { 00:40:49.603 "name": null, 00:40:49.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:49.603 "is_configured": false, 00:40:49.603 "data_offset": 256, 00:40:49.603 "data_size": 7936 00:40:49.603 }, 00:40:49.603 { 00:40:49.603 "name": "BaseBdev2", 00:40:49.603 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:49.603 "is_configured": true, 00:40:49.603 "data_offset": 256, 00:40:49.603 "data_size": 7936 00:40:49.603 } 00:40:49.603 ] 00:40:49.603 }' 00:40:49.603 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:49.603 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:50.532 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:50.532 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:50.532 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:40:50.532 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:50.533 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:50.533 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:50.533 12:43:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:50.790 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:50.790 "name": "raid_bdev1", 00:40:50.790 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:50.790 "strip_size_kb": 0, 00:40:50.790 "state": "online", 00:40:50.790 "raid_level": "raid1", 00:40:50.790 "superblock": true, 00:40:50.790 "num_base_bdevs": 2, 00:40:50.790 "num_base_bdevs_discovered": 1, 00:40:50.791 "num_base_bdevs_operational": 1, 00:40:50.791 "base_bdevs_list": [ 00:40:50.791 { 00:40:50.791 "name": null, 00:40:50.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:50.791 "is_configured": false, 00:40:50.791 "data_offset": 256, 00:40:50.791 "data_size": 7936 00:40:50.791 }, 00:40:50.791 { 00:40:50.791 "name": "BaseBdev2", 00:40:50.791 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:50.791 "is_configured": true, 00:40:50.791 "data_offset": 256, 00:40:50.791 "data_size": 7936 00:40:50.791 } 00:40:50.791 ] 00:40:50.791 }' 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:40:50.791 12:43:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:40:50.791 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:40:51.048 [2024-06-07 12:43:14.635308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:40:51.048 [2024-06-07 12:43:14.635939] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:40:51.048 [2024-06-07 12:43:14.636124] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:40:51.048 request: 00:40:51.048 { 00:40:51.048 "raid_bdev": "raid_bdev1", 00:40:51.048 "base_bdev": "BaseBdev1", 00:40:51.048 "method": "bdev_raid_add_base_bdev", 00:40:51.048 "req_id": 1 00:40:51.048 } 00:40:51.048 Got JSON-RPC error response 00:40:51.048 response: 00:40:51.048 { 00:40:51.048 "code": -22, 00:40:51.048 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:40:51.048 } 00:40:51.048 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:40:51.048 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:40:51.048 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:40:51.048 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:40:51.048 12:43:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:52.422 12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:52.422 
12:43:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:40:52.422 "name": "raid_bdev1", 00:40:52.422 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:52.422 "strip_size_kb": 0, 00:40:52.422 "state": "online", 00:40:52.422 "raid_level": "raid1", 00:40:52.422 "superblock": true, 00:40:52.422 "num_base_bdevs": 2, 00:40:52.422 "num_base_bdevs_discovered": 1, 00:40:52.422 "num_base_bdevs_operational": 1, 00:40:52.422 "base_bdevs_list": [ 00:40:52.422 { 00:40:52.422 "name": null, 00:40:52.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:52.422 "is_configured": false, 00:40:52.422 "data_offset": 256, 00:40:52.422 "data_size": 7936 00:40:52.422 }, 00:40:52.422 { 00:40:52.422 "name": "BaseBdev2", 00:40:52.422 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:52.422 "is_configured": true, 00:40:52.422 "data_offset": 256, 00:40:52.422 "data_size": 7936 00:40:52.422 } 00:40:52.422 ] 00:40:52.422 }' 00:40:52.422 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:40:52.422 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:53.424 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:40:53.424 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:40:53.424 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:40:53.425 "name": "raid_bdev1", 00:40:53.425 "uuid": "f448160f-dd79-4b20-b251-bff27b931272", 00:40:53.425 "strip_size_kb": 0, 00:40:53.425 "state": "online", 00:40:53.425 "raid_level": "raid1", 00:40:53.425 "superblock": true, 00:40:53.425 "num_base_bdevs": 2, 00:40:53.425 "num_base_bdevs_discovered": 1, 00:40:53.425 "num_base_bdevs_operational": 1, 00:40:53.425 "base_bdevs_list": [ 00:40:53.425 { 00:40:53.425 "name": null, 00:40:53.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:40:53.425 "is_configured": false, 00:40:53.425 "data_offset": 256, 00:40:53.425 "data_size": 7936 00:40:53.425 }, 00:40:53.425 { 00:40:53.425 "name": "BaseBdev2", 00:40:53.425 "uuid": "7d080502-150f-5cb3-bb5c-5346092cbeb2", 00:40:53.425 "is_configured": true, 00:40:53.425 "data_offset": 256, 00:40:53.425 "data_size": 7936 00:40:53.425 } 00:40:53.425 ] 00:40:53.425 }' 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:40:53.425 12:43:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:40:53.425 12:43:17 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:40:53.425 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 231256 00:40:53.425 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 231256 ']' 00:40:53.425 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 231256 00:40:53.425 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:40:53.425 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:40:53.425 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 231256 00:40:53.696 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:40:53.696 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:40:53.696 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 231256' 00:40:53.696 killing process with pid 231256 00:40:53.696 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 231256 00:40:53.696 Received shutdown signal, test time was about 60.000000 seconds 00:40:53.696 00:40:53.696 Latency(us) 00:40:53.696 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:40:53.696 =================================================================================================================== 00:40:53.696 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:40:53.696 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 231256 00:40:53.696 [2024-06-07 12:43:17.062203] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:40:53.696 [2024-06-07 12:43:17.062399] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:40:53.696 [2024-06-07 12:43:17.062479] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:40:53.696 [2024-06-07 12:43:17.062579] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x616000009080 name raid_bdev1, state offline 00:40:53.696 [2024-06-07 12:43:17.129763] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:40:53.954 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:40:53.954 00:40:53.954 real 0m32.397s 00:40:53.954 user 0m52.481s 00:40:53.954 sys 0m3.903s 00:40:53.954 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:53.954 12:43:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:40:53.954 ************************************ 00:40:53.954 END TEST raid_rebuild_test_sb_md_interleaved 00:40:53.954 ************************************ 00:40:53.954 12:43:17 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:40:53.954 12:43:17 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:40:53.954 12:43:17 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 231256 ']' 00:40:53.954 12:43:17 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 231256 00:40:53.954 12:43:17 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:40:53.954 00:40:53.954 
real 19m33.758s 00:40:53.954 user 33m27.766s 00:40:53.954 sys 3m17.470s 00:40:53.954 12:43:17 bdev_raid -- common/autotest_common.sh@1125 -- # xtrace_disable 00:40:53.954 12:43:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:40:53.954 ************************************ 00:40:53.954 END TEST bdev_raid 00:40:53.954 ************************************ 00:40:54.212 12:43:17 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test_config.sh 00:40:54.212 12:43:17 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:40:54.212 12:43:17 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:40:54.212 12:43:17 -- common/autotest_common.sh@10 -- # set +x 00:40:54.212 ************************************ 00:40:54.212 START TEST bdevperf_config 00:40:54.212 ************************************ 00:40:54.212 12:43:17 bdevperf_config -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test_config.sh 00:40:54.212 * Looking for test storage... 00:40:54.212 * Found test storage at /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/common.sh 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/home/vagrant/spdk_repo/spdk/build/examples/bdevperf 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/conf.json 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:40:54.212 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:40:54.212 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:40:54.212 12:43:17 bdevperf_config -- 
bdevperf/common.sh@10 -- # local filename= 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:40:54.212 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:40:54.212 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:40:54.212 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:40:54.212 12:43:17 bdevperf_config -- bdevperf/test_config.sh@22 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -t 2 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/conf.json -j /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:40:57.494 12:43:20 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-06-07 12:43:17.819682] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:40:57.494 [2024-06-07 12:43:17.820006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232121 ] 00:40:57.494 Using job config with 4 jobs 00:40:57.494 [2024-06-07 12:43:17.971012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:57.494 [2024-06-07 12:43:18.079880] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:57.494 cpumask for '\''job0'\'' is too big 00:40:57.494 cpumask for '\''job1'\'' is too big 00:40:57.494 cpumask for '\''job2'\'' is too big 00:40:57.494 cpumask for '\''job3'\'' is too big 00:40:57.494 Running I/O for 2 seconds... 
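For orientation: the create_job calls traced above (common.sh@8 through @20) assemble the INI-style jobs file that bdevperf is then handed via -j. A functionally equivalent sketch follows, with the append logic inferred from the trace rather than copied from common.sh:

    # Hypothetical reconstruction of create_job: append one INI section to the
    # jobs file. The variable names mirror the 'local job_section/rw/filename'
    # lines in the xtrace; the real helper's body may differ.
    testconf=/home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf
    create_job() {
        local job_section=$1 rw=$2 filename=$3
        echo "[$job_section]" >> "$testconf"
        if [[ -n $rw ]]; then echo "rw=$rw" >> "$testconf"; fi
        if [[ -n $filename ]]; then echo "filename=$filename" >> "$testconf"; fi
    }

After the [global] plus job0..job3 calls above, test.conf is plausibly just a [global] section carrying rw=read and filename=Malloc0 followed by four empty [jobN] sections, so every job inherits the global read workload against Malloc0 -- consistent with the four identical Malloc0 rows in the results table that follows.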
00:40:57.494 00:40:57.494 Latency(us) 00:40:57.494 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.00 78964.94 77.11 0.00 0.00 3239.92 893.32 6990.51 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.01 78990.45 77.14 0.00 0.00 3235.57 807.50 6241.52 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.01 78977.38 77.13 0.00 0.00 3233.10 838.70 5430.13 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.01 79060.40 77.21 0.00 0.00 3226.57 331.58 5398.92 00:40:57.494 =================================================================================================================== 00:40:57.494 Total : 315993.17 308.59 0.00 0.00 3233.78 331.58 6990.51' 00:40:57.494 12:43:20 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-06-07 12:43:17.819682] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:40:57.494 [2024-06-07 12:43:17.820006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232121 ] 00:40:57.494 Using job config with 4 jobs 00:40:57.494 [2024-06-07 12:43:17.971012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:57.494 [2024-06-07 12:43:18.079880] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:57.494 cpumask for '\''job0'\'' is too big 00:40:57.494 cpumask for '\''job1'\'' is too big 00:40:57.494 cpumask for '\''job2'\'' is too big 00:40:57.494 cpumask for '\''job3'\'' is too big 00:40:57.494 Running I/O for 2 seconds... 00:40:57.494 00:40:57.494 Latency(us) 00:40:57.494 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.00 78964.94 77.11 0.00 0.00 3239.92 893.32 6990.51 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.01 78990.45 77.14 0.00 0.00 3235.57 807.50 6241.52 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.01 78977.38 77.13 0.00 0.00 3233.10 838.70 5430.13 00:40:57.494 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.494 Malloc0 : 2.01 79060.40 77.21 0.00 0.00 3226.57 331.58 5398.92 00:40:57.495 =================================================================================================================== 00:40:57.495 Total : 315993.17 308.59 0.00 0.00 3233.78 331.58 6990.51' 00:40:57.495 12:43:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:40:57.495 12:43:20 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-07 12:43:17.819682] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:40:57.495 [2024-06-07 12:43:17.820006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232121 ] 00:40:57.495 Using job config with 4 jobs 00:40:57.495 [2024-06-07 12:43:17.971012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:57.495 [2024-06-07 12:43:18.079880] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:57.495 cpumask for '\''job0'\'' is too big 00:40:57.495 cpumask for '\''job1'\'' is too big 00:40:57.495 cpumask for '\''job2'\'' is too big 00:40:57.495 cpumask for '\''job3'\'' is too big 00:40:57.495 Running I/O for 2 seconds... 00:40:57.495 00:40:57.495 Latency(us) 00:40:57.495 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:40:57.495 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.495 Malloc0 : 2.00 78964.94 77.11 0.00 0.00 3239.92 893.32 6990.51 00:40:57.495 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.495 Malloc0 : 2.01 78990.45 77.14 0.00 0.00 3235.57 807.50 6241.52 00:40:57.495 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.495 Malloc0 : 2.01 78977.38 77.13 0.00 0.00 3233.10 838.70 5430.13 00:40:57.495 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:40:57.495 Malloc0 : 2.01 79060.40 77.21 0.00 0.00 3226.57 331.58 5398.92 00:40:57.495 =================================================================================================================== 00:40:57.495 Total : 315993.17 308.59 0.00 0.00 3233.78 331.58 6990.51' 00:40:57.495 12:43:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:40:57.495 12:43:20 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:40:57.495 12:43:20 bdevperf_config -- bdevperf/test_config.sh@25 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -C -t 2 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/conf.json -j /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:40:57.495 [2024-06-07 12:43:20.797960] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:40:57.495 [2024-06-07 12:43:20.798548] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232162 ] 00:40:57.495 [2024-06-07 12:43:20.944906] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:57.495 [2024-06-07 12:43:21.042829] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:40:57.754 cpumask for 'job0' is too big 00:40:57.754 cpumask for 'job1' is too big 00:40:57.754 cpumask for 'job2' is too big 00:40:57.754 cpumask for 'job3' is too big 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:41:00.284 Running I/O for 2 seconds... 
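A note on reading the Latency(us) tables above and below: each Job row reports runtime, IOPS, MiB/s, failures, timeouts, and average/min/max latency in microseconds, and the Total row sums throughput across jobs. For the read run above the totals check out directly:

    Total IOPS  = 78964.94 + 78990.45 + 78977.38 + 79060.40 = 315993.17
    Total MiB/s = 315993.17 IOPS * 1024 B / 2^20 B/MiB      = 308.59

(the 1024 B factor is the IO size shown in each Job header). The Total average latency, 3233.78 us, is consistent with an IOPS-weighted mean of the per-job averages.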
00:41:00.284 00:41:00.284 Latency(us) 00:41:00.284 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:00.284 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:41:00.284 Malloc0 : 2.00 90937.66 88.81 0.00 0.00 2813.53 709.97 5242.88 00:41:00.284 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:41:00.284 Malloc0 : 2.00 90923.15 88.79 0.00 0.00 2811.79 612.45 5149.26 00:41:00.284 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:41:00.284 Malloc0 : 2.01 90977.51 88.85 0.00 0.00 2807.94 733.38 5118.05 00:41:00.284 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:41:00.284 Malloc0 : 2.01 90963.55 88.83 0.00 0.00 2806.39 546.13 5118.05 00:41:00.284 =================================================================================================================== 00:41:00.284 Total : 363801.87 355.28 0.00 0.00 2809.91 546.13 5242.88' 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:00.284 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:00.284 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:00.284 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:00.284 12:43:23 bdevperf_config -- bdevperf/test_config.sh@32 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -t 2 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/conf.json -j /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:41:03.570 
12:43:26 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-06-07 12:43:23.755892] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:03.570 [2024-06-07 12:43:23.756298] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232201 ] 00:41:03.570 Using job config with 3 jobs 00:41:03.570 [2024-06-07 12:43:23.913031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:03.570 [2024-06-07 12:43:24.003720] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:03.570 cpumask for '\''job0'\'' is too big 00:41:03.570 cpumask for '\''job1'\'' is too big 00:41:03.570 cpumask for '\''job2'\'' is too big 00:41:03.570 Running I/O for 2 seconds... 00:41:03.570 00:41:03.570 Latency(us) 00:41:03.570 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.00 121765.01 118.91 0.00 0.00 2101.08 608.55 3479.65 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.00 121741.37 118.89 0.00 0.00 2100.18 565.64 2933.52 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.01 121800.74 118.95 0.00 0.00 2097.68 269.17 2402.99 00:41:03.570 =================================================================================================================== 00:41:03.570 Total : 365307.11 356.75 0.00 0.00 2099.64 269.17 3479.65' 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-06-07 12:43:23.755892] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:03.570 [2024-06-07 12:43:23.756298] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232201 ] 00:41:03.570 Using job config with 3 jobs 00:41:03.570 [2024-06-07 12:43:23.913031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:03.570 [2024-06-07 12:43:24.003720] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:03.570 cpumask for '\''job0'\'' is too big 00:41:03.570 cpumask for '\''job1'\'' is too big 00:41:03.570 cpumask for '\''job2'\'' is too big 00:41:03.570 Running I/O for 2 seconds... 
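An aside on why the same initialization log and Latency table print several times per test: get_num_jobs re-echoes the captured bdevperf output and greps the job count out of it, and xtrace records both the echo argument and the pipeline stages. Reconstructed from the common.sh@32 trace (echo | grep | grep); this matches the traced commands but is not copied verbatim from common.sh:

    get_num_jobs() {
        echo "$1" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'
    }

For the write run captured here it prints 3, matching the three [jobN] write sections created after the cleanup above.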
00:41:03.570 00:41:03.570 Latency(us) 00:41:03.570 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.00 121765.01 118.91 0.00 0.00 2101.08 608.55 3479.65 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.00 121741.37 118.89 0.00 0.00 2100.18 565.64 2933.52 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.01 121800.74 118.95 0.00 0.00 2097.68 269.17 2402.99 00:41:03.570 =================================================================================================================== 00:41:03.570 Total : 365307.11 356.75 0.00 0.00 2099.64 269.17 3479.65' 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-07 12:43:23.755892] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:03.570 [2024-06-07 12:43:23.756298] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232201 ] 00:41:03.570 Using job config with 3 jobs 00:41:03.570 [2024-06-07 12:43:23.913031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:03.570 [2024-06-07 12:43:24.003720] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:03.570 cpumask for '\''job0'\'' is too big 00:41:03.570 cpumask for '\''job1'\'' is too big 00:41:03.570 cpumask for '\''job2'\'' is too big 00:41:03.570 Running I/O for 2 seconds... 
00:41:03.570 00:41:03.570 Latency(us) 00:41:03.570 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.00 121765.01 118.91 0.00 0.00 2101.08 608.55 3479.65 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.00 121741.37 118.89 0.00 0.00 2100.18 565.64 2933.52 00:41:03.570 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:41:03.570 Malloc0 : 2.01 121800.74 118.95 0.00 0.00 2097.68 269.17 2402.99 00:41:03.570 =================================================================================================================== 00:41:03.570 Total : 365307.11 356.75 0.00 0.00 2099.64 269.17 3479.65' 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:41:03.570 12:43:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:03.571 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:03.571 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:03.571 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:41:03.571 12:43:26 
bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:03.571 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:41:03.571 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:41:03.571 12:43:26 bdevperf_config -- bdevperf/test_config.sh@42 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -t 2 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/conf.json -j /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:41:06.101 12:43:29 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-06-07 12:43:26.720725] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:06.101 [2024-06-07 12:43:26.721024] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232247 ] 00:41:06.101 Using job config with 4 jobs 00:41:06.101 [2024-06-07 12:43:26.866020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:06.101 [2024-06-07 12:43:26.969952] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:06.101 cpumask for '\''job0'\'' is too big 00:41:06.101 cpumask for '\''job1'\'' is too big 00:41:06.101 cpumask for '\''job2'\'' is too big 00:41:06.101 cpumask for '\''job3'\'' is too big 00:41:06.101 Running I/O for 2 seconds... 
00:41:06.101 00:41:06.101 Latency(us) 00:41:06.101 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40640.89 39.69 0.00 0.00 6296.19 1810.04 13918.60 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.01 40632.25 39.68 0.00 0.00 6294.24 2059.70 13918.60 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40625.44 39.67 0.00 0.00 6285.91 1755.43 12170.97 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.01 40667.18 39.71 0.00 0.00 6276.55 1973.88 12170.97 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40660.49 39.71 0.00 0.00 6268.74 1739.82 10485.76 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.02 40652.59 39.70 0.00 0.00 6266.62 1989.49 10548.18 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.02 40645.20 39.69 0.00 0.00 6259.01 1739.82 9050.21 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.02 40746.71 39.79 0.00 0.00 6239.92 442.76 9112.62 00:41:06.101 =================================================================================================================== 00:41:06.101 Total : 325270.74 317.65 0.00 0.00 6273.36 442.76 13918.60' 00:41:06.101 12:43:29 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-06-07 12:43:26.720725] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:06.101 [2024-06-07 12:43:26.721024] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232247 ] 00:41:06.101 Using job config with 4 jobs 00:41:06.101 [2024-06-07 12:43:26.866020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:06.101 [2024-06-07 12:43:26.969952] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:06.101 cpumask for '\''job0'\'' is too big 00:41:06.101 cpumask for '\''job1'\'' is too big 00:41:06.101 cpumask for '\''job2'\'' is too big 00:41:06.101 cpumask for '\''job3'\'' is too big 00:41:06.101 Running I/O for 2 seconds... 
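A reading note for the count checks below: xtrace prints comparisons after expansion, so test_config.sh@43's assertion shows up as '[[ 4 == \4 ]]' -- the left side is the output of get_num_jobs, the right side a literal pattern (hence the backslash escape). Unexpanded, the line is plausibly:

    [[ $(get_num_jobs "$bdevperf_output") == "4" ]]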
00:41:06.101 00:41:06.101 Latency(us) 00:41:06.101 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40640.89 39.69 0.00 0.00 6296.19 1810.04 13918.60 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.01 40632.25 39.68 0.00 0.00 6294.24 2059.70 13918.60 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40625.44 39.67 0.00 0.00 6285.91 1755.43 12170.97 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.01 40667.18 39.71 0.00 0.00 6276.55 1973.88 12170.97 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40660.49 39.71 0.00 0.00 6268.74 1739.82 10485.76 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.02 40652.59 39.70 0.00 0.00 6266.62 1989.49 10548.18 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.02 40645.20 39.69 0.00 0.00 6259.01 1739.82 9050.21 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.02 40746.71 39.79 0.00 0.00 6239.92 442.76 9112.62 00:41:06.101 =================================================================================================================== 00:41:06.101 Total : 325270.74 317.65 0.00 0.00 6273.36 442.76 13918.60' 00:41:06.101 12:43:29 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-07 12:43:26.720725] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:06.101 [2024-06-07 12:43:26.721024] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232247 ] 00:41:06.101 Using job config with 4 jobs 00:41:06.101 [2024-06-07 12:43:26.866020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:06.101 [2024-06-07 12:43:26.969952] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:06.101 cpumask for '\''job0'\'' is too big 00:41:06.101 cpumask for '\''job1'\'' is too big 00:41:06.101 cpumask for '\''job2'\'' is too big 00:41:06.101 cpumask for '\''job3'\'' is too big 00:41:06.101 Running I/O for 2 seconds... 
00:41:06.101 00:41:06.101 Latency(us) 00:41:06.101 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40640.89 39.69 0.00 0.00 6296.19 1810.04 13918.60 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.01 40632.25 39.68 0.00 0.00 6294.24 2059.70 13918.60 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc0 : 2.01 40625.44 39.67 0.00 0.00 6285.91 1755.43 12170.97 00:41:06.101 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.101 Malloc1 : 2.01 40667.18 39.71 0.00 0.00 6276.55 1973.88 12170.97 00:41:06.101 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.102 Malloc0 : 2.01 40660.49 39.71 0.00 0.00 6268.74 1739.82 10485.76 00:41:06.102 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.102 Malloc1 : 2.02 40652.59 39.70 0.00 0.00 6266.62 1989.49 10548.18 00:41:06.102 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.102 Malloc0 : 2.02 40645.20 39.69 0.00 0.00 6259.01 1739.82 9050.21 00:41:06.102 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:41:06.102 Malloc1 : 2.02 40746.71 39.79 0.00 0.00 6239.92 442.76 9112.62 00:41:06.102 =================================================================================================================== 00:41:06.102 Total : 325270.74 317.65 0.00 0.00 6273.36 442.76 13918.60' 00:41:06.102 12:43:29 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:41:06.102 12:43:29 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:41:06.102 12:43:29 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:41:06.102 12:43:29 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:41:06.102 12:43:29 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdevperf/test.conf 00:41:06.102 12:43:29 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:41:06.102 00:41:06.102 real 0m12.023s 00:41:06.102 user 0m10.072s 00:41:06.102 sys 0m1.391s 00:41:06.102 12:43:29 bdevperf_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:06.102 12:43:29 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:41:06.102 ************************************ 00:41:06.102 END TEST bdevperf_config 00:41:06.102 ************************************ 00:41:06.102 12:43:29 -- spdk/autotest.sh@192 -- # uname -s 00:41:06.102 12:43:29 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:41:06.102 12:43:29 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /home/vagrant/spdk_repo/spdk/test/interrupt/reactor_set_interrupt.sh 00:41:06.102 12:43:29 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:06.102 12:43:29 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:06.102 12:43:29 -- common/autotest_common.sh@10 -- # set +x 00:41:06.102 ************************************ 00:41:06.102 START TEST reactor_set_interrupt 00:41:06.102 ************************************ 00:41:06.102 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/interrupt/reactor_set_interrupt.sh 
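The interrupt suite starting here first re-derives the build environment: interrupt_common.sh resolves testdir/rootdir and sources autotest_common.sh, which pulls in build_config.sh (the long run of CONFIG_* assignments below) and applications.sh. The latter gates on a debug build by pattern-matching include/spdk/config.h; the heavily escaped glob at applications.sh@23 further down is xtrace's rendering of a check that, unexpanded, looks roughly like this (an assumption inferred from the trace, not the verbatim script):

    [[ $(< /home/vagrant/spdk_repo/spdk/include/spdk/config.h) == *"#define SPDK_CONFIG_DEBUG"* ]]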
00:41:06.363 * Looking for test storage... 00:41:06.363 * Found test storage at /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.363 12:43:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/interrupt/interrupt_common.sh 00:41:06.363 12:43:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /home/vagrant/spdk_repo/spdk/test/interrupt/reactor_set_interrupt.sh 00:41:06.363 12:43:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.363 12:43:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.363 12:43:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/interrupt/../.. 00:41:06.363 12:43:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:41:06.363 12:43:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh 00:41:06.363 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:41:06.363 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:41:06.363 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:41:06.363 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:41:06.363 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:41:06.363 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /home/vagrant/spdk_repo/spdk/../output ']' 00:41:06.364 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /home/vagrant/spdk_repo/spdk/test/common/build_config.sh ]] 00:41:06.364 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /home/vagrant/spdk_repo/spdk/test/common/build_config.sh 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@16 
-- # CONFIG_VFIO_USER_DIR= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/home/vagrant/spdk_repo/dpdk/build 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//home/vagrant/spdk_repo/dpdk/build/include 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 
00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:41:06.364 12:43:29 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:41:06.364 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/common 00:41:06.364 
12:43:29 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/home/vagrant/spdk_repo/spdk/test/common 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/home/vagrant/spdk_repo/spdk 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/home/vagrant/spdk_repo/spdk/build/bin 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/home/vagrant/spdk_repo/spdk/test/app 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/home/vagrant/spdk_repo/spdk/build/examples 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /home/vagrant/spdk_repo/spdk/include/spdk/config.h ]] 00:41:06.364 12:43:29 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:41:06.364 #define SPDK_CONFIG_H 00:41:06.364 #define SPDK_CONFIG_APPS 1 00:41:06.364 #define SPDK_CONFIG_ARCH native 00:41:06.364 #define SPDK_CONFIG_ASAN 1 00:41:06.364 #undef SPDK_CONFIG_AVAHI 00:41:06.364 #undef SPDK_CONFIG_CET 00:41:06.364 #define SPDK_CONFIG_COVERAGE 1 00:41:06.364 #define SPDK_CONFIG_CROSS_PREFIX 00:41:06.364 #undef SPDK_CONFIG_CRYPTO 00:41:06.364 #undef SPDK_CONFIG_CRYPTO_MLX5 00:41:06.364 #undef SPDK_CONFIG_CUSTOMOCF 00:41:06.364 #undef SPDK_CONFIG_DAOS 00:41:06.364 #define SPDK_CONFIG_DAOS_DIR 00:41:06.364 #define SPDK_CONFIG_DEBUG 1 00:41:06.364 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:41:06.364 #define SPDK_CONFIG_DPDK_DIR /home/vagrant/spdk_repo/dpdk/build 00:41:06.364 #define SPDK_CONFIG_DPDK_INC_DIR //home/vagrant/spdk_repo/dpdk/build/include 00:41:06.364 #define SPDK_CONFIG_DPDK_LIB_DIR /home/vagrant/spdk_repo/dpdk/build/lib 00:41:06.364 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:41:06.364 #undef SPDK_CONFIG_DPDK_UADK 00:41:06.365 #define SPDK_CONFIG_ENV /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:41:06.365 #define SPDK_CONFIG_EXAMPLES 1 00:41:06.365 #undef SPDK_CONFIG_FC 00:41:06.365 #define SPDK_CONFIG_FC_PATH 00:41:06.365 #define SPDK_CONFIG_FIO_PLUGIN 1 00:41:06.365 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:41:06.365 #undef SPDK_CONFIG_FUSE 00:41:06.365 #undef SPDK_CONFIG_FUZZER 00:41:06.365 #define SPDK_CONFIG_FUZZER_LIB 00:41:06.365 #undef SPDK_CONFIG_GOLANG 00:41:06.365 #undef SPDK_CONFIG_HAVE_ARC4RANDOM 00:41:06.365 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:41:06.365 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:41:06.365 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:41:06.365 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:41:06.365 #undef SPDK_CONFIG_HAVE_LIBBSD 00:41:06.365 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:41:06.365 #define SPDK_CONFIG_IDXD 1 00:41:06.365 #undef SPDK_CONFIG_IDXD_KERNEL 00:41:06.365 #undef SPDK_CONFIG_IPSEC_MB 00:41:06.365 #define SPDK_CONFIG_IPSEC_MB_DIR 00:41:06.365 #define SPDK_CONFIG_ISAL 1 00:41:06.365 #define 
SPDK_CONFIG_ISAL_CRYPTO 1 00:41:06.365 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:41:06.365 #define SPDK_CONFIG_LIBDIR 00:41:06.365 #undef SPDK_CONFIG_LTO 00:41:06.365 #define SPDK_CONFIG_MAX_LCORES 00:41:06.365 #define SPDK_CONFIG_NVME_CUSE 1 00:41:06.365 #undef SPDK_CONFIG_OCF 00:41:06.365 #define SPDK_CONFIG_OCF_PATH 00:41:06.365 #define SPDK_CONFIG_OPENSSL_PATH 00:41:06.365 #undef SPDK_CONFIG_PGO_CAPTURE 00:41:06.365 #define SPDK_CONFIG_PGO_DIR 00:41:06.365 #undef SPDK_CONFIG_PGO_USE 00:41:06.365 #define SPDK_CONFIG_PREFIX /usr/local 00:41:06.365 #undef SPDK_CONFIG_RAID5F 00:41:06.365 #undef SPDK_CONFIG_RBD 00:41:06.365 #define SPDK_CONFIG_RDMA 1 00:41:06.365 #define SPDK_CONFIG_RDMA_PROV verbs 00:41:06.365 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:41:06.365 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:41:06.365 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:41:06.365 #undef SPDK_CONFIG_SHARED 00:41:06.365 #undef SPDK_CONFIG_SMA 00:41:06.365 #define SPDK_CONFIG_TESTS 1 00:41:06.365 #undef SPDK_CONFIG_TSAN 00:41:06.365 #undef SPDK_CONFIG_UBLK 00:41:06.365 #undef SPDK_CONFIG_UBSAN 00:41:06.365 #define SPDK_CONFIG_UNIT_TESTS 1 00:41:06.365 #undef SPDK_CONFIG_URING 00:41:06.365 #define SPDK_CONFIG_URING_PATH 00:41:06.365 #undef SPDK_CONFIG_URING_ZNS 00:41:06.365 #undef SPDK_CONFIG_USDT 00:41:06.365 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:41:06.365 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:41:06.365 #undef SPDK_CONFIG_VFIO_USER 00:41:06.365 #define SPDK_CONFIG_VFIO_USER_DIR 00:41:06.365 #define SPDK_CONFIG_VHOST 1 00:41:06.365 #define SPDK_CONFIG_VIRTIO 1 00:41:06.365 #undef SPDK_CONFIG_VTUNE 00:41:06.365 #define SPDK_CONFIG_VTUNE_DIR 00:41:06.365 #define SPDK_CONFIG_WERROR 1 00:41:06.365 #define SPDK_CONFIG_WPDK_DIR 00:41:06.365 #undef SPDK_CONFIG_XNVME 00:41:06.365 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:41:06.365 12:43:29 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:41:06.365 12:43:29 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:41:06.365 12:43:29 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:41:06.365 12:43:29 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:41:06.365 12:43:29 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:06.365 12:43:29 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:06.365 12:43:29 reactor_set_interrupt -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:06.365 12:43:29 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:41:06.365 12:43:29 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@6 -- # dirname /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@6 -- # readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@7 -- # readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm/../../../ 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/home/vagrant/spdk_repo/spdk 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/home/vagrant/spdk_repo/spdk/.run_test_name 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/home/vagrant/spdk_repo/spdk/../output/power 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@81 -- # [[ QEMU != QEMU ]] 00:41:06.365 12:43:29 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /home/vagrant/spdk_repo/spdk/../output/power ]] 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 1 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 1 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 1 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:41:06.365 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:41:06.366 12:43:29 reactor_set_interrupt -- 
common/autotest_common.sh@96 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 1 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : /home/vagrant/spdk_repo/dpdk/build 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 
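[editorial sketch] Each ": N" line followed by an "export SPDK_TEST_*" line above is bash's default-then-export idiom, shown here after parameter expansion has already happened under xtrace. A condensed sketch of what the unexpanded source presumably looks like (variable name taken from the trace):

  # default-then-export: keep a caller-provided value, otherwise fall back to 0
  : "${SPDK_TEST_NVME:=0}"   # := also treats an empty value as unset
  export SPDK_TEST_NVME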
00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : v22.11.4 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 1 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 1 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@170 -- 
# export SPDK_TEST_NVMF_MDNS 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@193 -- # 
export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:41:06.366 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN= 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN= 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export 'VFIO_QEMU_BIN=/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN='/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:41:06.367 12:43:29 
reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 0 -eq 1 ]] 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 0 -eq 1 ]] 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j10 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:41:06.367 12:43:29 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 232323 ]] 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 232323 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:41:06.367 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.5J6aIx 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /home/vagrant/spdk_repo/spdk/test/interrupt /tmp/spdk.5J6aIx/tests/interrupt /tmp/spdk.5J6aIx 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:41:06.627 12:43:30 reactor_set_interrupt -- 
common/autotest_common.sh@327 -- # grep -v Filesystem 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=devtmpfs 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=4194304 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=4194304 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:41:06.627 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=6267031552 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=6270406656 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=2487136256 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=2508165120 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=21028864 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/vda5 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=xfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=12111888384 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=20303577088 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=8191688704 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/vda2 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=xfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=896184320 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=1042161664 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=145977344 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/vda1 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=vfat 00:41:06.628 
12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=97312768 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=104607744 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=7294976 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=1254076416 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=1254080512 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=:/mnt/jenkins_nvme/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt/output 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=fuse.sshfs 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=89790775296 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=105088212992 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9912004608 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:41:06.628 * Looking for test storage... 
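[editorial sketch] The df loop above indexes every mount point into associative arrays before the storage search begins. A condensed sketch of that walk, using the variable names visible in the trace; the *1024 conversion from df's 1K blocks to bytes is inferred from the byte-sized values recorded above:

  # index each mount's filesystem, total, available and used space in bytes
  declare -A mounts fss sizes avails uses
  while read -r source fs size use avail _ mount; do
    mounts["$mount"]=$source
    fss["$mount"]=$fs
    sizes["$mount"]=$((size * 1024))
    avails["$mount"]=$((avail * 1024))
    uses["$mount"]=$((use * 1024))
  done < <(df -T | grep -v Filesystem)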
00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=12111888384 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ xfs == tmpfs ]] 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ xfs == ramfs ]] 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=10406281216 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.628 * Found test storage at /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@1681 -- # set -o errtrace 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # true 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@1688 -- # xtrace_fd 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /home/vagrant/spdk_repo/spdk/test/interrupt/common.sh 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/examples/interrupt_tgt 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/examples/interrupt_tgt 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=232372 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /home/vagrant/spdk_repo/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:41:06.628 12:43:30 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 232372 /var/tmp/spdk.sock 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 232372 ']' 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:41:06.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:41:06.628 12:43:30 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:41:06.628 [2024-06-07 12:43:30.083153] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
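[editorial sketch] waitforlisten above polls (max_retries=100 per the trace) until the interrupt_tgt process is alive and its UNIX-domain RPC socket is up. A rough sketch of such a wait loop under those assumptions; the helper name and exact checks are illustrative, not the verbatim library function:

  # poll until $pid is listening on $sock, or give up after max_retries tries
  wait_for_rpc() {
    local pid=$1 sock=${2:-/var/tmp/spdk.sock} max_retries=100 i
    for ((i = 0; i < max_retries; i++)); do
      kill -0 "$pid" 2>/dev/null || return 1   # process died
      [[ -S "$sock" ]] && return 0             # socket exists, target is up
      sleep 0.5
    done
    return 1
  }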
00:41:06.628 [2024-06-07 12:43:30.084336] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232372 ] 00:41:06.628 [2024-06-07 12:43:30.247298] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:41:06.888 [2024-06-07 12:43:30.345460] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:41:06.888 [2024-06-07 12:43:30.345646] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:41:06.888 [2024-06-07 12:43:30.345655] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:06.888 [2024-06-07 12:43:30.478908] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:41:07.455 12:43:31 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:41:07.455 12:43:31 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:41:07.455 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:41:07.455 12:43:31 reactor_set_interrupt -- interrupt/common.sh@67 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:41:07.714 Malloc0 00:41:07.714 Malloc1 00:41:07.714 Malloc2 00:41:07.971 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:41:07.971 12:43:31 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:41:07.971 12:43:31 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:41:07.971 12:43:31 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/home/vagrant/spdk_repo/spdk/test/interrupt/aiofile bs=2048 count=5000 00:41:07.971 5000+0 records in 00:41:07.971 5000+0 records out 00:41:07.971 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0254224 s, 403 MB/s 00:41:07.971 12:43:31 reactor_set_interrupt -- interrupt/common.sh@77 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_aio_create /home/vagrant/spdk_repo/spdk/test/interrupt/aiofile AIO0 2048 00:41:08.230 AIO0 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 232372 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 232372 without_thd 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=232372 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py thread_get_stats 00:41:08.230 12:43:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg 
reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py thread_get_stats 00:41:08.488 12:43:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:41:08.746 spdk_thread ids are 1 on reactor0. 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 232372 0 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232372 0 idle 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232372 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232372 -w 256 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232372 root 20 0 20.1t 45284 19140 S 0.0 0.4 0:00.39 reactor_0' 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232372 root 20 0 20.1t 45284 19140 S 0.0 0.4 0:00.39 reactor_0 00:41:08.746 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- 
interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 232372 1 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232372 1 idle 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232372 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232372 -w 256 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232375 root 20 0 20.1t 47320 19140 S 0.0 0.4 0:00.00 reactor_1' 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232375 root 20 0 20.1t 47320 19140 S 0.0 0.4 0:00.00 reactor_1 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 232372 2 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232372 2 idle 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232372 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j 
!= 0 )) 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232372 -w 256 00:41:09.005 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232376 root 20 0 20.1t 47320 19140 S 0.0 0.4 0:00.00 reactor_2' 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232376 root 20 0 20.1t 47320 19140 S 0.0 0.4 0:00.00 reactor_2 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:41:09.263 12:43:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:41:09.521 [2024-06-07 12:43:33.044173] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:41:09.521 12:43:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:41:09.779 [2024-06-07 12:43:33.368192] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:41:09.779 [2024-06-07 12:43:33.369404] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:09.779 12:43:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:41:10.037 [2024-06-07 12:43:33.680224] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
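[editorial sketch] The reactor_is_busy / reactor_is_idle checks surrounding each mode switch all reduce to the same probe seen in the trace: one batch top sample, extract the reactor thread's %CPU column, compare against a threshold (a busy check fails below 70%, an idle check fails above 30%). Condensed into a standalone sketch with values taken from this run:

  # probe reactor_<idx> of <pid> once and classify it as busy or idle
  pid=232372 idx=0 state=busy              # values from the trace above
  line=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_$idx")
  cpu=$(awk '{print $9}' <<< "$line" | sed -e 's/^\s*//g')
  cpu=${cpu%.*}                            # drop the fractional part for [[ ]]
  if [[ $state == busy ]]; then
    [[ $cpu -lt 70 ]] && echo "reactor_$idx not busy enough: ${cpu}%"
  else
    [[ $cpu -gt 30 ]] && echo "reactor_$idx not idle: ${cpu}%"
  fi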
00:41:10.037 [2024-06-07 12:43:33.681390] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 232372 0 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 232372 0 busy 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232372 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232372 -w 256 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232372 root 20 0 20.1t 47480 19140 R 99.9 0.4 0:00.88 reactor_0' 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232372 root 20 0 20.1t 47480 19140 R 99.9 0.4 0:00.88 reactor_0 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 232372 2 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 232372 2 busy 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232372 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232372 -w 256 00:41:10.295 12:43:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 
232376 root 20 0 20.1t 47480 19140 R 99.9 0.4 0:00.35 reactor_2' 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232376 root 20 0 20.1t 47480 19140 R 99.9 0.4 0:00.35 reactor_2 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:10.553 12:43:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:41:10.811 [2024-06-07 12:43:34.339973] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:41:10.811 [2024-06-07 12:43:34.341271] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 232372 2 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232372 2 idle 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232372 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:41:10.811 12:43:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232372 -w 256 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232376 root 20 0 20.1t 47528 19140 S 0.0 0.4 0:00.65 reactor_2' 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232376 root 20 0 20.1t 47528 19140 S 0.0 0.4 0:00.65 reactor_2 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:11.068 12:43:34 
reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:11.068 12:43:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:41:11.325 [2024-06-07 12:43:34.884014] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:41:11.325 [2024-06-07 12:43:34.884965] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:11.325 12:43:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:41:11.325 12:43:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:41:11.325 12:43:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:41:11.582 [2024-06-07 12:43:35.216444] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:41:11.840 12:43:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 232372 0 00:41:11.840 12:43:35 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232372 0 idle 00:41:11.840 12:43:35 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232372 00:41:11.840 12:43:35 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232372 -w 256 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232372 root 20 0 20.1t 47624 19140 S 0.0 0.4 0:01.88 reactor_0' 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232372 root 20 0 20.1t 47624 19140 S 0.0 0.4 0:01.88 reactor_0 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 
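[editor's note] The checks above keep exercising interrupt/common.sh's reactor_is_busy_or_idle helper. A hedged reconstruction from the xtrace follows: the locals, the `top -bHn 1` + grep/sed/awk pipeline, the retry counter j=10, and the 70%/30% CPU thresholds are all read directly off the trace lines (@10-@33); the sleep between retries and the loop shape are assumptions.

# Sketch of interrupt/common.sh reactor_is_busy_or_idle, reconstructed from
# the xtrace above. Pipeline and thresholds are verbatim from the trace;
# the retry sleep is an assumption.
reactor_is_busy_or_idle() {
	local pid=$1      # @10
	local idx=$2      # @11
	local state=$3    # @12

	# @14: only "busy" and "idle" are valid target states
	if [[ $state != "busy" ]] && [[ $state != "idle" ]]; then
		return 1
	fi

	hash top  # @18: fail fast if top(1) is unavailable

	for (( j = 10; j != 0; j-- )); do  # @23
		# @24: one batch-mode top snapshot, one row per thread of $pid
		top_reactor=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}")
		# @25: column 9 is %CPU; strip leading whitespace
		cpu_rate=$(echo "$top_reactor" | sed -e 's/^\s*//g' | awk '{print $9}')
		cpu_rate=${cpu_rate%.*}  # @26: 99.9 -> 99, 0.0 -> 0

		if [[ $state = "busy" ]] && [[ $cpu_rate -lt 70 ]]; then
			sleep 1  # assumption: retry until the reactor spins up
		elif [[ $state = "idle" ]] && [[ $cpu_rate -gt 30 ]]; then
			sleep 1  # assumption: retry until the reactor quiesces
		else
			return 0  # @33
		fi
	done
	return 1
}

Per the trace, reactor_is_busy (common.sh@47) and reactor_is_idle (common.sh@51) are thin wrappers that call this helper with state fixed to busy or idle.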
00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:41:11.841 12:43:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 232372 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 232372 ']' 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 232372 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 232372 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 232372' 00:41:11.841 killing process with pid 232372 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 232372 00:41:11.841 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 232372 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /home/vagrant/spdk_repo/spdk/test/interrupt/aiofile 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=232519 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 232519 /var/tmp/spdk.sock 00:41:12.407 12:43:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /home/vagrant/spdk_repo/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:41:12.407 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 232519 ']' 00:41:12.407 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:41:12.407 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:41:12.407 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:41:12.407 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:41:12.407 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:41:12.407 12:43:35 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:41:12.407 [2024-06-07 12:43:35.942787] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
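[editor's note] The start_intr_tgt trace just above (interrupt_common.sh@20-@26) pins down how each test launches its target. A minimal sketch, with the command line and trap copied verbatim from the trace; backgrounding via `&`/`$!` is an assumption implied by intr_tgt_pid being captured before waitforlisten runs, and $rootdir/waitforlisten/killprocess/cleanup come from the sourced common scripts.

# Sketch of interrupt_common.sh start_intr_tgt as traced above.
start_intr_tgt() {
	local rpc_addr=/var/tmp/spdk.sock  # @20
	local cpu_mask=0x07                # @21
	# @23: flags copied verbatim from the traced command line; the EAL
	# parameters record that follows shows -g expanding to --single-file-segments
	"$rootdir/build/examples/interrupt_tgt" -m "$cpu_mask" -r "$rpc_addr" -E -g &
	intr_tgt_pid=$!                    # @24 (backgrounding is an assumption)
	trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT  # @25
	waitforlisten "$intr_tgt_pid" "$rpc_addr"  # @26
}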
00:41:12.407 [2024-06-07 12:43:35.943390] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232519 ] 00:41:12.666 [2024-06-07 12:43:36.104442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:41:12.666 [2024-06-07 12:43:36.204291] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:41:12.666 [2024-06-07 12:43:36.204386] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:41:12.666 [2024-06-07 12:43:36.204397] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:12.924 [2024-06-07 12:43:36.329561] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:41:13.546 12:43:37 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:41:13.546 12:43:37 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:41:13.546 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:41:13.546 12:43:37 reactor_set_interrupt -- interrupt/common.sh@67 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:41:13.820 Malloc0 00:41:13.820 Malloc1 00:41:13.820 Malloc2 00:41:13.820 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:41:13.820 12:43:37 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:41:13.820 12:43:37 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:41:13.820 12:43:37 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/home/vagrant/spdk_repo/spdk/test/interrupt/aiofile bs=2048 count=5000 00:41:13.820 5000+0 records in 00:41:13.820 5000+0 records out 00:41:13.820 10240000 bytes (10 MB, 9.8 MiB) copied, 0.032202 s, 318 MB/s 00:41:13.820 12:43:37 reactor_set_interrupt -- interrupt/common.sh@77 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_aio_create /home/vagrant/spdk_repo/spdk/test/interrupt/aiofile AIO0 2048 00:41:14.079 AIO0 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 232519 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 232519 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=232519 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:14.079 12:43:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py thread_get_stats 00:41:14.644 12:43:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:41:14.644 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:41:14.644 12:43:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:41:14.644 12:43:37 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:41:14.644 12:43:37 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:41:14.644 12:43:37 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:41:14.644 12:43:37 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py thread_get_stats 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:41:14.644 spdk_thread ids are 1 on reactor0. 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 232519 0 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232519 0 idle 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232519 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232519 -w 256 00:41:14.644 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232519 root 20 0 20.1t 43212 19096 S 0.0 0.4 0:00.39 reactor_0' 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232519 root 20 0 20.1t 43212 19096 S 0.0 0.4 0:00.39 reactor_0 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:14.902 12:43:38 reactor_set_interrupt -- 
interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 232519 1 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232519 1 idle 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232519 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232519 -w 256 00:41:14.902 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232522 root 20 0 20.1t 43212 19096 S 0.0 0.4 0:00.00 reactor_1' 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232522 root 20 0 20.1t 43212 19096 S 0.0 0.4 0:00.00 reactor_1 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 232519 2 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232519 2 idle 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232519 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j 
!= 0 )) 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232519 -w 256 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232523 root 20 0 20.1t 43212 19096 S 0.0 0.4 0:00.00 reactor_2' 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232523 root 20 0 20.1t 43212 19096 S 0.0 0.4 0:00.00 reactor_2 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:41:15.160 12:43:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:41:15.724 [2024-06-07 12:43:39.139048] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:41:15.724 [2024-06-07 12:43:39.139692] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:41:15.724 [2024-06-07 12:43:39.140302] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:15.724 12:43:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:41:15.982 [2024-06-07 12:43:39.535170] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:41:15.982 [2024-06-07 12:43:39.536564] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:15.982 12:43:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:41:15.982 12:43:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 232519 0 00:41:15.982 12:43:39 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 232519 0 busy 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232519 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232519 -w 256 00:41:15.983 12:43:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232519 root 20 0 20.1t 43328 19096 R 93.8 0.4 0:00.98 reactor_0' 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232519 root 20 0 20.1t 43328 19096 R 93.8 0.4 0:00.98 reactor_0 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 232519 2 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 232519 2 busy 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232519 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232519 -w 256 00:41:16.240 12:43:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 
232523 root 20 0 20.1t 43328 19096 R 99.9 0.4 0:00.37 reactor_2' 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232523 root 20 0 20.1t 43328 19096 R 99.9 0.4 0:00.37 reactor_2 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:16.499 12:43:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:41:16.757 [2024-06-07 12:43:40.175246] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:41:16.757 [2024-06-07 12:43:40.176369] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 232519 2 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232519 2 idle 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232519 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232519 -w 256 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232523 root 20 0 20.1t 43392 19096 S 0.0 0.4 0:00.63 reactor_2' 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232523 root 20 0 20.1t 43392 19096 S 0.0 0.4 0:00.63 reactor_2 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:16.757 12:43:40 
reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:16.757 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:41:17.015 [2024-06-07 12:43:40.607252] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:41:17.015 [2024-06-07 12:43:40.608138] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 00:41:17.015 [2024-06-07 12:43:40.608345] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 232519 0 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 232519 0 idle 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=232519 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 232519 -w 256 00:41:17.015 12:43:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 232519 root 20 0 20.1t 43440 19096 S 0.0 0.4 0:01.85 reactor_0' 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 232519 root 20 0 20.1t 43440 19096 S 0.0 0.4 0:01.85 reactor_0 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:41:17.273 12:43:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 232519 00:41:17.273 12:43:40 reactor_set_interrupt -- 
common/autotest_common.sh@949 -- # '[' -z 232519 ']' 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 232519 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 232519 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 232519' 00:41:17.273 killing process with pid 232519 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 232519 00:41:17.273 12:43:40 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 232519 00:41:17.841 12:43:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:41:17.841 12:43:41 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /home/vagrant/spdk_repo/spdk/test/interrupt/aiofile 00:41:17.841 00:41:17.841 real 0m11.542s 00:41:17.841 user 0m11.353s 00:41:17.841 sys 0m2.117s 00:41:17.841 12:43:41 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:17.841 12:43:41 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:41:17.841 ************************************ 00:41:17.841 END TEST reactor_set_interrupt 00:41:17.841 ************************************ 00:41:17.841 12:43:41 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /home/vagrant/spdk_repo/spdk/test/interrupt/reap_unregistered_poller.sh 00:41:17.841 12:43:41 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:17.841 12:43:41 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:17.841 12:43:41 -- common/autotest_common.sh@10 -- # set +x 00:41:17.841 ************************************ 00:41:17.841 START TEST reap_unregistered_poller 00:41:17.841 ************************************ 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/interrupt/reap_unregistered_poller.sh 00:41:17.841 * Looking for test storage... 00:41:17.841 * Found test storage at /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:17.841 12:43:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/interrupt/interrupt_common.sh 00:41:17.841 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /home/vagrant/spdk_repo/spdk/test/interrupt/reap_unregistered_poller.sh 00:41:17.841 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:17.841 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/home/vagrant/spdk_repo/spdk/test/interrupt 00:41:17.841 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/interrupt/../.. 
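[editor's note] Both tests tear their target down through the killprocess helper traced twice above (common/autotest_common.sh@949-@973). A hedged reconstruction: the probes are verbatim from the trace, while the branch bodies (what happens on an empty pid, a dead pid, or a sudo-owned process) are assumptions.

# Sketch of autotest_common.sh killprocess, pieced together from the traces.
killprocess() {
	local pid=$1
	[ -z "$pid" ] && return 1  # @949: refuse an empty pid (assumed return)
	kill -0 "$pid" || return 0  # @953: already gone, nothing to do (assumed)
	if [ "$(uname)" = Linux ]; then  # @954
		process_name=$(ps --no-headers -o comm= "$pid")  # @955
	fi
	if [ "$process_name" = sudo ]; then  # @959
		sudo kill "$pid"  # assumption: a sudo-owned process needs sudo to kill
	else
		echo "killing process with pid $pid"  # @967
		kill "$pid"  # @968
	fi
	wait "$pid"  # @973: reap the child so the exit status is collected
}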
00:41:17.841 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:41:17.841 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /home/vagrant/spdk_repo/spdk/../output ']' 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /home/vagrant/spdk_repo/spdk/test/common/build_config.sh ]] 00:41:17.841 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /home/vagrant/spdk_repo/spdk/test/common/build_config.sh 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 
00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/home/vagrant/spdk_repo/dpdk/build 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//home/vagrant/spdk_repo/dpdk/build/include 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@58 -- 
# CONFIG_UBSAN=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=n 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:41:17.841 12:43:41 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:41:17.842 12:43:41 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:41:17.842 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/common 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/home/vagrant/spdk_repo/spdk/test/common 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/home/vagrant/spdk_repo/spdk 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/home/vagrant/spdk_repo/spdk/build/bin 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/home/vagrant/spdk_repo/spdk/test/app 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/home/vagrant/spdk_repo/spdk/build/examples 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /home/vagrant/spdk_repo/spdk/include/spdk/config.h ]] 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:41:17.842 #define SPDK_CONFIG_H 00:41:17.842 #define SPDK_CONFIG_APPS 1 00:41:17.842 #define SPDK_CONFIG_ARCH native 00:41:17.842 #define SPDK_CONFIG_ASAN 1 00:41:17.842 #undef SPDK_CONFIG_AVAHI 00:41:17.842 #undef SPDK_CONFIG_CET 00:41:17.842 #define SPDK_CONFIG_COVERAGE 1 00:41:17.842 #define SPDK_CONFIG_CROSS_PREFIX 00:41:17.842 #undef SPDK_CONFIG_CRYPTO 00:41:17.842 #undef SPDK_CONFIG_CRYPTO_MLX5 00:41:17.842 #undef SPDK_CONFIG_CUSTOMOCF 00:41:17.842 #undef SPDK_CONFIG_DAOS 00:41:17.842 #define SPDK_CONFIG_DAOS_DIR 00:41:17.842 #define SPDK_CONFIG_DEBUG 1 00:41:17.842 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:41:17.842 #define SPDK_CONFIG_DPDK_DIR /home/vagrant/spdk_repo/dpdk/build 00:41:17.842 #define SPDK_CONFIG_DPDK_INC_DIR //home/vagrant/spdk_repo/dpdk/build/include 00:41:17.842 #define SPDK_CONFIG_DPDK_LIB_DIR /home/vagrant/spdk_repo/dpdk/build/lib 00:41:17.842 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:41:17.842 #undef SPDK_CONFIG_DPDK_UADK 00:41:17.842 #define SPDK_CONFIG_ENV /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:41:17.842 #define SPDK_CONFIG_EXAMPLES 1 00:41:17.842 #undef SPDK_CONFIG_FC 00:41:17.842 #define SPDK_CONFIG_FC_PATH 00:41:17.842 #define SPDK_CONFIG_FIO_PLUGIN 1 00:41:17.842 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:41:17.842 #undef SPDK_CONFIG_FUSE 00:41:17.842 #undef SPDK_CONFIG_FUZZER 00:41:17.842 #define SPDK_CONFIG_FUZZER_LIB 00:41:17.842 #undef SPDK_CONFIG_GOLANG 00:41:17.842 #undef SPDK_CONFIG_HAVE_ARC4RANDOM 00:41:17.842 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:41:17.842 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:41:17.842 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:41:17.842 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:41:17.842 #undef SPDK_CONFIG_HAVE_LIBBSD 00:41:17.842 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:41:17.842 #define SPDK_CONFIG_IDXD 1 00:41:17.842 #undef SPDK_CONFIG_IDXD_KERNEL 00:41:17.842 #undef SPDK_CONFIG_IPSEC_MB 00:41:17.842 #define SPDK_CONFIG_IPSEC_MB_DIR 00:41:17.842 #define SPDK_CONFIG_ISAL 1 00:41:17.842 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:41:17.842 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:41:17.842 #define SPDK_CONFIG_LIBDIR 00:41:17.842 #undef SPDK_CONFIG_LTO 00:41:17.842 #define SPDK_CONFIG_MAX_LCORES 00:41:17.842 #define SPDK_CONFIG_NVME_CUSE 1 00:41:17.842 #undef SPDK_CONFIG_OCF 00:41:17.842 #define SPDK_CONFIG_OCF_PATH 00:41:17.842 #define SPDK_CONFIG_OPENSSL_PATH 00:41:17.842 #undef SPDK_CONFIG_PGO_CAPTURE 00:41:17.842 #define SPDK_CONFIG_PGO_DIR 00:41:17.842 #undef SPDK_CONFIG_PGO_USE 00:41:17.842 #define SPDK_CONFIG_PREFIX /usr/local 
00:41:17.842 #undef SPDK_CONFIG_RAID5F 00:41:17.842 #undef SPDK_CONFIG_RBD 00:41:17.842 #define SPDK_CONFIG_RDMA 1 00:41:17.842 #define SPDK_CONFIG_RDMA_PROV verbs 00:41:17.842 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:41:17.842 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:41:17.842 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:41:17.842 #undef SPDK_CONFIG_SHARED 00:41:17.842 #undef SPDK_CONFIG_SMA 00:41:17.842 #define SPDK_CONFIG_TESTS 1 00:41:17.842 #undef SPDK_CONFIG_TSAN 00:41:17.842 #undef SPDK_CONFIG_UBLK 00:41:17.842 #undef SPDK_CONFIG_UBSAN 00:41:17.842 #define SPDK_CONFIG_UNIT_TESTS 1 00:41:17.842 #undef SPDK_CONFIG_URING 00:41:17.842 #define SPDK_CONFIG_URING_PATH 00:41:17.842 #undef SPDK_CONFIG_URING_ZNS 00:41:17.842 #undef SPDK_CONFIG_USDT 00:41:17.842 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:41:17.842 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:41:17.842 #undef SPDK_CONFIG_VFIO_USER 00:41:17.842 #define SPDK_CONFIG_VFIO_USER_DIR 00:41:17.842 #define SPDK_CONFIG_VHOST 1 00:41:17.842 #define SPDK_CONFIG_VIRTIO 1 00:41:17.842 #undef SPDK_CONFIG_VTUNE 00:41:17.842 #define SPDK_CONFIG_VTUNE_DIR 00:41:17.842 #define SPDK_CONFIG_WERROR 1 00:41:17.842 #define SPDK_CONFIG_WPDK_DIR 00:41:17.842 #undef SPDK_CONFIG_XNVME 00:41:17.842 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:41:17.842 12:43:41 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:41:17.842 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:41:17.842 12:43:41 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:41:17.842 12:43:41 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:41:17.842 12:43:41 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:41:17.842 12:43:41 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:17.842 12:43:41 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:17.842 12:43:41 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:17.842 12:43:41 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:41:17.842 12:43:41 reap_unregistered_poller -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:17.842 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@6 -- # dirname /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@6 -- # readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@7 -- # readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm/../../../ 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/home/vagrant/spdk_repo/spdk 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/home/vagrant/spdk_repo/spdk/.run_test_name 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/home/vagrant/spdk_repo/spdk/../output/power 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@81 -- # [[ QEMU != QEMU ]] 00:41:18.101 12:43:41 reap_unregistered_poller -- pm/common@88 -- # [[ ! 
-d /home/vagrant/spdk_repo/spdk/../output/power ]] 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- 
common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : /home/vagrant/spdk_repo/dpdk/build 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:41:18.101 12:43:41 reap_unregistered_poller -- 
common/autotest_common.sh@130 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : v22.11.4 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 1 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:41:18.101 12:43:41 
reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:41:18.101 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@185 -- # 
PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN= 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN= 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export 'VFIO_QEMU_BIN=/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN='/usr/local/qemu/vfio-user*/bin/qemu-system-x86_64' 00:41:18.102 12:43:41 reap_unregistered_poller -- 
common/autotest_common.sh@256 -- # export AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 0 -eq 1 ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 0 -eq 1 ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j10 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 232691 ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 232691 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.r0A3QP 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:41:18.102 12:43:41 reap_unregistered_poller -- 
common/autotest_common.sh@345 -- # [[ -n '' ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /home/vagrant/spdk_repo/spdk/test/interrupt /tmp/spdk.r0A3QP/tests/interrupt /tmp/spdk.r0A3QP 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=devtmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=4194304 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=4194304 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=6267031552 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=6270406656 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=2487136256 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=2508165120 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=21028864 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/vda5 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=xfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=12111863808 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=20303577088 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=8191713280 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
mounts["$mount"]=/dev/vda2 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=xfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=896184320 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=1042161664 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=145977344 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/vda1 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=vfat 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=97312768 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=104607744 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=7294976 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=1254076416 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=1254080512 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=:/mnt/jenkins_nvme/jenkins/workspace/rocky9-vg-autotest/rocky9-libvirt/output 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=fuse.sshfs 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=89786867712 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=105088212992 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9915912192 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:41:18.102 * Looking for test storage... 
00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=12111863808 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ xfs == tmpfs ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ xfs == ramfs ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=10406305792 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/interrupt 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/interrupt 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:18.102 * Found test storage at /home/vagrant/spdk_repo/spdk/test/interrupt 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1681 -- # set -o errtrace 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # true 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@1688 -- # xtrace_fd 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /home/vagrant/spdk_repo/spdk/test/interrupt/common.sh 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/examples/interrupt_tgt 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/examples/interrupt_tgt 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=232734 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /home/vagrant/spdk_repo/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:41:18.102 12:43:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 232734 /var/tmp/spdk.sock 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@830 -- # '[' -z 232734 ']' 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local max_retries=100 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:41:18.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
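Once the target answers on /var/tmp/spdk.sock, the poller checks exercised below reduce to a single RPC plus two jq filters; a minimal equivalent using the paths from this run (rpc_cmd in the harness wraps the same call):

    rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    app_thread=$("$rpc_py" -s /var/tmp/spdk.sock thread_get_pollers | jq -r '.threads[0]')
    native_pollers=$(jq -r '.active_pollers[].name' <<< "$app_thread")      # [] in this run
    native_pollers+=" $(jq -r '.timed_pollers[].name' <<< "$app_thread")"   # rpc_subsystem_poll_servers

The test repeats the same query after registering and unregistering the AIO poller to confirm only rpc_subsystem_poll_servers remains.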
00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@839 -- # xtrace_disable 00:41:18.102 12:43:41 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:41:18.102 [2024-06-07 12:43:41.672580] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:18.102 [2024-06-07 12:43:41.673355] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232734 ] 00:41:18.360 [2024-06-07 12:43:41.850119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:41:18.360 [2024-06-07 12:43:41.944178] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:41:18.360 [2024-06-07 12:43:41.944286] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:18.360 [2024-06-07 12:43:41.944287] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:41:18.618 [2024-06-07 12:43:42.066038] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:41:19.203 12:43:42 reap_unregistered_poller -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:41:19.203 12:43:42 reap_unregistered_poller -- common/autotest_common.sh@863 -- # return 0 00:41:19.203 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:41:19.203 12:43:42 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:41:19.203 12:43:42 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:41:19.203 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:41:19.203 12:43:42 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:41:19.203 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:41:19.203 "name": "app_thread", 00:41:19.203 "id": 1, 00:41:19.203 "active_pollers": [], 00:41:19.203 "timed_pollers": [ 00:41:19.203 { 00:41:19.203 "name": "rpc_subsystem_poll_servers", 00:41:19.203 "id": 1, 00:41:19.203 "state": "waiting", 00:41:19.203 "run_count": 0, 00:41:19.203 "busy_count": 0, 00:41:19.203 "period_ticks": 8400000 00:41:19.203 } 00:41:19.203 ], 00:41:19.203 "paused_pollers": [] 00:41:19.203 }' 00:41:19.203 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:41:19.203 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:41:19.203 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:41:19.203 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:41:19.204 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:41:19.204 12:43:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:41:19.204 12:43:42 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:41:19.204 12:43:42 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:41:19.204 12:43:42 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/home/vagrant/spdk_repo/spdk/test/interrupt/aiofile bs=2048 count=5000 00:41:19.204 
5000+0 records in 00:41:19.204 5000+0 records out 00:41:19.204 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0209735 s, 488 MB/s 00:41:19.204 12:43:42 reap_unregistered_poller -- interrupt/common.sh@77 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_aio_create /home/vagrant/spdk_repo/spdk/test/interrupt/aiofile AIO0 2048 00:41:19.768 AIO0 00:41:19.768 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_wait_for_examine 00:41:20.025 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:41:20.025 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:41:20.025 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:41:20.025 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:41:20.025 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:41:20.025 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:41:20.025 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:41:20.025 "name": "app_thread", 00:41:20.025 "id": 1, 00:41:20.025 "active_pollers": [], 00:41:20.025 "timed_pollers": [ 00:41:20.025 { 00:41:20.025 "name": "rpc_subsystem_poll_servers", 00:41:20.025 "id": 1, 00:41:20.025 "state": "waiting", 00:41:20.025 "run_count": 0, 00:41:20.025 "busy_count": 0, 00:41:20.025 "period_ticks": 8400000 00:41:20.025 } 00:41:20.025 ], 00:41:20.025 "paused_pollers": [] 00:41:20.025 }' 00:41:20.025 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:41:20.026 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:41:20.026 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:41:20.026 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:41:20.284 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:41:20.284 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:41:20.284 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:41:20.284 12:43:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 232734 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@949 -- # '[' -z 232734 ']' 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@953 -- # kill -0 232734 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@954 -- # uname 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 232734 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@967 -- # echo 
'killing process with pid 232734' 00:41:20.284 killing process with pid 232734 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@968 -- # kill 232734 00:41:20.284 12:43:43 reap_unregistered_poller -- common/autotest_common.sh@973 -- # wait 232734 00:41:20.542 12:43:44 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:41:20.542 12:43:44 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /home/vagrant/spdk_repo/spdk/test/interrupt/aiofile 00:41:20.542 00:41:20.542 real 0m2.793s 00:41:20.542 user 0m1.822s 00:41:20.542 sys 0m0.637s 00:41:20.542 12:43:44 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:20.542 12:43:44 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:41:20.542 ************************************ 00:41:20.542 END TEST reap_unregistered_poller 00:41:20.542 ************************************ 00:41:20.542 12:43:44 -- spdk/autotest.sh@198 -- # uname -s 00:41:20.542 12:43:44 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:41:20.542 12:43:44 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:41:20.542 12:43:44 -- spdk/autotest.sh@205 -- # [[ 0 -eq 0 ]] 00:41:20.542 12:43:44 -- spdk/autotest.sh@206 -- # run_test spdk_dd /home/vagrant/spdk_repo/spdk/test/dd/dd.sh 00:41:20.542 12:43:44 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:20.542 12:43:44 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:20.542 12:43:44 -- common/autotest_common.sh@10 -- # set +x 00:41:20.800 ************************************ 00:41:20.800 START TEST spdk_dd 00:41:20.800 ************************************ 00:41:20.800 12:43:44 spdk_dd -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dd/dd.sh 00:41:20.800 * Looking for test storage... 
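The device discovery that the spdk_dd suite runs just below (nvme_in_userspace via scripts/common.sh) boils down to one lspci pipeline; reconstructed from the trace that follows:

    # class 01 = mass storage, subclass 08 = NVM, progif 02 = NVMe -> match "0108" plus -p02
    lspci -mm -n -D | grep -i -- -p02 | awk -v 'cc="0108"' -F ' ' '{if (cc ~ $2) print $1}' | tr -d '"'
    # prints 0000:00:10.0 in this run: the QEMU NVMe controller the dd tests target

Each returned BDF is then filtered through pci_can_use and checked for an nvme kernel binding before being added to the bdfs list.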
00:41:20.800 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dd 00:41:20.800 12:43:44 spdk_dd -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:41:20.800 12:43:44 spdk_dd -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:41:20.800 12:43:44 spdk_dd -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:41:20.800 12:43:44 spdk_dd -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:41:20.801 12:43:44 spdk_dd -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:20.801 12:43:44 spdk_dd -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:20.801 12:43:44 spdk_dd -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:20.801 12:43:44 spdk_dd -- paths/export.sh@5 -- # export PATH 00:41:20.801 12:43:44 spdk_dd -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:20.801 12:43:44 spdk_dd -- dd/dd.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:41:21.059 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:41:21.059 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver 00:41:21.059 12:43:44 spdk_dd -- dd/dd.sh@11 -- # nvmes=($(nvme_in_userspace)) 00:41:21.059 12:43:44 spdk_dd -- dd/dd.sh@11 -- # nvme_in_userspace 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@309 -- # local bdf bdfs 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@310 -- # local nvmes 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@312 -- # [[ -n '' ]] 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@315 -- # nvmes=($(iter_pci_class_code 01 08 02)) 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@315 -- # iter_pci_class_code 01 08 02 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@295 -- # local bdf= 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@297 -- # iter_all_pci_class_code 01 08 02 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@230 -- # local class 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@231 -- # local subclass 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@232 -- # local progif 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@233 -- # printf %02x 1 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@233 -- # class=01 00:41:21.059 12:43:44 spdk_dd -- 
scripts/common.sh@234 -- # printf %02x 8 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@234 -- # subclass=08 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@235 -- # printf %02x 2 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@235 -- # progif=02 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@237 -- # hash lspci 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@238 -- # '[' 02 '!=' 00 ']' 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@239 -- # lspci -mm -n -D 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@240 -- # grep -i -- -p02 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@241 -- # awk -v 'cc="0108"' -F ' ' '{if (cc ~ $2) print $1}' 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@242 -- # tr -d '"' 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@297 -- # for bdf in $(iter_all_pci_class_code "$@") 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@298 -- # pci_can_use 0000:00:10.0 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@15 -- # local i 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@18 -- # [[ =~ 0000:00:10.0 ]] 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@22 -- # [[ -z '' ]] 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@24 -- # return 0 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@299 -- # echo 0000:00:10.0 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@318 -- # for bdf in "${nvmes[@]}" 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@319 -- # [[ -e /sys/bus/pci/drivers/nvme/0000:00:10.0 ]] 00:41:21.059 12:43:44 spdk_dd -- scripts/common.sh@320 -- # uname -s 00:41:21.060 12:43:44 spdk_dd -- scripts/common.sh@320 -- # [[ Linux == FreeBSD ]] 00:41:21.060 12:43:44 spdk_dd -- scripts/common.sh@323 -- # bdfs+=("$bdf") 00:41:21.060 12:43:44 spdk_dd -- scripts/common.sh@325 -- # (( 1 )) 00:41:21.060 12:43:44 spdk_dd -- scripts/common.sh@326 -- # printf '%s\n' 0000:00:10.0 00:41:21.060 12:43:44 spdk_dd -- dd/dd.sh@13 -- # check_liburing 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@139 -- # local lib so 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@140 -- # local -g liburing_in_use=0 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@137 -- # LD_TRACE_LOADED_OBJECTS=1 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@137 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ linux-vdso.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libasan.so.6 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libnuma.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libibverbs.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ librdmacm.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libuuid.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libssl.so.3 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 
-- # [[ libcrypto.so.3 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libm.so.6 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libfuse3.so.3 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libkeyutils.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libaio.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libiscsi.so.9 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libc.so.6 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libstdc++.so.6 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libgcc_s.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ /lib64/ld-linux-x86-64.so.2 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libnl-route-3.so.200 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libnl-3.so.200 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libz.so.1 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libgcrypt.so.20 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@143 -- # [[ libgpg-error.so.0 == liburing.so.* ]] 00:41:21.060 12:43:44 spdk_dd -- dd/common.sh@142 -- # read -r lib _ so _ 00:41:21.060 12:43:44 spdk_dd -- dd/dd.sh@15 -- # (( liburing_in_use == 0 && SPDK_TEST_URING == 1 )) 00:41:21.060 12:43:44 spdk_dd -- dd/dd.sh@20 -- # run_test spdk_dd_basic_rw /home/vagrant/spdk_repo/spdk/test/dd/basic_rw.sh 0000:00:10.0 00:41:21.060 12:43:44 spdk_dd -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:41:21.060 12:43:44 spdk_dd -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:21.060 12:43:44 spdk_dd -- common/autotest_common.sh@10 -- # set +x 00:41:21.060 ************************************ 00:41:21.060 START TEST spdk_dd_basic_rw 00:41:21.060 ************************************ 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dd/basic_rw.sh 0000:00:10.0 00:41:21.319 * Looking for test storage... 
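The check_liburing loop traced above decides whether spdk_dd was linked against liburing by asking the dynamic loader to list dependencies; roughly:

    liburing_in_use=0
    while read -r lib _ so _; do
        [[ $lib == liburing.so.* ]] && liburing_in_use=1   # never matches in this run
    done < <(LD_TRACE_LOADED_OBJECTS=1 /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd)

The flag is then cross-checked against SPDK_TEST_URING (0 in this run, per the environment dump earlier), so the uring-specific dd paths stay disabled.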
00:41:21.319 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dd 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- paths/export.sh@5 -- # export PATH 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@80 -- # trap cleanup EXIT 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@82 -- # nvmes=("$@") 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@83 -- # nvme0=Nvme0 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@83 -- # nvme0_pci=0000:00:10.0 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@83 -- # bdev0=Nvme0n1 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@85 -- # method_bdev_nvme_attach_controller_0=(['name']='Nvme0' ['traddr']='0000:00:10.0' ['trtype']='pcie') 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@85 -- # declare -A method_bdev_nvme_attach_controller_0 
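get_native_nvme_bs, entered just below, captures the controller report printed after it by shelling out to spdk_nvme_identify; the capture step is just the following (the subsequent scraping of the current LBA format's data size is assumed from context, not quoted from source):

    pci=0000:00:10.0
    mapfile -t id < <(/home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r "trtype:pcie traddr:$pci")
    # the helper then parses ${id[@]} for the in-use LBA format to pick the native
    # block size (512 vs 4096) before dd reads/writes against Nvme0n1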
00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@91 -- # test_file0=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@92 -- # test_file1=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@93 -- # get_native_nvme_bs 0000:00:10.0 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@124 -- # local pci=0000:00:10.0 lbaf id 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@126 -- # mapfile -t id 00:41:21.319 12:43:44 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@126 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:pcie traddr:0000:00:10.0' 00:41:21.588 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@129 -- # [[ ===================================================== NVMe Controller at 0000:00:10.0 [1b36:0010] ===================================================== Controller Capabilities/Features ================================ Vendor ID: 1b36 Subsystem Vendor ID: 1af4 Serial Number: 12340 Model Number: QEMU NVMe Ctrl Firmware Version: 8.0.0 Recommended Arb Burst: 6 IEEE OUI Identifier: 00 54 52 Multi-path I/O May have multiple subsystem ports: No May have multiple controllers: No Associated with SR-IOV VF: No Max Data Transfer Size: 524288 Max Number of Namespaces: 256 Max Number of I/O Queues: 64 NVMe Specification Version (VS): 1.4 NVMe Specification Version (Identify): 1.4 Maximum Queue Entries: 2048 Contiguous Queues Required: Yes Arbitration Mechanisms Supported Weighted Round Robin: Not Supported Vendor Specific: Not Supported Reset Timeout: 7500 ms Doorbell Stride: 4 bytes NVM Subsystem Reset: Not Supported Command Sets Supported NVM Command Set: Supported Boot Partition: Not Supported Memory Page Size Minimum: 4096 bytes Memory Page Size Maximum: 65536 bytes Persistent Memory Region: Not Supported Optional Asynchronous Events Supported Namespace Attribute Notices: Supported Firmware Activation Notices: Not Supported ANA Change Notices: Not Supported PLE Aggregate Log Change Notices: Not Supported LBA Status Info Alert Notices: Not Supported EGE Aggregate Log Change Notices: Not Supported Normal NVM Subsystem Shutdown event: Not Supported Zone Descriptor Change Notices: Not Supported Discovery Log Change Notices: Not Supported Controller Attributes 128-bit Host Identifier: Not Supported Non-Operational Permissive Mode: Not Supported NVM Sets: Not Supported Read Recovery Levels: Not Supported Endurance Groups: Not Supported Predictable Latency Mode: Not Supported Traffic Based Keep ALive: Not Supported Namespace Granularity: Not Supported SQ Associations: Not Supported UUID List: Not Supported Multi-Domain Subsystem: Not Supported Fixed Capacity Management: Not Supported Variable Capacity Management: Not Supported Delete Endurance Group: Not Supported Delete NVM Set: Not Supported Extended LBA Formats Supported: Supported Flexible Data Placement Supported: Not Supported Controller Memory Buffer Support ================================ Supported: No Persistent Memory Region Support ================================ Supported: No Admin Command Set Attributes ============================ Security Send/Receive: Not Supported Format NVM: Supported Firmware Activate/Download: Not Supported Namespace Management: Supported Device Self-Test: Not Supported Directives: Supported NVMe-MI: Not Supported Virtualization Management: Not Supported Doorbell Buffer Config: Supported Get LBA Status Capability: Not Supported Command 
& Feature Lockdown Capability: Not Supported Abort Command Limit: 4 Async Event Request Limit: 4 Number of Firmware Slots: N/A Firmware Slot 1 Read-Only: N/A Firmware Activation Without Reset: N/A Multiple Update Detection Support: N/A Firmware Update Granularity: No Information Provided Per-Namespace SMART Log: Yes Asymmetric Namespace Access Log Page: Not Supported Subsystem NQN: nqn.2019-08.org.qemu:12340 Command Effects Log Page: Supported Get Log Page Extended Data: Supported Telemetry Log Pages: Not Supported Persistent Event Log Pages: Not Supported Supported Log Pages Log Page: May Support Commands Supported & Effects Log Page: Not Supported Feature Identifiers & Effects Log Page:May Support NVMe-MI Commands & Effects Log Page: May Support Data Area 4 for Telemetry Log: Not Supported Error Log Page Entries Supported: 1 Keep Alive: Not Supported NVM Command Set Attributes ========================== Submission Queue Entry Size Max: 64 Min: 64 Completion Queue Entry Size Max: 16 Min: 16 Number of Namespaces: 256 Compare Command: Supported Write Uncorrectable Command: Not Supported Dataset Management Command: Supported Write Zeroes Command: Supported Set Features Save Field: Supported Reservations: Not Supported Timestamp: Supported Copy: Supported Volatile Write Cache: Present Atomic Write Unit (Normal): 1 Atomic Write Unit (PFail): 1 Atomic Compare & Write Unit: 1 Fused Compare & Write: Not Supported Scatter-Gather List SGL Command Set: Supported SGL Keyed: Not Supported SGL Bit Bucket Descriptor: Not Supported SGL Metadata Pointer: Not Supported Oversized SGL: Not Supported SGL Metadata Address: Not Supported SGL Offset: Not Supported Transport SGL Data Block: Not Supported Replay Protected Memory Block: Not Supported Firmware Slot Information ========================= Active slot: 1 Slot 1 Firmware Revision: 1.0 Commands Supported and Effects ============================== Admin Commands -------------- Delete I/O Submission Queue (00h): Supported Create I/O Submission Queue (01h): Supported Get Log Page (02h): Supported Delete I/O Completion Queue (04h): Supported Create I/O Completion Queue (05h): Supported Identify (06h): Supported Abort (08h): Supported Set Features (09h): Supported Get Features (0Ah): Supported Asynchronous Event Request (0Ch): Supported Namespace Attachment (15h): Supported NS-Inventory-Change Directive Send (19h): Supported Directive Receive (1Ah): Supported Virtualization Management (1Ch): Supported Doorbell Buffer Config (7Ch): Supported Format NVM (80h): Supported LBA-Change I/O Commands ------------ Flush (00h): Supported LBA-Change Write (01h): Supported LBA-Change Read (02h): Supported Compare (05h): Supported Write Zeroes (08h): Supported LBA-Change Dataset Management (09h): Supported LBA-Change Unknown (0Ch): Supported Unknown (12h): Supported Copy (19h): Supported LBA-Change Unknown (1Dh): Supported LBA-Change Error Log ========= Arbitration =========== Arbitration Burst: no limit Power Management ================ Number of Power States: 1 Current Power State: Power State #0 Power State #0: Max Power: 25.00 W Non-Operational State: Operational Entry Latency: 16 microseconds Exit Latency: 4 microseconds Relative Read Throughput: 0 Relative Read Latency: 0 Relative Write Throughput: 0 Relative Write Latency: 0 Idle Power: Not Reported Active Power: Not Reported Non-Operational Permissive Mode: Not Supported Health Information ================== Critical Warnings: Available Spare Space: OK Temperature: OK Device Reliability: OK Read Only: No Volatile 
Memory Backup: OK Current Temperature: 323 Kelvin (50 Celsius) Temperature Threshold: 343 Kelvin (70 Celsius) Available Spare: 0% Available Spare Threshold: 0% Life Percentage Used: 0% Data Units Read: 106 Data Units Written: 7 Host Read Commands: 2315 Host Write Commands: 113 Controller Busy Time: 0 minutes Power Cycles: 0 Power On Hours: 0 hours Unsafe Shutdowns: 0 Unrecoverable Media Errors: 0 Lifetime Error Log Entries: 0 Warning Temperature Time: 0 minutes Critical Temperature Time: 0 minutes Number of Queues ================ Number of I/O Submission Queues: 64 Number of I/O Completion Queues: 64 ZNS Specific Controller Data ============================ Zone Append Size Limit: 0 Active Namespaces ================= Namespace ID:1 Error Recovery Timeout: Unlimited Command Set Identifier: NVM (00h) Deallocate: Supported Deallocated/Unwritten Error: Supported Deallocated Read Value: All 0x00 Deallocate in Write Zeroes: Not Supported Deallocated Guard Field: 0xFFFF Flush: Supported Reservation: Not Supported Namespace Sharing Capabilities: Private Size (in LBAs): 1310720 (5GiB) Capacity (in LBAs): 1310720 (5GiB) Utilization (in LBAs): 1310720 (5GiB) Thin Provisioning: Not Supported Per-NS Atomic Units: No Maximum Single Source Range Length: 128 Maximum Copy Length: 128 Maximum Source Range Count: 128 NGUID/EUI64 Never Reused: No Namespace Write Protected: No Number of LBA Formats: 8 Current LBA Format: LBA Format #04 LBA Format #00: Data Size: 512 Metadata Size: 0 LBA Format #01: Data Size: 512 Metadata Size: 8 LBA Format #02: Data Size: 512 Metadata Size: 16 LBA Format #03: Data Size: 512 Metadata Size: 64 LBA Format #04: Data Size: 4096 Metadata Size: 0 LBA Format #05: Data Size: 4096 Metadata Size: 8 LBA Format #06: Data Size: 4096 Metadata Size: 16 LBA Format #07: Data Size: 4096 Metadata Size: 64 =~ Current LBA Format: *LBA Format #([0-9]+) ]] 00:41:21.588 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@130 -- # lbaf=04 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@131 -- # [[ ===================================================== NVMe Controller at 0000:00:10.0 [1b36:0010] ===================================================== Controller Capabilities/Features ================================ Vendor ID: 1b36 Subsystem Vendor ID: 1af4 Serial Number: 12340 Model Number: QEMU NVMe Ctrl Firmware Version: 8.0.0 Recommended Arb Burst: 6 IEEE OUI Identifier: 00 54 52 Multi-path I/O May have multiple subsystem ports: No May have multiple controllers: No Associated with SR-IOV VF: No Max Data Transfer Size: 524288 Max Number of Namespaces: 256 Max Number of I/O Queues: 64 NVMe Specification Version (VS): 1.4 NVMe Specification Version (Identify): 1.4 Maximum Queue Entries: 2048 Contiguous Queues Required: Yes Arbitration Mechanisms Supported Weighted Round Robin: Not Supported Vendor Specific: Not Supported Reset Timeout: 7500 ms Doorbell Stride: 4 bytes NVM Subsystem Reset: Not Supported Command Sets Supported NVM Command Set: Supported Boot Partition: Not Supported Memory Page Size Minimum: 4096 bytes Memory Page Size Maximum: 65536 bytes Persistent Memory Region: Not Supported Optional Asynchronous Events Supported Namespace Attribute Notices: Supported Firmware Activation Notices: Not Supported ANA Change Notices: Not Supported PLE Aggregate Log Change Notices: Not Supported LBA Status Info Alert Notices: Not Supported EGE Aggregate Log Change Notices: Not Supported Normal NVM Subsystem Shutdown event: Not Supported Zone Descriptor Change Notices: Not Supported Discovery 
Log Change Notices: Not Supported Controller Attributes 128-bit Host Identifier: Not Supported Non-Operational Permissive Mode: Not Supported NVM Sets: Not Supported Read Recovery Levels: Not Supported Endurance Groups: Not Supported Predictable Latency Mode: Not Supported Traffic Based Keep ALive: Not Supported Namespace Granularity: Not Supported SQ Associations: Not Supported UUID List: Not Supported Multi-Domain Subsystem: Not Supported Fixed Capacity Management: Not Supported Variable Capacity Management: Not Supported Delete Endurance Group: Not Supported Delete NVM Set: Not Supported Extended LBA Formats Supported: Supported Flexible Data Placement Supported: Not Supported Controller Memory Buffer Support ================================ Supported: No Persistent Memory Region Support ================================ Supported: No Admin Command Set Attributes ============================ Security Send/Receive: Not Supported Format NVM: Supported Firmware Activate/Download: Not Supported Namespace Management: Supported Device Self-Test: Not Supported Directives: Supported NVMe-MI: Not Supported Virtualization Management: Not Supported Doorbell Buffer Config: Supported Get LBA Status Capability: Not Supported Command & Feature Lockdown Capability: Not Supported Abort Command Limit: 4 Async Event Request Limit: 4 Number of Firmware Slots: N/A Firmware Slot 1 Read-Only: N/A Firmware Activation Without Reset: N/A Multiple Update Detection Support: N/A Firmware Update Granularity: No Information Provided Per-Namespace SMART Log: Yes Asymmetric Namespace Access Log Page: Not Supported Subsystem NQN: nqn.2019-08.org.qemu:12340 Command Effects Log Page: Supported Get Log Page Extended Data: Supported Telemetry Log Pages: Not Supported Persistent Event Log Pages: Not Supported Supported Log Pages Log Page: May Support Commands Supported & Effects Log Page: Not Supported Feature Identifiers & Effects Log Page:May Support NVMe-MI Commands & Effects Log Page: May Support Data Area 4 for Telemetry Log: Not Supported Error Log Page Entries Supported: 1 Keep Alive: Not Supported NVM Command Set Attributes ========================== Submission Queue Entry Size Max: 64 Min: 64 Completion Queue Entry Size Max: 16 Min: 16 Number of Namespaces: 256 Compare Command: Supported Write Uncorrectable Command: Not Supported Dataset Management Command: Supported Write Zeroes Command: Supported Set Features Save Field: Supported Reservations: Not Supported Timestamp: Supported Copy: Supported Volatile Write Cache: Present Atomic Write Unit (Normal): 1 Atomic Write Unit (PFail): 1 Atomic Compare & Write Unit: 1 Fused Compare & Write: Not Supported Scatter-Gather List SGL Command Set: Supported SGL Keyed: Not Supported SGL Bit Bucket Descriptor: Not Supported SGL Metadata Pointer: Not Supported Oversized SGL: Not Supported SGL Metadata Address: Not Supported SGL Offset: Not Supported Transport SGL Data Block: Not Supported Replay Protected Memory Block: Not Supported Firmware Slot Information ========================= Active slot: 1 Slot 1 Firmware Revision: 1.0 Commands Supported and Effects ============================== Admin Commands -------------- Delete I/O Submission Queue (00h): Supported Create I/O Submission Queue (01h): Supported Get Log Page (02h): Supported Delete I/O Completion Queue (04h): Supported Create I/O Completion Queue (05h): Supported Identify (06h): Supported Abort (08h): Supported Set Features (09h): Supported Get Features (0Ah): Supported Asynchronous Event Request (0Ch): Supported 
Namespace Attachment (15h): Supported NS-Inventory-Change Directive Send (19h): Supported Directive Receive (1Ah): Supported Virtualization Management (1Ch): Supported Doorbell Buffer Config (7Ch): Supported Format NVM (80h): Supported LBA-Change I/O Commands ------------ Flush (00h): Supported LBA-Change Write (01h): Supported LBA-Change Read (02h): Supported Compare (05h): Supported Write Zeroes (08h): Supported LBA-Change Dataset Management (09h): Supported LBA-Change Unknown (0Ch): Supported Unknown (12h): Supported Copy (19h): Supported LBA-Change Unknown (1Dh): Supported LBA-Change Error Log ========= Arbitration =========== Arbitration Burst: no limit Power Management ================ Number of Power States: 1 Current Power State: Power State #0 Power State #0: Max Power: 25.00 W Non-Operational State: Operational Entry Latency: 16 microseconds Exit Latency: 4 microseconds Relative Read Throughput: 0 Relative Read Latency: 0 Relative Write Throughput: 0 Relative Write Latency: 0 Idle Power: Not Reported Active Power: Not Reported Non-Operational Permissive Mode: Not Supported Health Information ================== Critical Warnings: Available Spare Space: OK Temperature: OK Device Reliability: OK Read Only: No Volatile Memory Backup: OK Current Temperature: 323 Kelvin (50 Celsius) Temperature Threshold: 343 Kelvin (70 Celsius) Available Spare: 0% Available Spare Threshold: 0% Life Percentage Used: 0% Data Units Read: 106 Data Units Written: 7 Host Read Commands: 2315 Host Write Commands: 113 Controller Busy Time: 0 minutes Power Cycles: 0 Power On Hours: 0 hours Unsafe Shutdowns: 0 Unrecoverable Media Errors: 0 Lifetime Error Log Entries: 0 Warning Temperature Time: 0 minutes Critical Temperature Time: 0 minutes Number of Queues ================ Number of I/O Submission Queues: 64 Number of I/O Completion Queues: 64 ZNS Specific Controller Data ============================ Zone Append Size Limit: 0 Active Namespaces ================= Namespace ID:1 Error Recovery Timeout: Unlimited Command Set Identifier: NVM (00h) Deallocate: Supported Deallocated/Unwritten Error: Supported Deallocated Read Value: All 0x00 Deallocate in Write Zeroes: Not Supported Deallocated Guard Field: 0xFFFF Flush: Supported Reservation: Not Supported Namespace Sharing Capabilities: Private Size (in LBAs): 1310720 (5GiB) Capacity (in LBAs): 1310720 (5GiB) Utilization (in LBAs): 1310720 (5GiB) Thin Provisioning: Not Supported Per-NS Atomic Units: No Maximum Single Source Range Length: 128 Maximum Copy Length: 128 Maximum Source Range Count: 128 NGUID/EUI64 Never Reused: No Namespace Write Protected: No Number of LBA Formats: 8 Current LBA Format: LBA Format #04 LBA Format #00: Data Size: 512 Metadata Size: 0 LBA Format #01: Data Size: 512 Metadata Size: 8 LBA Format #02: Data Size: 512 Metadata Size: 16 LBA Format #03: Data Size: 512 Metadata Size: 64 LBA Format #04: Data Size: 4096 Metadata Size: 0 LBA Format #05: Data Size: 4096 Metadata Size: 8 LBA Format #06: Data Size: 4096 Metadata Size: 16 LBA Format #07: Data Size: 4096 Metadata Size: 64 =~ LBA Format #04: Data Size: *([0-9]+) ]] 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@132 -- # lbaf=4096 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@134 -- # echo 4096 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@93 -- # native_bs=4096 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@96 -- # run_test dd_bs_lt_native_bs NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/fd/62 --ob=Nvme0n1 
--bs=2048 --json /dev/fd/61 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@96 -- # : 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@96 -- # gen_conf 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@10 -- # set +x 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@10 -- # set +x 00:41:21.589 ************************************ 00:41:21.589 START TEST dd_bs_lt_native_bs 00:41:21.589 ************************************ 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@1124 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/fd/62 --ob=Nvme0n1 --bs=2048 --json /dev/fd/61 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@649 -- # local es=0 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/fd/62 --ob=Nvme0n1 --bs=2048 --json /dev/fd/61 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:41:21.589 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/fd/62 --ob=Nvme0n1 --bs=2048 --json /dev/fd/61 00:41:21.589 { 00:41:21.589 "subsystems": [ 00:41:21.589 { 00:41:21.589 "subsystem": "bdev", 00:41:21.589 "config": [ 00:41:21.589 { 00:41:21.589 "params": { 00:41:21.589 "trtype": "pcie", 00:41:21.589 "traddr": "0000:00:10.0", 00:41:21.589 "name": "Nvme0" 00:41:21.589 }, 00:41:21.589 "method": "bdev_nvme_attach_controller" 00:41:21.589 }, 00:41:21.589 { 00:41:21.589 "method": "bdev_wait_for_examine" 00:41:21.589 } 00:41:21.589 ] 00:41:21.589 } 00:41:21.589 ] 00:41:21.589 } 00:41:21.589 [2024-06-07 12:43:45.087081] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
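The get_native_nvme_bs probe that produced the two large identify dumps above reduces to two regex captures over the spdk_nvme_identify text: first the index of the current LBA format (#04 here), then that format's data size (4096 bytes). The same two patterns, standalone, with the regexes held in variables so bash's =~ treats them as patterns rather than words:

  mapfile -t id < <(/home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify \
      -r 'trtype:pcie traddr:0000:00:10.0')
  re_lbaf='Current LBA Format: *LBA Format #([0-9]+)'
  [[ ${id[*]} =~ $re_lbaf ]] && lbaf=${BASH_REMATCH[1]}       # -> 04
  re_size="LBA Format #$lbaf: Data Size: *([0-9]+)"
  [[ ${id[*]} =~ $re_size ]] && native_bs=${BASH_REMATCH[1]}  # -> 4096

dd_bs_lt_native_bs then asserts that a write with --bs=2048, half the native 4096, is rejected.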
00:41:21.589 [2024-06-07 12:43:45.087832] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233010 ] 00:41:21.875 [2024-06-07 12:43:45.236416] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:21.875 [2024-06-07 12:43:45.334557] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:22.132 [2024-06-07 12:43:45.533673] spdk_dd.c:1161:dd_run: *ERROR*: --bs value cannot be less than input (1) neither output (4096) native block size 00:41:22.132 [2024-06-07 12:43:45.534017] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:41:22.132 [2024-06-07 12:43:45.739288] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@652 -- # es=234 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@661 -- # es=106 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@662 -- # case "$es" in 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@669 -- # es=1 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:41:22.389 00:41:22.389 real 0m0.881s 00:41:22.389 user 0m0.548s 00:41:22.389 sys 0m0.269s 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_bs_lt_native_bs -- common/autotest_common.sh@10 -- # set +x 00:41:22.389 ************************************ 00:41:22.389 END TEST dd_bs_lt_native_bs 00:41:22.389 ************************************ 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@103 -- # run_test dd_rw basic_rw 4096 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:22.389 12:43:45 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@10 -- # set +x 00:41:22.389 ************************************ 00:41:22.389 START TEST dd_rw 00:41:22.389 ************************************ 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@1124 -- # basic_rw 4096 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@11 -- # local native_bs=4096 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@12 -- # local count size 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@13 -- # local qds bss 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@15 -- # qds=(1 64) 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@17 -- # for bs in {0..2} 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@18 -- # bss+=($((native_bs << bs))) 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@17 -- # for bs in {0..2} 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@18 -- # bss+=($((native_bs << bs))) 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@17 -- # for bs in {0..2} 00:41:22.390 
12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@18 -- # bss+=($((native_bs << bs))) 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@21 -- # for bs in "${bss[@]}" 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@22 -- # for qd in "${qds[@]}" 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@23 -- # count=15 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@24 -- # count=15 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@25 -- # size=61440 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@27 -- # gen_bytes 61440 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@98 -- # xtrace_disable 00:41:22.390 12:43:45 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:22.954 12:43:46 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --bs=4096 --qd=1 --json /dev/fd/62 00:41:22.954 12:43:46 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # gen_conf 00:41:22.954 12:43:46 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:22.954 12:43:46 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:23.211 [2024-06-07 12:43:46.611456] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:23.211 [2024-06-07 12:43:46.612086] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233049 ] 00:41:23.211 { 00:41:23.211 "subsystems": [ 00:41:23.211 { 00:41:23.211 "subsystem": "bdev", 00:41:23.211 "config": [ 00:41:23.211 { 00:41:23.211 "params": { 00:41:23.211 "trtype": "pcie", 00:41:23.211 "traddr": "0000:00:10.0", 00:41:23.211 "name": "Nvme0" 00:41:23.211 }, 00:41:23.211 "method": "bdev_nvme_attach_controller" 00:41:23.211 }, 00:41:23.211 { 00:41:23.211 "method": "bdev_wait_for_examine" 00:41:23.211 } 00:41:23.211 ] 00:41:23.211 } 00:41:23.211 ] 00:41:23.211 } 00:41:23.211 [2024-06-07 12:43:46.760146] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:23.468 [2024-06-07 12:43:46.861997] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:24.033  Copying: 60/60 [kB] (average 29 MBps) 00:41:24.033 00:41:24.033 12:43:47 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # gen_conf 00:41:24.033 12:43:47 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=4096 --qd=1 --count=15 --json /dev/fd/62 00:41:24.033 12:43:47 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:24.033 12:43:47 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:24.033 [2024-06-07 12:43:47.482925] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
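Back in dd_bs_lt_native_bs, the exit-status dance a few records up (es=234, then 106, then 1) is the NOT helper confirming that spdk_dd failed as required. A condensed sketch matching the arithmetic shown in the trace (the real helper in autotest_common.sh also validates the executable via valid_exec_arg and maps the status through a case statement):

  NOT() {                                  # succeed iff the wrapped command fails
      local es=0
      "$@" || es=$?
      (( es > 128 )) && es=$(( es - 128 )) # fold signal deaths: 234 -> 106
      (( es != 0 )) && es=1                # collapse any failure to 1
      return $(( !es ))                    # invert: the command failing is a pass
  }
  # usage, as in the trace: the undersized write must be rejected
  NOT spdk_dd --if=<(gen_bytes 4096) --ob=Nvme0n1 --bs=2048 --json <(gen_conf)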
00:41:24.033 [2024-06-07 12:43:47.483703] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233073 ] 00:41:24.033 { 00:41:24.033 "subsystems": [ 00:41:24.033 { 00:41:24.033 "subsystem": "bdev", 00:41:24.033 "config": [ 00:41:24.033 { 00:41:24.033 "params": { 00:41:24.033 "trtype": "pcie", 00:41:24.033 "traddr": "0000:00:10.0", 00:41:24.033 "name": "Nvme0" 00:41:24.033 }, 00:41:24.033 "method": "bdev_nvme_attach_controller" 00:41:24.033 }, 00:41:24.033 { 00:41:24.033 "method": "bdev_wait_for_examine" 00:41:24.033 } 00:41:24.033 ] 00:41:24.033 } 00:41:24.033 ] 00:41:24.033 } 00:41:24.033 [2024-06-07 12:43:47.629531] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:24.331 [2024-06-07 12:43:47.756468] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:24.911  Copying: 60/60 [kB] (average 29 MBps) 00:41:24.911 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@44 -- # diff -q /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@45 -- # clear_nvme Nvme0n1 '' 61440 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@11 -- # local nvme_ref= 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@12 -- # local size=61440 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@14 -- # local bs=1048576 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@15 -- # local count=1 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json /dev/fd/62 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # gen_conf 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:24.911 12:43:48 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:24.911 [2024-06-07 12:43:48.393484] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:24.911 [2024-06-07 12:43:48.394036] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233094 ] 00:41:24.911 { 00:41:24.911 "subsystems": [ 00:41:24.911 { 00:41:24.911 "subsystem": "bdev", 00:41:24.911 "config": [ 00:41:24.911 { 00:41:24.911 "params": { 00:41:24.911 "trtype": "pcie", 00:41:24.911 "traddr": "0000:00:10.0", 00:41:24.911 "name": "Nvme0" 00:41:24.911 }, 00:41:24.911 "method": "bdev_nvme_attach_controller" 00:41:24.911 }, 00:41:24.911 { 00:41:24.911 "method": "bdev_wait_for_examine" 00:41:24.912 } 00:41:24.912 ] 00:41:24.912 } 00:41:24.912 ] 00:41:24.912 } 00:41:24.912 [2024-06-07 12:43:48.540582] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:25.170 [2024-06-07 12:43:48.640814] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:25.685  Copying: 1024/1024 [kB] (average 1000 MBps) 00:41:25.685 00:41:25.685 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@22 -- # for qd in "${qds[@]}" 00:41:25.685 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@23 -- # count=15 00:41:25.685 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@24 -- # count=15 00:41:25.685 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@25 -- # size=61440 00:41:25.685 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@27 -- # gen_bytes 61440 00:41:25.685 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@98 -- # xtrace_disable 00:41:25.685 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:26.617 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --bs=4096 --qd=64 --json /dev/fd/62 00:41:26.617 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # gen_conf 00:41:26.617 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:26.617 12:43:49 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:26.617 [2024-06-07 12:43:49.993651] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
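Each (bs, qd) iteration of dd_rw that the trace is now cycling through is the same four-step roundtrip: generate a payload, write it file-to-bdev, read it back bdev-to-file, byte-compare, then zero the region for the next pass. One iteration sketched as run above (bs=4096, qd=64, count=15; redirecting gen_bytes into the dump file is an assumption about where the payload lands):

  gen_bytes 61440 > "$test_file0"                       # fresh payload
  spdk_dd --if="$test_file0" --ob=Nvme0n1 --bs=4096 --qd=64 --json <(gen_conf)
  spdk_dd --ib=Nvme0n1 --of="$test_file1" --bs=4096 --qd=64 --count=15 --json <(gen_conf)
  diff -q "$test_file0" "$test_file1"                   # payload must survive the trip
  clear_nvme Nvme0n1 '' 61440   # internally: spdk_dd --if=/dev/zero --bs=1048576 --count=1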
00:41:26.617 [2024-06-07 12:43:49.994243] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233121 ] 00:41:26.617 { 00:41:26.617 "subsystems": [ 00:41:26.617 { 00:41:26.617 "subsystem": "bdev", 00:41:26.617 "config": [ 00:41:26.617 { 00:41:26.617 "params": { 00:41:26.617 "trtype": "pcie", 00:41:26.617 "traddr": "0000:00:10.0", 00:41:26.617 "name": "Nvme0" 00:41:26.617 }, 00:41:26.617 "method": "bdev_nvme_attach_controller" 00:41:26.617 }, 00:41:26.617 { 00:41:26.617 "method": "bdev_wait_for_examine" 00:41:26.617 } 00:41:26.617 ] 00:41:26.617 } 00:41:26.617 ] 00:41:26.617 } 00:41:26.617 [2024-06-07 12:43:50.140173] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:26.617 [2024-06-07 12:43:50.241384] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:27.473  Copying: 60/60 [kB] (average 58 MBps) 00:41:27.473 00:41:27.473 12:43:50 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=4096 --qd=64 --count=15 --json /dev/fd/62 00:41:27.473 12:43:50 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # gen_conf 00:41:27.473 12:43:50 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:27.473 12:43:50 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:27.473 [2024-06-07 12:43:50.916464] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:27.473 [2024-06-07 12:43:50.917153] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233139 ] 00:41:27.473 { 00:41:27.473 "subsystems": [ 00:41:27.473 { 00:41:27.473 "subsystem": "bdev", 00:41:27.473 "config": [ 00:41:27.473 { 00:41:27.473 "params": { 00:41:27.473 "trtype": "pcie", 00:41:27.473 "traddr": "0000:00:10.0", 00:41:27.473 "name": "Nvme0" 00:41:27.473 }, 00:41:27.473 "method": "bdev_nvme_attach_controller" 00:41:27.473 }, 00:41:27.473 { 00:41:27.473 "method": "bdev_wait_for_examine" 00:41:27.473 } 00:41:27.473 ] 00:41:27.473 } 00:41:27.473 ] 00:41:27.473 } 00:41:27.473 [2024-06-07 12:43:51.065864] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:27.731 [2024-06-07 12:43:51.165987] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:28.297  Copying: 60/60 [kB] (average 58 MBps) 00:41:28.297 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@44 -- # diff -q /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@45 -- # clear_nvme Nvme0n1 '' 61440 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@11 -- # local nvme_ref= 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@12 -- # local size=61440 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@14 -- # local bs=1048576 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@15 -- # local count=1 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # 
gen_conf 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json /dev/fd/62 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:28.297 12:43:51 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:28.297 [2024-06-07 12:43:51.805816] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:28.297 [2024-06-07 12:43:51.806535] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233150 ] 00:41:28.297 { 00:41:28.297 "subsystems": [ 00:41:28.297 { 00:41:28.297 "subsystem": "bdev", 00:41:28.297 "config": [ 00:41:28.297 { 00:41:28.297 "params": { 00:41:28.297 "trtype": "pcie", 00:41:28.297 "traddr": "0000:00:10.0", 00:41:28.297 "name": "Nvme0" 00:41:28.297 }, 00:41:28.297 "method": "bdev_nvme_attach_controller" 00:41:28.297 }, 00:41:28.297 { 00:41:28.297 "method": "bdev_wait_for_examine" 00:41:28.297 } 00:41:28.297 ] 00:41:28.297 } 00:41:28.297 ] 00:41:28.297 } 00:41:28.555 [2024-06-07 12:43:51.953932] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:28.555 [2024-06-07 12:43:52.065151] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:29.069  Copying: 1024/1024 [kB] (average 500 MBps) 00:41:29.069 00:41:29.069 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@21 -- # for bs in "${bss[@]}" 00:41:29.069 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@22 -- # for qd in "${qds[@]}" 00:41:29.070 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@23 -- # count=7 00:41:29.070 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@24 -- # count=7 00:41:29.070 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@25 -- # size=57344 00:41:29.070 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@27 -- # gen_bytes 57344 00:41:29.070 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@98 -- # xtrace_disable 00:41:29.070 12:43:52 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:30.033 12:43:53 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --bs=8192 --qd=1 --json /dev/fd/62 00:41:30.033 12:43:53 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # gen_conf 00:41:30.033 12:43:53 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:30.033 12:43:53 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:30.033 { 00:41:30.033 "subsystems": [ 00:41:30.033 { 00:41:30.033 "subsystem": "bdev", 00:41:30.033 "config": [ 00:41:30.033 { 00:41:30.033 "params": { 00:41:30.033 "trtype": "pcie", 00:41:30.033 "traddr": "0000:00:10.0", 00:41:30.033 "name": "Nvme0" 00:41:30.033 }, 00:41:30.033 "method": "bdev_nvme_attach_controller" 00:41:30.033 }, 00:41:30.033 { 00:41:30.033 "method": "bdev_wait_for_examine" 00:41:30.033 } 00:41:30.033 ] 00:41:30.033 } 00:41:30.033 ] 00:41:30.033 } 00:41:30.033 [2024-06-07 12:43:53.336801] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
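The block sizes and counts now changing in the trace come from the matrix set up at the top of dd_rw: three block sizes derived by left-shifting the native 4096, crossed with queue depths 1 and 64, with count sized so each pass moves at most 60 KiB. Reconstructed from the loop headers and the count=15/7/3, size=61440/57344/49152 values in the records:

  native_bs=4096
  qds=(1 64)
  bss=()
  for bs in {0..2}; do bss+=( $(( native_bs << bs )) ); done   # 4096 8192 16384
  for bs in "${bss[@]}"; do
      for qd in "${qds[@]}"; do
          count=$(( 61440 / bs ))       # 15, 7, 3
          size=$(( count * bs ))        # 61440, 57344, 49152
          echo "bs=$bs qd=$qd count=$count size=$size"  # one write/read/diff cycle per cell
      done
  done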
00:41:30.033 [2024-06-07 12:43:53.337507] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233177 ] 00:41:30.033 [2024-06-07 12:43:53.489089] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:30.033 [2024-06-07 12:43:53.607786] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:30.855  Copying: 56/56 [kB] (average 27 MBps) 00:41:30.855 00:41:30.855 12:43:54 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=8192 --qd=1 --count=7 --json /dev/fd/62 00:41:30.855 12:43:54 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # gen_conf 00:41:30.855 12:43:54 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:30.855 12:43:54 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:30.855 [2024-06-07 12:43:54.238386] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:30.855 [2024-06-07 12:43:54.238910] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233197 ] 00:41:30.855 { 00:41:30.855 "subsystems": [ 00:41:30.855 { 00:41:30.855 "subsystem": "bdev", 00:41:30.855 "config": [ 00:41:30.855 { 00:41:30.855 "params": { 00:41:30.855 "trtype": "pcie", 00:41:30.855 "traddr": "0000:00:10.0", 00:41:30.855 "name": "Nvme0" 00:41:30.855 }, 00:41:30.855 "method": "bdev_nvme_attach_controller" 00:41:30.855 }, 00:41:30.855 { 00:41:30.855 "method": "bdev_wait_for_examine" 00:41:30.855 } 00:41:30.855 ] 00:41:30.855 } 00:41:30.855 ] 00:41:30.855 } 00:41:30.855 [2024-06-07 12:43:54.380357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:30.855 [2024-06-07 12:43:54.494726] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:31.678  Copying: 56/56 [kB] (average 54 MBps) 00:41:31.678 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@44 -- # diff -q /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@45 -- # clear_nvme Nvme0n1 '' 57344 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@11 -- # local nvme_ref= 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@12 -- # local size=57344 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@14 -- # local bs=1048576 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@15 -- # local count=1 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json /dev/fd/62 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # gen_conf 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:31.678 12:43:55 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:31.678 [2024-06-07 12:43:55.179748] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 
22.11.4 initialization... 00:41:31.678 [2024-06-07 12:43:55.181091] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233217 ] 00:41:31.678 { 00:41:31.678 "subsystems": [ 00:41:31.678 { 00:41:31.678 "subsystem": "bdev", 00:41:31.678 "config": [ 00:41:31.678 { 00:41:31.678 "params": { 00:41:31.678 "trtype": "pcie", 00:41:31.678 "traddr": "0000:00:10.0", 00:41:31.678 "name": "Nvme0" 00:41:31.678 }, 00:41:31.678 "method": "bdev_nvme_attach_controller" 00:41:31.678 }, 00:41:31.678 { 00:41:31.678 "method": "bdev_wait_for_examine" 00:41:31.678 } 00:41:31.678 ] 00:41:31.678 } 00:41:31.678 ] 00:41:31.678 } 00:41:31.935 [2024-06-07 12:43:55.330579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:31.935 [2024-06-07 12:43:55.451137] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:32.449  Copying: 1024/1024 [kB] (average 1000 MBps) 00:41:32.449 00:41:32.449 12:43:56 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@22 -- # for qd in "${qds[@]}" 00:41:32.449 12:43:56 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@23 -- # count=7 00:41:32.449 12:43:56 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@24 -- # count=7 00:41:32.449 12:43:56 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@25 -- # size=57344 00:41:32.449 12:43:56 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@27 -- # gen_bytes 57344 00:41:32.449 12:43:56 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@98 -- # xtrace_disable 00:41:32.449 12:43:56 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:33.452 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --bs=8192 --qd=64 --json /dev/fd/62 00:41:33.452 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # gen_conf 00:41:33.452 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:33.452 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:33.452 [2024-06-07 12:43:57.096533] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:33.452 [2024-06-07 12:43:57.097217] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233245 ] 00:41:33.710 { 00:41:33.710 "subsystems": [ 00:41:33.710 { 00:41:33.710 "subsystem": "bdev", 00:41:33.710 "config": [ 00:41:33.710 { 00:41:33.710 "params": { 00:41:33.710 "trtype": "pcie", 00:41:33.710 "traddr": "0000:00:10.0", 00:41:33.710 "name": "Nvme0" 00:41:33.710 }, 00:41:33.710 "method": "bdev_nvme_attach_controller" 00:41:33.710 }, 00:41:33.710 { 00:41:33.710 "method": "bdev_wait_for_examine" 00:41:33.710 } 00:41:33.710 ] 00:41:33.710 } 00:41:33.710 ] 00:41:33.710 } 00:41:33.710 [2024-06-07 12:43:57.249487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:33.710 [2024-06-07 12:43:57.354245] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:34.532  Copying: 56/56 [kB] (average 54 MBps) 00:41:34.532 00:41:34.532 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=8192 --qd=64 --count=7 --json /dev/fd/62 00:41:34.532 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # gen_conf 00:41:34.532 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:34.532 12:43:57 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:34.532 [2024-06-07 12:43:57.984855] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:34.532 [2024-06-07 12:43:57.986033] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233257 ] 00:41:34.532 { 00:41:34.532 "subsystems": [ 00:41:34.532 { 00:41:34.532 "subsystem": "bdev", 00:41:34.532 "config": [ 00:41:34.532 { 00:41:34.532 "params": { 00:41:34.532 "trtype": "pcie", 00:41:34.532 "traddr": "0000:00:10.0", 00:41:34.532 "name": "Nvme0" 00:41:34.532 }, 00:41:34.532 "method": "bdev_nvme_attach_controller" 00:41:34.532 }, 00:41:34.532 { 00:41:34.532 "method": "bdev_wait_for_examine" 00:41:34.532 } 00:41:34.532 ] 00:41:34.532 } 00:41:34.532 ] 00:41:34.532 } 00:41:34.532 [2024-06-07 12:43:58.137020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:34.795 [2024-06-07 12:43:58.255346] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:35.335  Copying: 56/56 [kB] (average 54 MBps) 00:41:35.335 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@44 -- # diff -q /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@45 -- # clear_nvme Nvme0n1 '' 57344 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@11 -- # local nvme_ref= 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@12 -- # local size=57344 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@14 -- # local bs=1048576 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@15 -- # local count=1 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json /dev/fd/62 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # gen_conf 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:35.335 12:43:58 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:35.335 [2024-06-07 12:43:58.923718] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:35.335 [2024-06-07 12:43:58.924314] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233274 ] 00:41:35.335 { 00:41:35.335 "subsystems": [ 00:41:35.335 { 00:41:35.335 "subsystem": "bdev", 00:41:35.335 "config": [ 00:41:35.335 { 00:41:35.335 "params": { 00:41:35.335 "trtype": "pcie", 00:41:35.335 "traddr": "0000:00:10.0", 00:41:35.335 "name": "Nvme0" 00:41:35.335 }, 00:41:35.335 "method": "bdev_nvme_attach_controller" 00:41:35.335 }, 00:41:35.335 { 00:41:35.335 "method": "bdev_wait_for_examine" 00:41:35.335 } 00:41:35.335 ] 00:41:35.335 } 00:41:35.335 ] 00:41:35.335 } 00:41:35.592 [2024-06-07 12:43:59.067092] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:35.592 [2024-06-07 12:43:59.176387] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:36.414  Copying: 1024/1024 [kB] (average 500 MBps) 00:41:36.414 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@21 -- # for bs in "${bss[@]}" 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@22 -- # for qd in "${qds[@]}" 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@23 -- # count=3 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@24 -- # count=3 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@25 -- # size=49152 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@27 -- # gen_bytes 49152 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@98 -- # xtrace_disable 00:41:36.414 12:43:59 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:36.979 12:44:00 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --bs=16384 --qd=1 --json /dev/fd/62 00:41:36.979 12:44:00 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # gen_conf 00:41:36.979 12:44:00 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:36.979 12:44:00 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:36.979 [2024-06-07 12:44:00.417809] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
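Throughout these runs neither the JSON config nor the generated payload ever touches a named file: both arrive over anonymous descriptors, which is why the invocations read --json /dev/fd/62 or --if=/dev/fd/62. bash process substitution does the plumbing; the clear_nvme step above, for example, amounts to:

  # <(gen_conf) expands to a /dev/fd/NN path holding the generated JSON
  spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json <(gen_conf)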
00:41:36.979 [2024-06-07 12:44:00.418345] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233301 ] 00:41:36.979 { 00:41:36.979 "subsystems": [ 00:41:36.979 { 00:41:36.979 "subsystem": "bdev", 00:41:36.979 "config": [ 00:41:36.979 { 00:41:36.979 "params": { 00:41:36.979 "trtype": "pcie", 00:41:36.979 "traddr": "0000:00:10.0", 00:41:36.979 "name": "Nvme0" 00:41:36.979 }, 00:41:36.979 "method": "bdev_nvme_attach_controller" 00:41:36.979 }, 00:41:36.979 { 00:41:36.979 "method": "bdev_wait_for_examine" 00:41:36.979 } 00:41:36.979 ] 00:41:36.979 } 00:41:36.979 ] 00:41:36.979 } 00:41:36.979 [2024-06-07 12:44:00.557452] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:37.237 [2024-06-07 12:44:00.666208] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:37.803  Copying: 48/48 [kB] (average 46 MBps) 00:41:37.803 00:41:37.803 12:44:01 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # gen_conf 00:41:37.803 12:44:01 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=16384 --qd=1 --count=3 --json /dev/fd/62 00:41:37.803 12:44:01 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:37.803 12:44:01 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:37.803 [2024-06-07 12:44:01.313957] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:37.803 [2024-06-07 12:44:01.314628] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233321 ] 00:41:37.803 { 00:41:37.803 "subsystems": [ 00:41:37.803 { 00:41:37.803 "subsystem": "bdev", 00:41:37.803 "config": [ 00:41:37.803 { 00:41:37.803 "params": { 00:41:37.803 "trtype": "pcie", 00:41:37.803 "traddr": "0000:00:10.0", 00:41:37.803 "name": "Nvme0" 00:41:37.803 }, 00:41:37.803 "method": "bdev_nvme_attach_controller" 00:41:37.803 }, 00:41:37.803 { 00:41:37.803 "method": "bdev_wait_for_examine" 00:41:37.803 } 00:41:37.803 ] 00:41:37.803 } 00:41:37.803 ] 00:41:37.803 } 00:41:38.061 [2024-06-07 12:44:01.455677] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:38.061 [2024-06-07 12:44:01.550371] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:38.577  Copying: 48/48 [kB] (average 46 MBps) 00:41:38.577 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@44 -- # diff -q /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@45 -- # clear_nvme Nvme0n1 '' 49152 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@11 -- # local nvme_ref= 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@12 -- # local size=49152 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@14 -- # local bs=1048576 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@15 -- # local count=1 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json /dev/fd/62 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # gen_conf 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:38.577 12:44:02 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:38.577 [2024-06-07 12:44:02.189611] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:38.577 [2024-06-07 12:44:02.190092] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233341 ] 00:41:38.577 { 00:41:38.577 "subsystems": [ 00:41:38.577 { 00:41:38.577 "subsystem": "bdev", 00:41:38.577 "config": [ 00:41:38.578 { 00:41:38.578 "params": { 00:41:38.578 "trtype": "pcie", 00:41:38.578 "traddr": "0000:00:10.0", 00:41:38.578 "name": "Nvme0" 00:41:38.578 }, 00:41:38.578 "method": "bdev_nvme_attach_controller" 00:41:38.578 }, 00:41:38.578 { 00:41:38.578 "method": "bdev_wait_for_examine" 00:41:38.578 } 00:41:38.578 ] 00:41:38.578 } 00:41:38.578 ] 00:41:38.578 } 00:41:38.835 [2024-06-07 12:44:02.331001] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:38.835 [2024-06-07 12:44:02.422870] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:39.351  Copying: 1024/1024 [kB] (average 1000 MBps) 00:41:39.351 00:41:39.614 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@22 -- # for qd in "${qds[@]}" 00:41:39.614 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@23 -- # count=3 00:41:39.614 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@24 -- # count=3 00:41:39.614 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@25 -- # size=49152 00:41:39.614 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@27 -- # gen_bytes 49152 00:41:39.614 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@98 -- # xtrace_disable 00:41:39.614 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:40.177 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --bs=16384 --qd=64 --json /dev/fd/62 00:41:40.177 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@30 -- # gen_conf 00:41:40.177 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:40.177 12:44:03 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:40.177 [2024-06-07 12:44:03.603701] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:40.177 [2024-06-07 12:44:03.605123] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233361 ] 00:41:40.177 { 00:41:40.177 "subsystems": [ 00:41:40.177 { 00:41:40.177 "subsystem": "bdev", 00:41:40.177 "config": [ 00:41:40.177 { 00:41:40.177 "params": { 00:41:40.177 "trtype": "pcie", 00:41:40.177 "traddr": "0000:00:10.0", 00:41:40.177 "name": "Nvme0" 00:41:40.177 }, 00:41:40.177 "method": "bdev_nvme_attach_controller" 00:41:40.177 }, 00:41:40.177 { 00:41:40.177 "method": "bdev_wait_for_examine" 00:41:40.177 } 00:41:40.177 ] 00:41:40.177 } 00:41:40.177 ] 00:41:40.177 } 00:41:40.177 [2024-06-07 12:44:03.748034] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:40.435 [2024-06-07 12:44:03.865601] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:41.000  Copying: 48/48 [kB] (average 46 MBps) 00:41:41.000 00:41:41.000 12:44:04 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=16384 --qd=64 --count=3 --json /dev/fd/62 00:41:41.000 12:44:04 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@37 -- # gen_conf 00:41:41.000 12:44:04 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:41.000 12:44:04 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:41.000 [2024-06-07 12:44:04.525770] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:41.000 [2024-06-07 12:44:04.526995] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233377 ] 00:41:41.000 { 00:41:41.000 "subsystems": [ 00:41:41.000 { 00:41:41.000 "subsystem": "bdev", 00:41:41.000 "config": [ 00:41:41.000 { 00:41:41.000 "params": { 00:41:41.000 "trtype": "pcie", 00:41:41.000 "traddr": "0000:00:10.0", 00:41:41.000 "name": "Nvme0" 00:41:41.000 }, 00:41:41.000 "method": "bdev_nvme_attach_controller" 00:41:41.000 }, 00:41:41.000 { 00:41:41.000 "method": "bdev_wait_for_examine" 00:41:41.000 } 00:41:41.000 ] 00:41:41.000 } 00:41:41.000 ] 00:41:41.000 } 00:41:41.257 [2024-06-07 12:44:04.667898] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:41.257 [2024-06-07 12:44:04.759899] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:41.773  Copying: 48/48 [kB] (average 46 MBps) 00:41:41.773 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@44 -- # diff -q /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/basic_rw.sh@45 -- # clear_nvme Nvme0n1 '' 49152 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@11 -- # local nvme_ref= 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@12 -- # local size=49152 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@14 -- # local bs=1048576 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@15 -- # local count=1 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json /dev/fd/62 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@18 -- # gen_conf 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:41.773 12:44:05 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:42.040 [2024-06-07 12:44:05.430266] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:42.040 [2024-06-07 12:44:05.430557] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233398 ] 00:41:42.040 { 00:41:42.040 "subsystems": [ 00:41:42.040 { 00:41:42.040 "subsystem": "bdev", 00:41:42.040 "config": [ 00:41:42.040 { 00:41:42.040 "params": { 00:41:42.040 "trtype": "pcie", 00:41:42.040 "traddr": "0000:00:10.0", 00:41:42.040 "name": "Nvme0" 00:41:42.040 }, 00:41:42.040 "method": "bdev_nvme_attach_controller" 00:41:42.040 }, 00:41:42.040 { 00:41:42.041 "method": "bdev_wait_for_examine" 00:41:42.041 } 00:41:42.041 ] 00:41:42.041 } 00:41:42.041 ] 00:41:42.041 } 00:41:42.041 [2024-06-07 12:44:05.581947] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:42.041 [2024-06-07 12:44:05.679181] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:42.869  Copying: 1024/1024 [kB] (average 1000 MBps) 00:41:42.869 00:41:42.869 00:41:42.869 real 0m20.296s 00:41:42.869 user 0m13.453s 00:41:42.869 sys 0m5.520s 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw -- common/autotest_common.sh@10 -- # set +x 00:41:42.869 ************************************ 00:41:42.869 END TEST dd_rw 00:41:42.869 ************************************ 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@104 -- # run_test dd_rw_offset basic_offset 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@10 -- # set +x 00:41:42.869 ************************************ 00:41:42.869 START TEST dd_rw_offset 00:41:42.869 ************************************ 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- common/autotest_common.sh@1124 -- # basic_offset 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@52 -- # local count seek skip data data_check 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@54 -- # gen_bytes 4096 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/common.sh@98 -- # xtrace_disable 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- common/autotest_common.sh@10 -- # set +x 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@55 -- # (( count = seek = skip = 1 )) 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@56 -- # 
data=ttozfo95fqoks5xbw5xowtjj23xrp7a2fdhlhnvho6a7qqa5acvmda036r7z0aj232gl8ihl6e6q843n24v9ajwm9v2x2sjwzgiz6lm9b6aolql60d5sqvdxe23daof93rxtzxsx7vbdfy14ignc5xsnotrg61zem79c5d25hmt4apca0xfpqyjgc0hrn7k3lhohvtqrg34paahuu7bratmdgi63yu0u974ffmnjzpheth19avcfsn0q3unk01a5nxnm16vr988moo4w0pkb5mryzwczecyh4yi63ffyykhdp2calj2yptimdev1aloeiy51djnlttbd1xg59z8p1wqzv2hp78kh9mvew0218gjfp2fc24vy9rt992wkwrsz78gn43tremql3zm6nb0n9vzzjfu8n0rpyusruemqro0abw08k7lgtk632qlcczd8c77bk89rfhlwaw2wqp3o0zub25ephgiaapmalo5ve33yhy8b5ipp9emnxzmixjoor54jlpk4k2y6pxnjothg94o23gnb11k16l4szshh1ebzacnnv50xlmteg1cryhsrllpe8q4tw6z3kk2kl26ivc1zkrn3zeqpz9w2obly38yx7vyrtu0sgcsi12p9dijlsh92okp32ezrtan9q1gqtdazsro55mvxugqiqnbj6ehgwj3kd9be0ih6d0ccrxxqj55u5sf1j20kh2uwuz5iagvuo1xb0mcytafny0w6v3lywu9l903cevmi020gt5ijjyj03wszpyl111oc4dxh735nzyd0sqrrep08tez1ef4poiz3o0kk7zgx4vhgyxgrnxr0lpo7cpf3ldtuk7qlu88lqxvmyc0ckm8gwf5ls44qwip6y8n79efm4pcduyisn16pemxy8vd8genipblfs9zhh97y697io8y93k3vu45wrix4y4ny703jfldhx4ziwnz7429uvqr5k4385tqg0jynjn9qaiuu4kj7bimyzi56lmwgz5593d87jb8ril08eo8nfsoqz654th8i8s2w1jtxysgtc1ekziq8pwtr4f37cydv0ggyxfc4rmrb8vwm9fnup5imz6uoqhplknqwpmaovk77ononm3atjm19xuzvuo8hj6klazdmd117j0g609jbu4ok8qw4n9do9dqtawhw8fk950b390kw24ym1ju1mv3ysmqsqd27a3ja6bcizxffx7w21xcwkdq3c0yezrbeyljy7lkatnu861eew7quxw1o2gc4m3ijcnys182vzu540l1o8ns11z86qh9w9b2k3zboqwc0i5pjj69cmgcscaddxllljjwulz2f61c8smlwy20tnhj0r0gmg7gkx5phc8rvhnkm4zu1gq8lvc37tqijkabasmjbf9ye4sw1af4bqtveydxr4ezqyma4pytgrjxnovzbqqvrpinzt0k66rqkpcttj11i5l9e5yjoedobnpgziugct6g5ctvuo8cpie0n7lrdrkxjpgkcomroebaxmkr0ibpbaweip3y72zhjmzt51ajs45x2uu3ag01k65mebtgexazldrnllnr93swv5dh62dah90g3d4bfzmateueu9g3ejx41m51krinz544izjnc8zp3wai2ggl10dvldv5o7qqy29kax8z8t1ys695xx0sk06csklgfqgh52fzz2ubzuljb3c7pox18cwxer909uu5ou22vmqqqt5g7n0dnmagasm2abqncpi2qoobm5t4k60zf5tjntea1rjpakla3ee2swoblicqeunlamzb0pdemtc294wwsup8m3cx1bghdefpxazcpr4328pmcmy8jagquq0t7vxhgedv4oit0rp7g3jnqc2nz0oq2zy981e28q1o8m90kokujj0jtau4k6wwkokl0f2o3sp8s17kfntbweq9w1yi3xhdvcm84y8uj5eetjheq5ks294cadajacsdrl9649jwnyo8x7p2i3ef6e0sii8o3hxeedt984fxojlpqhdscwt7fphpmotc1kh9ffh7bnxvpp2hu9h46mq9fezu9q3jykis416cg501ek5eg2be1lyfznnzaaaf77a2pybxqj3m0rippohhfdh6q6m7is9dy9rk2ot51au7iyfnpzyn6kzx9xqawujvyv5cqvhz9mpbdggw9aq1cncfxq6hrexri2do30vdi5ncd4n4rjyva10o2rlj4qptuxffbytfdmfxvsimop9ip4cth399fyzw1foj0hkzdv23tnza95k9aqzrsy8nhi40s43cljt0mu413ab3dvdyjfkspnpesthxy73avrq3u6a5b9lqkdixhnd43yr6xkrclbxbqplmntdangs6konllov5h4bg8597g2qvnu5h3htecn7tz9ef6p7nvm3hk8iutb5bbvsi77dbqo0vw4x6to43x94338nzwpkn6mso0qsqtav3twd7yi6x9id3xhglt1kxa5zjy0g8bqbfx4j989150va7ujbl544gq0era5yb3rbupea89rtoprygj1ir7k2jco94gofly81mszm6w89j2xfuze16r96ekdi2ujg9oyquelvgj2y52t9ds37f4rzezwo7x8o7cjynvubagolbgvc4y4m3coc7upigf4e4f6iuy0q0lgh720snc0j3qxfisr9r5bsl0a7rggeiro3akuy5r4l8ui9ovtzwjme8p394j6ogrqeyhy8lkxce113p4en04zuod6xd9zomtuk7jph6erh6x4to73cxc80xkx5zwjisqyme57igkfrxkic6pgcp5f3ht16zo2pkik73ukx234r9whg4peu3y4dhb3pem2jyd88nxr9o7jeltuffvotnpumkpf8rcwhxy55waddc60b2k8c6pdvwm2udce05ju20voylgyk03102h55x9yfwlhgo8qut0whf8d12petad9p8rw8jeso0up547tfjou1j0ddux8tfrua68dqvry23q8dsqfg703715qcix67r795pqen705y9z1jesfe1o6iiqpmtihayrauacvr53em7tde1tfog83dvd4rupwmxx6vk0tdwa39hquc6ny0ah4xy1ezz66z7q6no73rtlkbffoitd5bjhhg5l6jxdy3z4sfnf3ox9taoji1amhg85tlwbejfkh8e1f5lbe0ce2w5mv21vtqg52q6kpb09w9ptg4z99thk4fealu1y0agege21ledh9m47dagycey0ijt3ctuwj2boc2ip2ps1zwydnrgfnbqammvnp1a4b6aewib0gd3ojnizw5ergjd0pr9ytewt69ulu7dik3eyqevtbduseyr08932i3nbf0bhgla8jn9ca42jvm91ax91fx43p8v6lka3nkqgxrsyggodijdrnnqcwqqjpmlenpieimzotq77yt29qvaiz4wwexff1a3snrwh8wefauo36bkrg5z3wtmaoyo682qjvr5d2r0fypfhq6pd6tym63o5fu9t9k2v79n09amtk4vu01ywofj2x6f4t19y8a
h7tsbfyfp9w8evrnsgq1eurzhevjpv20zq4z1g5rhjo120rhuibs5ybx29hao5lihs0rw6qjq0ixfjiaabbg4ki9p80jwdvkhwtt5xw5y6qav7pw3yncuejutccwg15wfnwsbo1vgjftwhc5pfpuvkzy97ywp6elaxkmra7l1z2bxomzdwvvmv0j4onkzbwbntlr6meu209uanj25qi40nfu8q3pyqe0q108wlfx3e8opkk98lphckyhzbr25bf3eb494t6py3ulgc0itj7crt2ggo7bfz0mhzmh4r6r689f3wxtoo6onrb3utn9tu2g9t5fiozr731g8klcmrue6fl0y37w6hgtzbfz7pvf662qbzfmbhb5t2hiez0yp16baqh6q4agx3ev8ijmzezodc3v5ux9ofqjiyd2n4e7e75kolnbh01uk4dpl92v0asq0y3zsrdd3xblf40svwfb6e8s6m0g7a5ltse56rlefs3ehxamt9sajm422iorqw2ccdij52nuyjrv0nq4yet697iwcjqi6hajoq 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@59 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --seek=1 --json /dev/fd/62 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@59 -- # gen_conf 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/common.sh@31 -- # xtrace_disable 00:41:42.869 12:44:06 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- common/autotest_common.sh@10 -- # set +x 00:41:42.869 [2024-06-07 12:44:06.447752] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:42.869 [2024-06-07 12:44:06.448562] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233439 ] 00:41:42.869 { 00:41:42.869 "subsystems": [ 00:41:42.869 { 00:41:42.869 "subsystem": "bdev", 00:41:42.869 "config": [ 00:41:42.869 { 00:41:42.869 "params": { 00:41:42.869 "trtype": "pcie", 00:41:42.869 "traddr": "0000:00:10.0", 00:41:42.869 "name": "Nvme0" 00:41:42.869 }, 00:41:42.869 "method": "bdev_nvme_attach_controller" 00:41:42.870 }, 00:41:42.870 { 00:41:42.870 "method": "bdev_wait_for_examine" 00:41:42.870 } 00:41:42.870 ] 00:41:42.870 } 00:41:42.870 ] 00:41:42.870 } 00:41:43.128 [2024-06-07 12:44:06.591140] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:43.128 [2024-06-07 12:44:06.687917] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:43.644  Copying: 4096/4096 [B] (average 4000 kBps) 00:41:43.644 00:41:43.903 12:44:07 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@65 -- # gen_conf 00:41:43.903 12:44:07 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@65 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --skip=1 --count=1 --json /dev/fd/62 00:41:43.903 12:44:07 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/common.sh@31 -- # xtrace_disable 00:41:43.903 12:44:07 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- common/autotest_common.sh@10 -- # set +x 00:41:43.903 [2024-06-07 12:44:07.340566] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:43.903 { 00:41:43.903 "subsystems": [ 00:41:43.903 { 00:41:43.903 "subsystem": "bdev", 00:41:43.903 "config": [ 00:41:43.903 { 00:41:43.903 "params": { 00:41:43.903 "trtype": "pcie", 00:41:43.903 "traddr": "0000:00:10.0", 00:41:43.903 "name": "Nvme0" 00:41:43.903 }, 00:41:43.903 "method": "bdev_nvme_attach_controller" 00:41:43.903 }, 00:41:43.903 { 00:41:43.903 "method": "bdev_wait_for_examine" 00:41:43.903 } 00:41:43.903 ] 00:41:43.903 } 00:41:43.903 ] 00:41:43.903 } 00:41:43.903 [2024-06-07 12:44:07.341065] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233457 ] 00:41:43.903 [2024-06-07 12:44:07.499666] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:44.160 [2024-06-07 12:44:07.605459] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:44.677  Copying: 4096/4096 [B] (average 4000 kBps) 00:41:44.677 00:41:44.677 12:44:08 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@71 -- # read -rn4096 data_check 00:41:44.677 12:44:08 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- dd/basic_rw.sh@72 -- # [[ ttozfo95fqoks5xbw5xowtjj23xrp7a2fdhlhnvho6a7qqa5acvmda036r7z0aj232gl8ihl6e6q843n24v9ajwm9v2x2sjwzgiz6lm9b6aolql60d5sqvdxe23daof93rxtzxsx7vbdfy14ignc5xsnotrg61zem79c5d25hmt4apca0xfpqyjgc0hrn7k3lhohvtqrg34paahuu7bratmdgi63yu0u974ffmnjzpheth19avcfsn0q3unk01a5nxnm16vr988moo4w0pkb5mryzwczecyh4yi63ffyykhdp2calj2yptimdev1aloeiy51djnlttbd1xg59z8p1wqzv2hp78kh9mvew0218gjfp2fc24vy9rt992wkwrsz78gn43tremql3zm6nb0n9vzzjfu8n0rpyusruemqro0abw08k7lgtk632qlcczd8c77bk89rfhlwaw2wqp3o0zub25ephgiaapmalo5ve33yhy8b5ipp9emnxzmixjoor54jlpk4k2y6pxnjothg94o23gnb11k16l4szshh1ebzacnnv50xlmteg1cryhsrllpe8q4tw6z3kk2kl26ivc1zkrn3zeqpz9w2obly38yx7vyrtu0sgcsi12p9dijlsh92okp32ezrtan9q1gqtdazsro55mvxugqiqnbj6ehgwj3kd9be0ih6d0ccrxxqj55u5sf1j20kh2uwuz5iagvuo1xb0mcytafny0w6v3lywu9l903cevmi020gt5ijjyj03wszpyl111oc4dxh735nzyd0sqrrep08tez1ef4poiz3o0kk7zgx4vhgyxgrnxr0lpo7cpf3ldtuk7qlu88lqxvmyc0ckm8gwf5ls44qwip6y8n79efm4pcduyisn16pemxy8vd8genipblfs9zhh97y697io8y93k3vu45wrix4y4ny703jfldhx4ziwnz7429uvqr5k4385tqg0jynjn9qaiuu4kj7bimyzi56lmwgz5593d87jb8ril08eo8nfsoqz654th8i8s2w1jtxysgtc1ekziq8pwtr4f37cydv0ggyxfc4rmrb8vwm9fnup5imz6uoqhplknqwpmaovk77ononm3atjm19xuzvuo8hj6klazdmd117j0g609jbu4ok8qw4n9do9dqtawhw8fk950b390kw24ym1ju1mv3ysmqsqd27a3ja6bcizxffx7w21xcwkdq3c0yezrbeyljy7lkatnu861eew7quxw1o2gc4m3ijcnys182vzu540l1o8ns11z86qh9w9b2k3zboqwc0i5pjj69cmgcscaddxllljjwulz2f61c8smlwy20tnhj0r0gmg7gkx5phc8rvhnkm4zu1gq8lvc37tqijkabasmjbf9ye4sw1af4bqtveydxr4ezqyma4pytgrjxnovzbqqvrpinzt0k66rqkpcttj11i5l9e5yjoedobnpgziugct6g5ctvuo8cpie0n7lrdrkxjpgkcomroebaxmkr0ibpbaweip3y72zhjmzt51ajs45x2uu3ag01k65mebtgexazldrnllnr93swv5dh62dah90g3d4bfzmateueu9g3ejx41m51krinz544izjnc8zp3wai2ggl10dvldv5o7qqy29kax8z8t1ys695xx0sk06csklgfqgh52fzz2ubzuljb3c7pox18cwxer909uu5ou22vmqqqt5g7n0dnmagasm2abqncpi2qoobm5t4k60zf5tjntea1rjpakla3ee2swoblicqeunlamzb0pdemtc294wwsup8m3cx1bghdefpxazcpr4328pmcmy8jagquq0t7vxhgedv4oit0rp7g3jnqc2nz0oq2zy981e28q1o8m90kokujj0jtau4k6wwkokl0f2o3sp8s17kfntbweq9w1yi3xhdvcm84y8uj5eetjheq5ks294cadajacsdrl9649jwnyo8x7p2i3ef6e0sii8o3hxeedt984fxojlpqhdscwt7fphpmotc1kh9ffh7bnxvpp2hu9h46mq9fezu9q3jykis416cg501ek5eg2be1lyfznnzaaaf77a2pybxqj3m0rippohhfdh6q6m7is9dy9rk2ot51au7iyfnpzyn6kzx9xqawujvyv5cqvhz9mpbdggw9aq1cncfxq6hrexri2do30vdi5ncd4n4rjyva10o2rlj4qptuxffbytfdmfxvsimop9ip4cth399fyzw1foj0hkzdv23tnza95k9aqzrsy8nhi40s43cljt0mu413ab3dv
dyjfkspnpesthxy73avrq3u6a5b9lqkdixhnd43yr6xkrclbxbqplmntdangs6konllov5h4bg8597g2qvnu5h3htecn7tz9ef6p7nvm3hk8iutb5bbvsi77dbqo0vw4x6to43x94338nzwpkn6mso0qsqtav3twd7yi6x9id3xhglt1kxa5zjy0g8bqbfx4j989150va7ujbl544gq0era5yb3rbupea89rtoprygj1ir7k2jco94gofly81mszm6w89j2xfuze16r96ekdi2ujg9oyquelvgj2y52t9ds37f4rzezwo7x8o7cjynvubagolbgvc4y4m3coc7upigf4e4f6iuy0q0lgh720snc0j3qxfisr9r5bsl0a7rggeiro3akuy5r4l8ui9ovtzwjme8p394j6ogrqeyhy8lkxce113p4en04zuod6xd9zomtuk7jph6erh6x4to73cxc80xkx5zwjisqyme57igkfrxkic6pgcp5f3ht16zo2pkik73ukx234r9whg4peu3y4dhb3pem2jyd88nxr9o7jeltuffvotnpumkpf8rcwhxy55waddc60b2k8c6pdvwm2udce05ju20voylgyk03102h55x9yfwlhgo8qut0whf8d12petad9p8rw8jeso0up547tfjou1j0ddux8tfrua68dqvry23q8dsqfg703715qcix67r795pqen705y9z1jesfe1o6iiqpmtihayrauacvr53em7tde1tfog83dvd4rupwmxx6vk0tdwa39hquc6ny0ah4xy1ezz66z7q6no73rtlkbffoitd5bjhhg5l6jxdy3z4sfnf3ox9taoji1amhg85tlwbejfkh8e1f5lbe0ce2w5mv21vtqg52q6kpb09w9ptg4z99thk4fealu1y0agege21ledh9m47dagycey0ijt3ctuwj2boc2ip2ps1zwydnrgfnbqammvnp1a4b6aewib0gd3ojnizw5ergjd0pr9ytewt69ulu7dik3eyqevtbduseyr08932i3nbf0bhgla8jn9ca42jvm91ax91fx43p8v6lka3nkqgxrsyggodijdrnnqcwqqjpmlenpieimzotq77yt29qvaiz4wwexff1a3snrwh8wefauo36bkrg5z3wtmaoyo682qjvr5d2r0fypfhq6pd6tym63o5fu9t9k2v79n09amtk4vu01ywofj2x6f4t19y8ah7tsbfyfp9w8evrnsgq1eurzhevjpv20zq4z1g5rhjo120rhuibs5ybx29hao5lihs0rw6qjq0ixfjiaabbg4ki9p80jwdvkhwtt5xw5y6qav7pw3yncuejutccwg15wfnwsbo1vgjftwhc5pfpuvkzy97ywp6elaxkmra7l1z2bxomzdwvvmv0j4onkzbwbntlr6meu209uanj25qi40nfu8q3pyqe0q108wlfx3e8opkk98lphckyhzbr25bf3eb494t6py3ulgc0itj7crt2ggo7bfz0mhzmh4r6r689f3wxtoo6onrb3utn9tu2g9t5fiozr731g8klcmrue6fl0y37w6hgtzbfz7pvf662qbzfmbhb5t2hiez0yp16baqh6q4agx3ev8ijmzezodc3v5ux9ofqjiyd2n4e7e75kolnbh01uk4dpl92v0asq0y3zsrdd3xblf40svwfb6e8s6m0g7a5ltse56rlefs3ehxamt9sajm422iorqw2ccdij52nuyjrv0nq4yet697iwcjqi6hajoq == 
\t\t\o\z\f\o\9\5\f\q\o\k\s\5\x\b\w\5\x\o\w\t\j\j\2\3\x\r\p\7\a\2\f\d\h\l\h\n\v\h\o\6\a\7\q\q\a\5\a\c\v\m\d\a\0\3\6\r\7\z\0\a\j\2\3\2\g\l\8\i\h\l\6\e\6\q\8\4\3\n\2\4\v\9\a\j\w\m\9\v\2\x\2\s\j\w\z\g\i\z\6\l\m\9\b\6\a\o\l\q\l\6\0\d\5\s\q\v\d\x\e\2\3\d\a\o\f\9\3\r\x\t\z\x\s\x\7\v\b\d\f\y\1\4\i\g\n\c\5\x\s\n\o\t\r\g\6\1\z\e\m\7\9\c\5\d\2\5\h\m\t\4\a\p\c\a\0\x\f\p\q\y\j\g\c\0\h\r\n\7\k\3\l\h\o\h\v\t\q\r\g\3\4\p\a\a\h\u\u\7\b\r\a\t\m\d\g\i\6\3\y\u\0\u\9\7\4\f\f\m\n\j\z\p\h\e\t\h\1\9\a\v\c\f\s\n\0\q\3\u\n\k\0\1\a\5\n\x\n\m\1\6\v\r\9\8\8\m\o\o\4\w\0\p\k\b\5\m\r\y\z\w\c\z\e\c\y\h\4\y\i\6\3\f\f\y\y\k\h\d\p\2\c\a\l\j\2\y\p\t\i\m\d\e\v\1\a\l\o\e\i\y\5\1\d\j\n\l\t\t\b\d\1\x\g\5\9\z\8\p\1\w\q\z\v\2\h\p\7\8\k\h\9\m\v\e\w\0\2\1\8\g\j\f\p\2\f\c\2\4\v\y\9\r\t\9\9\2\w\k\w\r\s\z\7\8\g\n\4\3\t\r\e\m\q\l\3\z\m\6\n\b\0\n\9\v\z\z\j\f\u\8\n\0\r\p\y\u\s\r\u\e\m\q\r\o\0\a\b\w\0\8\k\7\l\g\t\k\6\3\2\q\l\c\c\z\d\8\c\7\7\b\k\8\9\r\f\h\l\w\a\w\2\w\q\p\3\o\0\z\u\b\2\5\e\p\h\g\i\a\a\p\m\a\l\o\5\v\e\3\3\y\h\y\8\b\5\i\p\p\9\e\m\n\x\z\m\i\x\j\o\o\r\5\4\j\l\p\k\4\k\2\y\6\p\x\n\j\o\t\h\g\9\4\o\2\3\g\n\b\1\1\k\1\6\l\4\s\z\s\h\h\1\e\b\z\a\c\n\n\v\5\0\x\l\m\t\e\g\1\c\r\y\h\s\r\l\l\p\e\8\q\4\t\w\6\z\3\k\k\2\k\l\2\6\i\v\c\1\z\k\r\n\3\z\e\q\p\z\9\w\2\o\b\l\y\3\8\y\x\7\v\y\r\t\u\0\s\g\c\s\i\1\2\p\9\d\i\j\l\s\h\9\2\o\k\p\3\2\e\z\r\t\a\n\9\q\1\g\q\t\d\a\z\s\r\o\5\5\m\v\x\u\g\q\i\q\n\b\j\6\e\h\g\w\j\3\k\d\9\b\e\0\i\h\6\d\0\c\c\r\x\x\q\j\5\5\u\5\s\f\1\j\2\0\k\h\2\u\w\u\z\5\i\a\g\v\u\o\1\x\b\0\m\c\y\t\a\f\n\y\0\w\6\v\3\l\y\w\u\9\l\9\0\3\c\e\v\m\i\0\2\0\g\t\5\i\j\j\y\j\0\3\w\s\z\p\y\l\1\1\1\o\c\4\d\x\h\7\3\5\n\z\y\d\0\s\q\r\r\e\p\0\8\t\e\z\1\e\f\4\p\o\i\z\3\o\0\k\k\7\z\g\x\4\v\h\g\y\x\g\r\n\x\r\0\l\p\o\7\c\p\f\3\l\d\t\u\k\7\q\l\u\8\8\l\q\x\v\m\y\c\0\c\k\m\8\g\w\f\5\l\s\4\4\q\w\i\p\6\y\8\n\7\9\e\f\m\4\p\c\d\u\y\i\s\n\1\6\p\e\m\x\y\8\v\d\8\g\e\n\i\p\b\l\f\s\9\z\h\h\9\7\y\6\9\7\i\o\8\y\9\3\k\3\v\u\4\5\w\r\i\x\4\y\4\n\y\7\0\3\j\f\l\d\h\x\4\z\i\w\n\z\7\4\2\9\u\v\q\r\5\k\4\3\8\5\t\q\g\0\j\y\n\j\n\9\q\a\i\u\u\4\k\j\7\b\i\m\y\z\i\5\6\l\m\w\g\z\5\5\9\3\d\8\7\j\b\8\r\i\l\0\8\e\o\8\n\f\s\o\q\z\6\5\4\t\h\8\i\8\s\2\w\1\j\t\x\y\s\g\t\c\1\e\k\z\i\q\8\p\w\t\r\4\f\3\7\c\y\d\v\0\g\g\y\x\f\c\4\r\m\r\b\8\v\w\m\9\f\n\u\p\5\i\m\z\6\u\o\q\h\p\l\k\n\q\w\p\m\a\o\v\k\7\7\o\n\o\n\m\3\a\t\j\m\1\9\x\u\z\v\u\o\8\h\j\6\k\l\a\z\d\m\d\1\1\7\j\0\g\6\0\9\j\b\u\4\o\k\8\q\w\4\n\9\d\o\9\d\q\t\a\w\h\w\8\f\k\9\5\0\b\3\9\0\k\w\2\4\y\m\1\j\u\1\m\v\3\y\s\m\q\s\q\d\2\7\a\3\j\a\6\b\c\i\z\x\f\f\x\7\w\2\1\x\c\w\k\d\q\3\c\0\y\e\z\r\b\e\y\l\j\y\7\l\k\a\t\n\u\8\6\1\e\e\w\7\q\u\x\w\1\o\2\g\c\4\m\3\i\j\c\n\y\s\1\8\2\v\z\u\5\4\0\l\1\o\8\n\s\1\1\z\8\6\q\h\9\w\9\b\2\k\3\z\b\o\q\w\c\0\i\5\p\j\j\6\9\c\m\g\c\s\c\a\d\d\x\l\l\l\j\j\w\u\l\z\2\f\6\1\c\8\s\m\l\w\y\2\0\t\n\h\j\0\r\0\g\m\g\7\g\k\x\5\p\h\c\8\r\v\h\n\k\m\4\z\u\1\g\q\8\l\v\c\3\7\t\q\i\j\k\a\b\a\s\m\j\b\f\9\y\e\4\s\w\1\a\f\4\b\q\t\v\e\y\d\x\r\4\e\z\q\y\m\a\4\p\y\t\g\r\j\x\n\o\v\z\b\q\q\v\r\p\i\n\z\t\0\k\6\6\r\q\k\p\c\t\t\j\1\1\i\5\l\9\e\5\y\j\o\e\d\o\b\n\p\g\z\i\u\g\c\t\6\g\5\c\t\v\u\o\8\c\p\i\e\0\n\7\l\r\d\r\k\x\j\p\g\k\c\o\m\r\o\e\b\a\x\m\k\r\0\i\b\p\b\a\w\e\i\p\3\y\7\2\z\h\j\m\z\t\5\1\a\j\s\4\5\x\2\u\u\3\a\g\0\1\k\6\5\m\e\b\t\g\e\x\a\z\l\d\r\n\l\l\n\r\9\3\s\w\v\5\d\h\6\2\d\a\h\9\0\g\3\d\4\b\f\z\m\a\t\e\u\e\u\9\g\3\e\j\x\4\1\m\5\1\k\r\i\n\z\5\4\4\i\z\j\n\c\8\z\p\3\w\a\i\2\g\g\l\1\0\d\v\l\d\v\5\o\7\q\q\y\2\9\k\a\x\8\z\8\t\1\y\s\6\9\5\x\x\0\s\k\0\6\c\s\k\l\g\f\q\g\h\5\2\f\z\z\2\u\b\z\u\l\j\b\3\c\7\p\o\x\1\8\c\w\x\e\r\9\0\9\u\u\5\o\u\2\2\v\m\q\q\q\t\5\g\7\n\0\d\n\m\a\g\a\s\m\2\a\b\q\n\c\p\i\2\q\o\o\b\m\5\t\4\k\6\0\z\f\5\t\j\n\t\e\a\1\r\j\p\a\k\
l\a\3\e\e\2\s\w\o\b\l\i\c\q\e\u\n\l\a\m\z\b\0\p\d\e\m\t\c\2\9\4\w\w\s\u\p\8\m\3\c\x\1\b\g\h\d\e\f\p\x\a\z\c\p\r\4\3\2\8\p\m\c\m\y\8\j\a\g\q\u\q\0\t\7\v\x\h\g\e\d\v\4\o\i\t\0\r\p\7\g\3\j\n\q\c\2\n\z\0\o\q\2\z\y\9\8\1\e\2\8\q\1\o\8\m\9\0\k\o\k\u\j\j\0\j\t\a\u\4\k\6\w\w\k\o\k\l\0\f\2\o\3\s\p\8\s\1\7\k\f\n\t\b\w\e\q\9\w\1\y\i\3\x\h\d\v\c\m\8\4\y\8\u\j\5\e\e\t\j\h\e\q\5\k\s\2\9\4\c\a\d\a\j\a\c\s\d\r\l\9\6\4\9\j\w\n\y\o\8\x\7\p\2\i\3\e\f\6\e\0\s\i\i\8\o\3\h\x\e\e\d\t\9\8\4\f\x\o\j\l\p\q\h\d\s\c\w\t\7\f\p\h\p\m\o\t\c\1\k\h\9\f\f\h\7\b\n\x\v\p\p\2\h\u\9\h\4\6\m\q\9\f\e\z\u\9\q\3\j\y\k\i\s\4\1\6\c\g\5\0\1\e\k\5\e\g\2\b\e\1\l\y\f\z\n\n\z\a\a\a\f\7\7\a\2\p\y\b\x\q\j\3\m\0\r\i\p\p\o\h\h\f\d\h\6\q\6\m\7\i\s\9\d\y\9\r\k\2\o\t\5\1\a\u\7\i\y\f\n\p\z\y\n\6\k\z\x\9\x\q\a\w\u\j\v\y\v\5\c\q\v\h\z\9\m\p\b\d\g\g\w\9\a\q\1\c\n\c\f\x\q\6\h\r\e\x\r\i\2\d\o\3\0\v\d\i\5\n\c\d\4\n\4\r\j\y\v\a\1\0\o\2\r\l\j\4\q\p\t\u\x\f\f\b\y\t\f\d\m\f\x\v\s\i\m\o\p\9\i\p\4\c\t\h\3\9\9\f\y\z\w\1\f\o\j\0\h\k\z\d\v\2\3\t\n\z\a\9\5\k\9\a\q\z\r\s\y\8\n\h\i\4\0\s\4\3\c\l\j\t\0\m\u\4\1\3\a\b\3\d\v\d\y\j\f\k\s\p\n\p\e\s\t\h\x\y\7\3\a\v\r\q\3\u\6\a\5\b\9\l\q\k\d\i\x\h\n\d\4\3\y\r\6\x\k\r\c\l\b\x\b\q\p\l\m\n\t\d\a\n\g\s\6\k\o\n\l\l\o\v\5\h\4\b\g\8\5\9\7\g\2\q\v\n\u\5\h\3\h\t\e\c\n\7\t\z\9\e\f\6\p\7\n\v\m\3\h\k\8\i\u\t\b\5\b\b\v\s\i\7\7\d\b\q\o\0\v\w\4\x\6\t\o\4\3\x\9\4\3\3\8\n\z\w\p\k\n\6\m\s\o\0\q\s\q\t\a\v\3\t\w\d\7\y\i\6\x\9\i\d\3\x\h\g\l\t\1\k\x\a\5\z\j\y\0\g\8\b\q\b\f\x\4\j\9\8\9\1\5\0\v\a\7\u\j\b\l\5\4\4\g\q\0\e\r\a\5\y\b\3\r\b\u\p\e\a\8\9\r\t\o\p\r\y\g\j\1\i\r\7\k\2\j\c\o\9\4\g\o\f\l\y\8\1\m\s\z\m\6\w\8\9\j\2\x\f\u\z\e\1\6\r\9\6\e\k\d\i\2\u\j\g\9\o\y\q\u\e\l\v\g\j\2\y\5\2\t\9\d\s\3\7\f\4\r\z\e\z\w\o\7\x\8\o\7\c\j\y\n\v\u\b\a\g\o\l\b\g\v\c\4\y\4\m\3\c\o\c\7\u\p\i\g\f\4\e\4\f\6\i\u\y\0\q\0\l\g\h\7\2\0\s\n\c\0\j\3\q\x\f\i\s\r\9\r\5\b\s\l\0\a\7\r\g\g\e\i\r\o\3\a\k\u\y\5\r\4\l\8\u\i\9\o\v\t\z\w\j\m\e\8\p\3\9\4\j\6\o\g\r\q\e\y\h\y\8\l\k\x\c\e\1\1\3\p\4\e\n\0\4\z\u\o\d\6\x\d\9\z\o\m\t\u\k\7\j\p\h\6\e\r\h\6\x\4\t\o\7\3\c\x\c\8\0\x\k\x\5\z\w\j\i\s\q\y\m\e\5\7\i\g\k\f\r\x\k\i\c\6\p\g\c\p\5\f\3\h\t\1\6\z\o\2\p\k\i\k\7\3\u\k\x\2\3\4\r\9\w\h\g\4\p\e\u\3\y\4\d\h\b\3\p\e\m\2\j\y\d\8\8\n\x\r\9\o\7\j\e\l\t\u\f\f\v\o\t\n\p\u\m\k\p\f\8\r\c\w\h\x\y\5\5\w\a\d\d\c\6\0\b\2\k\8\c\6\p\d\v\w\m\2\u\d\c\e\0\5\j\u\2\0\v\o\y\l\g\y\k\0\3\1\0\2\h\5\5\x\9\y\f\w\l\h\g\o\8\q\u\t\0\w\h\f\8\d\1\2\p\e\t\a\d\9\p\8\r\w\8\j\e\s\o\0\u\p\5\4\7\t\f\j\o\u\1\j\0\d\d\u\x\8\t\f\r\u\a\6\8\d\q\v\r\y\2\3\q\8\d\s\q\f\g\7\0\3\7\1\5\q\c\i\x\6\7\r\7\9\5\p\q\e\n\7\0\5\y\9\z\1\j\e\s\f\e\1\o\6\i\i\q\p\m\t\i\h\a\y\r\a\u\a\c\v\r\5\3\e\m\7\t\d\e\1\t\f\o\g\8\3\d\v\d\4\r\u\p\w\m\x\x\6\v\k\0\t\d\w\a\3\9\h\q\u\c\6\n\y\0\a\h\4\x\y\1\e\z\z\6\6\z\7\q\6\n\o\7\3\r\t\l\k\b\f\f\o\i\t\d\5\b\j\h\h\g\5\l\6\j\x\d\y\3\z\4\s\f\n\f\3\o\x\9\t\a\o\j\i\1\a\m\h\g\8\5\t\l\w\b\e\j\f\k\h\8\e\1\f\5\l\b\e\0\c\e\2\w\5\m\v\2\1\v\t\q\g\5\2\q\6\k\p\b\0\9\w\9\p\t\g\4\z\9\9\t\h\k\4\f\e\a\l\u\1\y\0\a\g\e\g\e\2\1\l\e\d\h\9\m\4\7\d\a\g\y\c\e\y\0\i\j\t\3\c\t\u\w\j\2\b\o\c\2\i\p\2\p\s\1\z\w\y\d\n\r\g\f\n\b\q\a\m\m\v\n\p\1\a\4\b\6\a\e\w\i\b\0\g\d\3\o\j\n\i\z\w\5\e\r\g\j\d\0\p\r\9\y\t\e\w\t\6\9\u\l\u\7\d\i\k\3\e\y\q\e\v\t\b\d\u\s\e\y\r\0\8\9\3\2\i\3\n\b\f\0\b\h\g\l\a\8\j\n\9\c\a\4\2\j\v\m\9\1\a\x\9\1\f\x\4\3\p\8\v\6\l\k\a\3\n\k\q\g\x\r\s\y\g\g\o\d\i\j\d\r\n\n\q\c\w\q\q\j\p\m\l\e\n\p\i\e\i\m\z\o\t\q\7\7\y\t\2\9\q\v\a\i\z\4\w\w\e\x\f\f\1\a\3\s\n\r\w\h\8\w\e\f\a\u\o\3\6\b\k\r\g\5\z\3\w\t\m\a\o\y\o\6\8\2\q\j\v\r\5\d\2\r\0\f\y\p\f\h\q\6\p\d\6\t\y\m\6\3\o\5\f\u\9\t\9\k\2\v\7\9\n\0\9\a\m\t\k\4\v\u\0\1\y\w\o\f\j\2\x\6\f\4\t\1\9\y\8\a\h\7\t\s\b
\f\y\f\p\9\w\8\e\v\r\n\s\g\q\1\e\u\r\z\h\e\v\j\p\v\2\0\z\q\4\z\1\g\5\r\h\j\o\1\2\0\r\h\u\i\b\s\5\y\b\x\2\9\h\a\o\5\l\i\h\s\0\r\w\6\q\j\q\0\i\x\f\j\i\a\a\b\b\g\4\k\i\9\p\8\0\j\w\d\v\k\h\w\t\t\5\x\w\5\y\6\q\a\v\7\p\w\3\y\n\c\u\e\j\u\t\c\c\w\g\1\5\w\f\n\w\s\b\o\1\v\g\j\f\t\w\h\c\5\p\f\p\u\v\k\z\y\9\7\y\w\p\6\e\l\a\x\k\m\r\a\7\l\1\z\2\b\x\o\m\z\d\w\v\v\m\v\0\j\4\o\n\k\z\b\w\b\n\t\l\r\6\m\e\u\2\0\9\u\a\n\j\2\5\q\i\4\0\n\f\u\8\q\3\p\y\q\e\0\q\1\0\8\w\l\f\x\3\e\8\o\p\k\k\9\8\l\p\h\c\k\y\h\z\b\r\2\5\b\f\3\e\b\4\9\4\t\6\p\y\3\u\l\g\c\0\i\t\j\7\c\r\t\2\g\g\o\7\b\f\z\0\m\h\z\m\h\4\r\6\r\6\8\9\f\3\w\x\t\o\o\6\o\n\r\b\3\u\t\n\9\t\u\2\g\9\t\5\f\i\o\z\r\7\3\1\g\8\k\l\c\m\r\u\e\6\f\l\0\y\3\7\w\6\h\g\t\z\b\f\z\7\p\v\f\6\6\2\q\b\z\f\m\b\h\b\5\t\2\h\i\e\z\0\y\p\1\6\b\a\q\h\6\q\4\a\g\x\3\e\v\8\i\j\m\z\e\z\o\d\c\3\v\5\u\x\9\o\f\q\j\i\y\d\2\n\4\e\7\e\7\5\k\o\l\n\b\h\0\1\u\k\4\d\p\l\9\2\v\0\a\s\q\0\y\3\z\s\r\d\d\3\x\b\l\f\4\0\s\v\w\f\b\6\e\8\s\6\m\0\g\7\a\5\l\t\s\e\5\6\r\l\e\f\s\3\e\h\x\a\m\t\9\s\a\j\m\4\2\2\i\o\r\q\w\2\c\c\d\i\j\5\2\n\u\y\j\r\v\0\n\q\4\y\e\t\6\9\7\i\w\c\j\q\i\6\h\a\j\o\q ]] 00:41:44.677 00:41:44.677 real 0m1.871s 00:41:44.677 user 0m1.130s 00:41:44.677 sys 0m0.604s 00:41:44.677 12:44:08 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:44.677 12:44:08 spdk_dd.spdk_dd_basic_rw.dd_rw_offset -- common/autotest_common.sh@10 -- # set +x 00:41:44.677 ************************************ 00:41:44.677 END TEST dd_rw_offset 00:41:44.677 ************************************ 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@1 -- # cleanup 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@76 -- # clear_nvme Nvme0n1 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@11 -- # local nvme_ref= 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@12 -- # local size=0xffff 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@14 -- # local bs=1048576 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@15 -- # local count=1 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=1 --json /dev/fd/62 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@18 -- # gen_conf 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- dd/common.sh@31 -- # xtrace_disable 00:41:44.678 12:44:08 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@10 -- # set +x 00:41:44.678 [2024-06-07 12:44:08.305341] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
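The dd_rw_offset test that just finished writes a 4 KiB random payload one block into the bdev (--seek=1), reads it back from the same offset (--skip=1 --count=1), and compares the generated payload with what came back using read -rn4096 and a pattern match. A condensed, hedged reconstruction of that round trip (bdev_conf is again a hypothetical stand-in for gen_conf, and the payload generation only approximates gen_bytes with printable characters):

SPDK_DD=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd
f0=/tmp/dd.dump0; f1=/tmp/dd.dump1

bdev_conf() {
  cat <<'EOF'
{"subsystems":[{"subsystem":"bdev","config":[
 {"params":{"trtype":"pcie","traddr":"0000:00:10.0","name":"Nvme0"},
  "method":"bdev_nvme_attach_controller"},
 {"method":"bdev_wait_for_examine"}]}]}
EOF
}

tr -dc 'a-z0-9' < /dev/urandom | head -c 4096 > "$f0"   # stands in for gen_bytes 4096

"$SPDK_DD" --if="$f0" --ob=Nvme0n1 --seek=1 --json <(bdev_conf)            # write at offset
"$SPDK_DD" --ib=Nvme0n1 --of="$f1" --skip=1 --count=1 --json <(bdev_conf)  # read it back

read -rn4096 data       < "$f0"
read -rn4096 data_check < "$f1"
[[ "$data" == "$data_check" ]] && echo "offset round trip OK"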
00:41:44.678 [2024-06-07 12:44:08.306587] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233491 ] 00:41:44.678 { 00:41:44.678 "subsystems": [ 00:41:44.678 { 00:41:44.678 "subsystem": "bdev", 00:41:44.678 "config": [ 00:41:44.678 { 00:41:44.678 "params": { 00:41:44.678 "trtype": "pcie", 00:41:44.678 "traddr": "0000:00:10.0", 00:41:44.678 "name": "Nvme0" 00:41:44.678 }, 00:41:44.678 "method": "bdev_nvme_attach_controller" 00:41:44.678 }, 00:41:44.678 { 00:41:44.678 "method": "bdev_wait_for_examine" 00:41:44.678 } 00:41:44.678 ] 00:41:44.678 } 00:41:44.678 ] 00:41:44.678 } 00:41:44.943 [2024-06-07 12:44:08.462866] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:44.943 [2024-06-07 12:44:08.565386] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:45.772  Copying: 1024/1024 [kB] (average 500 MBps) 00:41:45.772 00:41:45.772 12:44:09 spdk_dd.spdk_dd_basic_rw -- dd/basic_rw.sh@77 -- # rm -f /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:45.772 00:41:45.772 real 0m24.472s 00:41:45.772 user 0m15.830s 00:41:45.772 sys 0m6.974s 00:41:45.772 12:44:09 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:45.772 ************************************ 00:41:45.772 12:44:09 spdk_dd.spdk_dd_basic_rw -- common/autotest_common.sh@10 -- # set +x 00:41:45.772 END TEST spdk_dd_basic_rw 00:41:45.772 ************************************ 00:41:45.772 12:44:09 spdk_dd -- dd/dd.sh@21 -- # run_test spdk_dd_posix /home/vagrant/spdk_repo/spdk/test/dd/posix.sh 00:41:45.772 12:44:09 spdk_dd -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:45.772 12:44:09 spdk_dd -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:45.772 12:44:09 spdk_dd -- common/autotest_common.sh@10 -- # set +x 00:41:45.772 ************************************ 00:41:45.772 START TEST spdk_dd_posix 00:41:45.772 ************************************ 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dd/posix.sh 00:41:45.772 * Looking for test storage... 
00:41:45.772 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dd 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- paths/export.sh@5 -- # export PATH 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@121 -- # msg[0]=', using AIO' 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@122 -- # msg[1]=', liburing in use' 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@123 -- # msg[2]=', disabling liburing, forcing AIO' 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@125 -- # trap cleanup EXIT 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@127 -- # test_file0=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@128 -- # test_file1=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@130 -- # tests 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@99 -- # 
printf '* First test run%s\n' ', using AIO' 00:41:45.772 * First test run, using AIO 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- dd/posix.sh@102 -- # run_test dd_flag_append append 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:41:45.772 ************************************ 00:41:45.772 START TEST dd_flag_append 00:41:45.772 ************************************ 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- common/autotest_common.sh@1124 -- # append 00:41:45.772 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@16 -- # local dump0 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@17 -- # local dump1 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@19 -- # gen_bytes 32 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/common.sh@98 -- # xtrace_disable 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- common/autotest_common.sh@10 -- # set +x 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@19 -- # dump0=3n7skk3vsgf38hvvlhi4kydx7481n35q 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@20 -- # gen_bytes 32 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/common.sh@98 -- # xtrace_disable 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- common/autotest_common.sh@10 -- # set +x 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@20 -- # dump1=wqe9xf4qdz88crnexcte0wmf66nvjvzn 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@22 -- # printf %s 3n7skk3vsgf38hvvlhi4kydx7481n35q 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@23 -- # printf %s wqe9xf4qdz88crnexcte0wmf66nvjvzn 00:41:45.773 12:44:09 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@25 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=append 00:41:45.773 [2024-06-07 12:44:09.401860] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
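The dd_flag_append test launched above seeds two 32-character strings into dd.dump0 and dd.dump1 and then copies dump0 onto dump1 with --oflag=append, so the expected result is dump1's original content followed by dump0's. A self-contained sketch of the same check (file paths are placeholders, and the random-string generation stands in for gen_bytes 32):

SPDK_DD=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd
f0=/tmp/dd.dump0; f1=/tmp/dd.dump1

dump0=$(tr -dc 'a-z0-9' < /dev/urandom | head -c 32)
dump1=$(tr -dc 'a-z0-9' < /dev/urandom | head -c 32)
printf %s "$dump0" > "$f0"
printf %s "$dump1" > "$f1"

"$SPDK_DD" --if="$f0" --of="$f1" --oflag=append

# dump1 must now be its original 32 bytes with dump0's 32 bytes appended.
[[ $(<"$f1") == "${dump1}${dump0}" ]] && echo "append OK"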
00:41:45.773 [2024-06-07 12:44:09.402719] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233568 ] 00:41:46.030 [2024-06-07 12:44:09.560406] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:46.030 [2024-06-07 12:44:09.658775] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:46.546  Copying: 32/32 [B] (average 31 kBps) 00:41:46.546 00:41:46.546 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_append -- dd/posix.sh@27 -- # [[ wqe9xf4qdz88crnexcte0wmf66nvjvzn3n7skk3vsgf38hvvlhi4kydx7481n35q == \w\q\e\9\x\f\4\q\d\z\8\8\c\r\n\e\x\c\t\e\0\w\m\f\6\6\n\v\j\v\z\n\3\n\7\s\k\k\3\v\s\g\f\3\8\h\v\v\l\h\i\4\k\y\d\x\7\4\8\1\n\3\5\q ]] 00:41:46.546 00:41:46.546 real 0m0.818s 00:41:46.546 user 0m0.436s 00:41:46.546 sys 0m0.262s 00:41:46.546 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_append -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:46.546 ************************************ 00:41:46.546 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_append -- common/autotest_common.sh@10 -- # set +x 00:41:46.546 END TEST dd_flag_append 00:41:46.546 ************************************ 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix -- dd/posix.sh@103 -- # run_test dd_flag_directory directory 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:41:46.804 ************************************ 00:41:46.804 START TEST dd_flag_directory 00:41:46.804 ************************************ 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@1124 -- # directory 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- dd/posix.sh@31 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=directory --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@649 -- # local es=0 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=directory --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@643 -- # 
arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:41:46.804 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=directory --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:41:46.805 [2024-06-07 12:44:10.284312] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:46.805 [2024-06-07 12:44:10.284642] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233594 ] 00:41:46.805 [2024-06-07 12:44:10.420923] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:47.079 [2024-06-07 12:44:10.513470] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:47.079 [2024-06-07 12:44:10.636449] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:41:47.079 [2024-06-07 12:44:10.636567] spdk_dd.c:1083:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:41:47.079 [2024-06-07 12:44:10.636618] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:41:47.336 [2024-06-07 12:44:10.830121] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@652 -- # es=236 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@661 -- # es=108 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@662 -- # case "$es" in 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@669 -- # es=1 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- dd/posix.sh@32 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=directory 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@649 -- # local es=0 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=directory 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- 
common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:41:47.595 12:44:10 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=directory 00:41:47.595 [2024-06-07 12:44:11.041688] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:47.595 [2024-06-07 12:44:11.042633] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233615 ] 00:41:47.595 [2024-06-07 12:44:11.193599] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:47.853 [2024-06-07 12:44:11.296696] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:47.853 [2024-06-07 12:44:11.426931] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:41:47.853 [2024-06-07 12:44:11.427070] spdk_dd.c:1132:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:41:47.853 [2024-06-07 12:44:11.427135] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:41:48.111 [2024-06-07 12:44:11.633404] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@652 -- # es=236 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@661 -- # es=108 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@662 -- # case "$es" in 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@669 -- # es=1 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:41:48.368 00:41:48.368 real 0m1.556s 00:41:48.368 user 0m0.832s 00:41:48.368 sys 0m0.513s 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:48.368 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_directory -- common/autotest_common.sh@10 -- # set +x 00:41:48.368 ************************************ 00:41:48.369 END TEST dd_flag_directory 00:41:48.369 ************************************ 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix -- dd/posix.sh@104 -- # run_test dd_flag_nofollow nofollow 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:41:48.369 ************************************ 00:41:48.369 START TEST dd_flag_nofollow 00:41:48.369 
************************************ 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@1124 -- # nofollow 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@36 -- # local test_file0_link=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@37 -- # local test_file1_link=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@39 -- # ln -fs /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@40 -- # ln -fs /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@42 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --iflag=nofollow --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@649 -- # local es=0 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --iflag=nofollow --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:41:48.369 12:44:11 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --iflag=nofollow --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:48.369 [2024-06-07 12:44:11.913379] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
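The two NOT-wrapped invocations in this test feed spdk_dd a symlinked input (and then a symlinked output) together with --iflag=nofollow / --oflag=nofollow, expecting the "Too many levels of symbolic links" failures recorded below; the final copy without the flag follows the link normally. A hedged standalone sketch of the input-side case (paths are placeholders):

SPDK_DD=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd
f0=/tmp/dd.dump0; f1=/tmp/dd.dump1

tr -dc 'a-z0-9' < /dev/urandom | head -c 512 > "$f0"   # stands in for gen_bytes 512
ln -fs "$f0" "$f0.link"

if ! "$SPDK_DD" --if="$f0.link" --iflag=nofollow --of="$f1"; then
  echo "nofollow rejected the symlinked input, as expected"
fi

"$SPDK_DD" --if="$f0.link" --of="$f1"   # without the flag the link is followed
cmp -s "$f0" "$f1" && echo "contents match"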
00:41:48.369 [2024-06-07 12:44:11.913656] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233648 ] 00:41:48.627 [2024-06-07 12:44:12.059739] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:48.627 [2024-06-07 12:44:12.157170] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:48.936 [2024-06-07 12:44:12.282952] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link: Too many levels of symbolic links 00:41:48.936 [2024-06-07 12:44:12.283073] spdk_dd.c:1083:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link: Too many levels of symbolic links 00:41:48.936 [2024-06-07 12:44:12.283116] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:41:48.936 [2024-06-07 12:44:12.487355] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@652 -- # es=216 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@661 -- # es=88 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@662 -- # case "$es" in 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@669 -- # es=1 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@43 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link --oflag=nofollow 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@649 -- # local es=0 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link --oflag=nofollow 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:41:49.194 12:44:12 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:41:49.194 12:44:12 
spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link --oflag=nofollow 00:41:49.194 [2024-06-07 12:44:12.716688] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:41:49.194 [2024-06-07 12:44:12.716940] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233669 ] 00:41:49.451 [2024-06-07 12:44:12.857303] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:49.451 [2024-06-07 12:44:12.961923] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:49.451 [2024-06-07 12:44:13.089360] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link: Too many levels of symbolic links 00:41:49.451 [2024-06-07 12:44:13.089505] spdk_dd.c:1132:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link: Too many levels of symbolic links 00:41:49.451 [2024-06-07 12:44:13.089553] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:41:49.708 [2024-06-07 12:44:13.287862] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@652 -- # es=216 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@661 -- # es=88 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@662 -- # case "$es" in 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@669 -- # es=1 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@46 -- # gen_bytes 512 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/common.sh@98 -- # xtrace_disable 00:41:49.965 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@10 -- # set +x 00:41:49.966 12:44:13 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@48 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:41:49.966 [2024-06-07 12:44:13.506284] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:49.966 [2024-06-07 12:44:13.506587] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233683 ] 00:41:50.222 [2024-06-07 12:44:13.653090] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:50.222 [2024-06-07 12:44:13.753350] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:50.738  Copying: 512/512 [B] (average 500 kBps) 00:41:50.738 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- dd/posix.sh@49 -- # [[ p4vyjxjsq1ckc52pley2anht2eeyeafwik5ectyvokz56p9tza87mwg2s88cta705o4gytemzey66fux46s0939o3k3d0oeayy7ridvsxjibsonpbuqnwa721clqevgoh3okv0vmh6s6o1jcrz04qkupbgefl7bhss56rqf3cmchpjxv7rph7u5f2wfqaqdoay3hl3buql1txvth9nex4k3p5qhhdi2avy66i4f6t9yxcbxzoxfocwylb3h8fx1vote1z7s43k5dtpqqc7ysz72scw2plvkwhmxog5cvvlrkdt59o4e7zd93wh4u9j8xepez4ise5uzr4ujbpegu86y8x21aug3bm9ylcvld8lyllgnrpohz0e5p8w7hxyhzc96plrhq50tglpk0hjtoq6s0icd3duuj4n6zyk6fck0020f1krz1kyuzwk28s6npowbmunxsrzybetz2rz466591f98vshvaoe3zpb3b1w0uzuaorbmui7mu6p1tn5ou == \p\4\v\y\j\x\j\s\q\1\c\k\c\5\2\p\l\e\y\2\a\n\h\t\2\e\e\y\e\a\f\w\i\k\5\e\c\t\y\v\o\k\z\5\6\p\9\t\z\a\8\7\m\w\g\2\s\8\8\c\t\a\7\0\5\o\4\g\y\t\e\m\z\e\y\6\6\f\u\x\4\6\s\0\9\3\9\o\3\k\3\d\0\o\e\a\y\y\7\r\i\d\v\s\x\j\i\b\s\o\n\p\b\u\q\n\w\a\7\2\1\c\l\q\e\v\g\o\h\3\o\k\v\0\v\m\h\6\s\6\o\1\j\c\r\z\0\4\q\k\u\p\b\g\e\f\l\7\b\h\s\s\5\6\r\q\f\3\c\m\c\h\p\j\x\v\7\r\p\h\7\u\5\f\2\w\f\q\a\q\d\o\a\y\3\h\l\3\b\u\q\l\1\t\x\v\t\h\9\n\e\x\4\k\3\p\5\q\h\h\d\i\2\a\v\y\6\6\i\4\f\6\t\9\y\x\c\b\x\z\o\x\f\o\c\w\y\l\b\3\h\8\f\x\1\v\o\t\e\1\z\7\s\4\3\k\5\d\t\p\q\q\c\7\y\s\z\7\2\s\c\w\2\p\l\v\k\w\h\m\x\o\g\5\c\v\v\l\r\k\d\t\5\9\o\4\e\7\z\d\9\3\w\h\4\u\9\j\8\x\e\p\e\z\4\i\s\e\5\u\z\r\4\u\j\b\p\e\g\u\8\6\y\8\x\2\1\a\u\g\3\b\m\9\y\l\c\v\l\d\8\l\y\l\l\g\n\r\p\o\h\z\0\e\5\p\8\w\7\h\x\y\h\z\c\9\6\p\l\r\h\q\5\0\t\g\l\p\k\0\h\j\t\o\q\6\s\0\i\c\d\3\d\u\u\j\4\n\6\z\y\k\6\f\c\k\0\0\2\0\f\1\k\r\z\1\k\y\u\z\w\k\2\8\s\6\n\p\o\w\b\m\u\n\x\s\r\z\y\b\e\t\z\2\r\z\4\6\6\5\9\1\f\9\8\v\s\h\v\a\o\e\3\z\p\b\3\b\1\w\0\u\z\u\a\o\r\b\m\u\i\7\m\u\6\p\1\t\n\5\o\u ]] 00:41:50.738 ************************************ 00:41:50.738 END TEST dd_flag_nofollow 00:41:50.738 ************************************ 00:41:50.738 00:41:50.738 real 0m2.394s 00:41:50.738 user 0m1.298s 00:41:50.738 sys 0m0.759s 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_nofollow -- common/autotest_common.sh@10 -- # set +x 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix -- dd/posix.sh@105 -- # run_test dd_flag_noatime noatime 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:41:50.738 ************************************ 00:41:50.738 START TEST dd_flag_noatime 00:41:50.738 ************************************ 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- common/autotest_common.sh@1124 -- # noatime 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@53 -- # local atime_if 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@54 -- # local atime_of 00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@58 
-- # gen_bytes 512
00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/common.sh@98 -- # xtrace_disable
00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- common/autotest_common.sh@10 -- # set +x
00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@60 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0
00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@60 -- # atime_if=1717764253
00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@61 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@61 -- # atime_of=1717764254
00:41:50.738 12:44:14 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@66 -- # sleep 1
00:41:52.120 12:44:15 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@68 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=noatime --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:41:52.120 [2024-06-07 12:44:15.382089] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization...
[2024-06-07 12:44:15.382405] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233729 ]
[2024-06-07 12:44:15.523428] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-07 12:44:15.621573] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:41:52.686  Copying: 512/512 [B] (average 500 kBps)
00:41:52.686
00:41:52.686 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@69 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0
00:41:52.686 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@69 -- # (( atime_if == 1717764253 ))
00:41:52.686 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@70 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:41:52.686 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@70 -- # (( atime_of == 1717764254 ))
00:41:52.686 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@72 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:41:52.686 [2024-06-07 12:44:16.174742] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization...
00:41:52.686 [2024-06-07 12:44:16.175071] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233743 ]
[2024-06-07 12:44:16.330725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:41:52.944 [2024-06-07 12:44:16.428678] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:41:53.510  Copying: 512/512 [B] (average 500 kBps)
00:41:53.510
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@73 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- dd/posix.sh@73 -- # (( atime_if < 1717764256 ))
00:41:53.510
00:41:53.510 real 0m2.628s
00:41:53.510 user 0m0.846s
00:41:53.510 sys 0m0.541s
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- common/autotest_common.sh@1125 -- # xtrace_disable
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix.dd_flag_noatime -- common/autotest_common.sh@10 -- # set +x
00:41:53.510 ************************************
00:41:53.510 END TEST dd_flag_noatime
00:41:53.510 ************************************
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix -- dd/posix.sh@106 -- # run_test dd_flags_misc io
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable
00:41:53.510 12:44:16 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x
00:41:53.510 ************************************
00:41:53.510 START TEST dd_flags_misc
00:41:53.510 ************************************
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- common/autotest_common.sh@1124 -- # io
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@77 -- # local flags_ro flags_rw flag_ro flag_rw
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@81 -- # flags_ro=(direct nonblock)
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@82 -- # flags_rw=("${flags_ro[@]}" sync dsync)
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@85 -- # for flag_ro in "${flags_ro[@]}"
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@86 -- # gen_bytes 512
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/common.sh@98 -- # xtrace_disable
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- common/autotest_common.sh@10 -- # set +x
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}"
00:41:53.510 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=direct
00:41:53.510 [2024-06-07 12:44:17.049892] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization...
00:41:53.510 [2024-06-07 12:44:17.050620] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233779 ] 00:41:53.768 [2024-06-07 12:44:17.188915] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:53.768 [2024-06-07 12:44:17.283265] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:54.335  Copying: 512/512 [B] (average 500 kBps) 00:41:54.335 00:41:54.335 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ jv9aqo74lf6ja8de2clqll85zyq4qqoufoka88uz7u3wk7w7h6yry5t57eongtv6ejabtlvpcz20e06cnw7u2a3hnotnnpi6qdquwkf6d24lf521lxd4w8ohxzaq094a1ovuuehllfiy8746v0wpg838l6ayaym8vhizku4oep3jq28n8ea24xb9z19b3umnzpr0an6qoqcvx17glpb4vgxfyd7ew2fqjemxo5jrkbi9b6g53prvk45ujyo7z93b30i6di4nft8d6ob1pxthsu8g4i258786lwxdsjph4eb8jxqv68pi7qakse3j27c8455tlh2p7uvb3r1wbfbv6m6419ha94r9egyfgg3s0g5rdqpsx5c5icz9k77c3qvih8vzmxi3jtodhwqfut8zh263l8tjg32insb4lh2h3n6s59skbzix8eebdg45pfdb1lzwqkbfo5yakcuszkg6326smvxe8yz3wl0657dxj0sikqy7uojl119atklk5k5n == \j\v\9\a\q\o\7\4\l\f\6\j\a\8\d\e\2\c\l\q\l\l\8\5\z\y\q\4\q\q\o\u\f\o\k\a\8\8\u\z\7\u\3\w\k\7\w\7\h\6\y\r\y\5\t\5\7\e\o\n\g\t\v\6\e\j\a\b\t\l\v\p\c\z\2\0\e\0\6\c\n\w\7\u\2\a\3\h\n\o\t\n\n\p\i\6\q\d\q\u\w\k\f\6\d\2\4\l\f\5\2\1\l\x\d\4\w\8\o\h\x\z\a\q\0\9\4\a\1\o\v\u\u\e\h\l\l\f\i\y\8\7\4\6\v\0\w\p\g\8\3\8\l\6\a\y\a\y\m\8\v\h\i\z\k\u\4\o\e\p\3\j\q\2\8\n\8\e\a\2\4\x\b\9\z\1\9\b\3\u\m\n\z\p\r\0\a\n\6\q\o\q\c\v\x\1\7\g\l\p\b\4\v\g\x\f\y\d\7\e\w\2\f\q\j\e\m\x\o\5\j\r\k\b\i\9\b\6\g\5\3\p\r\v\k\4\5\u\j\y\o\7\z\9\3\b\3\0\i\6\d\i\4\n\f\t\8\d\6\o\b\1\p\x\t\h\s\u\8\g\4\i\2\5\8\7\8\6\l\w\x\d\s\j\p\h\4\e\b\8\j\x\q\v\6\8\p\i\7\q\a\k\s\e\3\j\2\7\c\8\4\5\5\t\l\h\2\p\7\u\v\b\3\r\1\w\b\f\b\v\6\m\6\4\1\9\h\a\9\4\r\9\e\g\y\f\g\g\3\s\0\g\5\r\d\q\p\s\x\5\c\5\i\c\z\9\k\7\7\c\3\q\v\i\h\8\v\z\m\x\i\3\j\t\o\d\h\w\q\f\u\t\8\z\h\2\6\3\l\8\t\j\g\3\2\i\n\s\b\4\l\h\2\h\3\n\6\s\5\9\s\k\b\z\i\x\8\e\e\b\d\g\4\5\p\f\d\b\1\l\z\w\q\k\b\f\o\5\y\a\k\c\u\s\z\k\g\6\3\2\6\s\m\v\x\e\8\y\z\3\w\l\0\6\5\7\d\x\j\0\s\i\k\q\y\7\u\o\j\l\1\1\9\a\t\k\l\k\5\k\5\n ]] 00:41:54.335 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:41:54.335 12:44:17 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=nonblock 00:41:54.335 [2024-06-07 12:44:17.842763] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:54.336 [2024-06-07 12:44:17.843176] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233799 ] 00:41:54.594 [2024-06-07 12:44:17.994016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:54.594 [2024-06-07 12:44:18.089733] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:55.163  Copying: 512/512 [B] (average 500 kBps) 00:41:55.163 00:41:55.163 12:44:18 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ jv9aqo74lf6ja8de2clqll85zyq4qqoufoka88uz7u3wk7w7h6yry5t57eongtv6ejabtlvpcz20e06cnw7u2a3hnotnnpi6qdquwkf6d24lf521lxd4w8ohxzaq094a1ovuuehllfiy8746v0wpg838l6ayaym8vhizku4oep3jq28n8ea24xb9z19b3umnzpr0an6qoqcvx17glpb4vgxfyd7ew2fqjemxo5jrkbi9b6g53prvk45ujyo7z93b30i6di4nft8d6ob1pxthsu8g4i258786lwxdsjph4eb8jxqv68pi7qakse3j27c8455tlh2p7uvb3r1wbfbv6m6419ha94r9egyfgg3s0g5rdqpsx5c5icz9k77c3qvih8vzmxi3jtodhwqfut8zh263l8tjg32insb4lh2h3n6s59skbzix8eebdg45pfdb1lzwqkbfo5yakcuszkg6326smvxe8yz3wl0657dxj0sikqy7uojl119atklk5k5n == \j\v\9\a\q\o\7\4\l\f\6\j\a\8\d\e\2\c\l\q\l\l\8\5\z\y\q\4\q\q\o\u\f\o\k\a\8\8\u\z\7\u\3\w\k\7\w\7\h\6\y\r\y\5\t\5\7\e\o\n\g\t\v\6\e\j\a\b\t\l\v\p\c\z\2\0\e\0\6\c\n\w\7\u\2\a\3\h\n\o\t\n\n\p\i\6\q\d\q\u\w\k\f\6\d\2\4\l\f\5\2\1\l\x\d\4\w\8\o\h\x\z\a\q\0\9\4\a\1\o\v\u\u\e\h\l\l\f\i\y\8\7\4\6\v\0\w\p\g\8\3\8\l\6\a\y\a\y\m\8\v\h\i\z\k\u\4\o\e\p\3\j\q\2\8\n\8\e\a\2\4\x\b\9\z\1\9\b\3\u\m\n\z\p\r\0\a\n\6\q\o\q\c\v\x\1\7\g\l\p\b\4\v\g\x\f\y\d\7\e\w\2\f\q\j\e\m\x\o\5\j\r\k\b\i\9\b\6\g\5\3\p\r\v\k\4\5\u\j\y\o\7\z\9\3\b\3\0\i\6\d\i\4\n\f\t\8\d\6\o\b\1\p\x\t\h\s\u\8\g\4\i\2\5\8\7\8\6\l\w\x\d\s\j\p\h\4\e\b\8\j\x\q\v\6\8\p\i\7\q\a\k\s\e\3\j\2\7\c\8\4\5\5\t\l\h\2\p\7\u\v\b\3\r\1\w\b\f\b\v\6\m\6\4\1\9\h\a\9\4\r\9\e\g\y\f\g\g\3\s\0\g\5\r\d\q\p\s\x\5\c\5\i\c\z\9\k\7\7\c\3\q\v\i\h\8\v\z\m\x\i\3\j\t\o\d\h\w\q\f\u\t\8\z\h\2\6\3\l\8\t\j\g\3\2\i\n\s\b\4\l\h\2\h\3\n\6\s\5\9\s\k\b\z\i\x\8\e\e\b\d\g\4\5\p\f\d\b\1\l\z\w\q\k\b\f\o\5\y\a\k\c\u\s\z\k\g\6\3\2\6\s\m\v\x\e\8\y\z\3\w\l\0\6\5\7\d\x\j\0\s\i\k\q\y\7\u\o\j\l\1\1\9\a\t\k\l\k\5\k\5\n ]] 00:41:55.163 12:44:18 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:41:55.163 12:44:18 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=sync 00:41:55.163 [2024-06-07 12:44:18.617193] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:55.163 [2024-06-07 12:44:18.617422] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233812 ] 00:41:55.163 [2024-06-07 12:44:18.755303] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:55.421 [2024-06-07 12:44:18.849801] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:55.744  Copying: 512/512 [B] (average 100 kBps) 00:41:55.744 00:41:55.744 12:44:19 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ jv9aqo74lf6ja8de2clqll85zyq4qqoufoka88uz7u3wk7w7h6yry5t57eongtv6ejabtlvpcz20e06cnw7u2a3hnotnnpi6qdquwkf6d24lf521lxd4w8ohxzaq094a1ovuuehllfiy8746v0wpg838l6ayaym8vhizku4oep3jq28n8ea24xb9z19b3umnzpr0an6qoqcvx17glpb4vgxfyd7ew2fqjemxo5jrkbi9b6g53prvk45ujyo7z93b30i6di4nft8d6ob1pxthsu8g4i258786lwxdsjph4eb8jxqv68pi7qakse3j27c8455tlh2p7uvb3r1wbfbv6m6419ha94r9egyfgg3s0g5rdqpsx5c5icz9k77c3qvih8vzmxi3jtodhwqfut8zh263l8tjg32insb4lh2h3n6s59skbzix8eebdg45pfdb1lzwqkbfo5yakcuszkg6326smvxe8yz3wl0657dxj0sikqy7uojl119atklk5k5n == \j\v\9\a\q\o\7\4\l\f\6\j\a\8\d\e\2\c\l\q\l\l\8\5\z\y\q\4\q\q\o\u\f\o\k\a\8\8\u\z\7\u\3\w\k\7\w\7\h\6\y\r\y\5\t\5\7\e\o\n\g\t\v\6\e\j\a\b\t\l\v\p\c\z\2\0\e\0\6\c\n\w\7\u\2\a\3\h\n\o\t\n\n\p\i\6\q\d\q\u\w\k\f\6\d\2\4\l\f\5\2\1\l\x\d\4\w\8\o\h\x\z\a\q\0\9\4\a\1\o\v\u\u\e\h\l\l\f\i\y\8\7\4\6\v\0\w\p\g\8\3\8\l\6\a\y\a\y\m\8\v\h\i\z\k\u\4\o\e\p\3\j\q\2\8\n\8\e\a\2\4\x\b\9\z\1\9\b\3\u\m\n\z\p\r\0\a\n\6\q\o\q\c\v\x\1\7\g\l\p\b\4\v\g\x\f\y\d\7\e\w\2\f\q\j\e\m\x\o\5\j\r\k\b\i\9\b\6\g\5\3\p\r\v\k\4\5\u\j\y\o\7\z\9\3\b\3\0\i\6\d\i\4\n\f\t\8\d\6\o\b\1\p\x\t\h\s\u\8\g\4\i\2\5\8\7\8\6\l\w\x\d\s\j\p\h\4\e\b\8\j\x\q\v\6\8\p\i\7\q\a\k\s\e\3\j\2\7\c\8\4\5\5\t\l\h\2\p\7\u\v\b\3\r\1\w\b\f\b\v\6\m\6\4\1\9\h\a\9\4\r\9\e\g\y\f\g\g\3\s\0\g\5\r\d\q\p\s\x\5\c\5\i\c\z\9\k\7\7\c\3\q\v\i\h\8\v\z\m\x\i\3\j\t\o\d\h\w\q\f\u\t\8\z\h\2\6\3\l\8\t\j\g\3\2\i\n\s\b\4\l\h\2\h\3\n\6\s\5\9\s\k\b\z\i\x\8\e\e\b\d\g\4\5\p\f\d\b\1\l\z\w\q\k\b\f\o\5\y\a\k\c\u\s\z\k\g\6\3\2\6\s\m\v\x\e\8\y\z\3\w\l\0\6\5\7\d\x\j\0\s\i\k\q\y\7\u\o\j\l\1\1\9\a\t\k\l\k\5\k\5\n ]] 00:41:55.744 12:44:19 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:41:55.744 12:44:19 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=dsync 00:41:56.002 [2024-06-07 12:44:19.415283] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:56.002 [2024-06-07 12:44:19.415553] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233821 ] 00:41:56.002 [2024-06-07 12:44:19.560100] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:56.260 [2024-06-07 12:44:19.656354] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:56.518  Copying: 512/512 [B] (average 250 kBps) 00:41:56.518 00:41:56.776 12:44:20 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ jv9aqo74lf6ja8de2clqll85zyq4qqoufoka88uz7u3wk7w7h6yry5t57eongtv6ejabtlvpcz20e06cnw7u2a3hnotnnpi6qdquwkf6d24lf521lxd4w8ohxzaq094a1ovuuehllfiy8746v0wpg838l6ayaym8vhizku4oep3jq28n8ea24xb9z19b3umnzpr0an6qoqcvx17glpb4vgxfyd7ew2fqjemxo5jrkbi9b6g53prvk45ujyo7z93b30i6di4nft8d6ob1pxthsu8g4i258786lwxdsjph4eb8jxqv68pi7qakse3j27c8455tlh2p7uvb3r1wbfbv6m6419ha94r9egyfgg3s0g5rdqpsx5c5icz9k77c3qvih8vzmxi3jtodhwqfut8zh263l8tjg32insb4lh2h3n6s59skbzix8eebdg45pfdb1lzwqkbfo5yakcuszkg6326smvxe8yz3wl0657dxj0sikqy7uojl119atklk5k5n == \j\v\9\a\q\o\7\4\l\f\6\j\a\8\d\e\2\c\l\q\l\l\8\5\z\y\q\4\q\q\o\u\f\o\k\a\8\8\u\z\7\u\3\w\k\7\w\7\h\6\y\r\y\5\t\5\7\e\o\n\g\t\v\6\e\j\a\b\t\l\v\p\c\z\2\0\e\0\6\c\n\w\7\u\2\a\3\h\n\o\t\n\n\p\i\6\q\d\q\u\w\k\f\6\d\2\4\l\f\5\2\1\l\x\d\4\w\8\o\h\x\z\a\q\0\9\4\a\1\o\v\u\u\e\h\l\l\f\i\y\8\7\4\6\v\0\w\p\g\8\3\8\l\6\a\y\a\y\m\8\v\h\i\z\k\u\4\o\e\p\3\j\q\2\8\n\8\e\a\2\4\x\b\9\z\1\9\b\3\u\m\n\z\p\r\0\a\n\6\q\o\q\c\v\x\1\7\g\l\p\b\4\v\g\x\f\y\d\7\e\w\2\f\q\j\e\m\x\o\5\j\r\k\b\i\9\b\6\g\5\3\p\r\v\k\4\5\u\j\y\o\7\z\9\3\b\3\0\i\6\d\i\4\n\f\t\8\d\6\o\b\1\p\x\t\h\s\u\8\g\4\i\2\5\8\7\8\6\l\w\x\d\s\j\p\h\4\e\b\8\j\x\q\v\6\8\p\i\7\q\a\k\s\e\3\j\2\7\c\8\4\5\5\t\l\h\2\p\7\u\v\b\3\r\1\w\b\f\b\v\6\m\6\4\1\9\h\a\9\4\r\9\e\g\y\f\g\g\3\s\0\g\5\r\d\q\p\s\x\5\c\5\i\c\z\9\k\7\7\c\3\q\v\i\h\8\v\z\m\x\i\3\j\t\o\d\h\w\q\f\u\t\8\z\h\2\6\3\l\8\t\j\g\3\2\i\n\s\b\4\l\h\2\h\3\n\6\s\5\9\s\k\b\z\i\x\8\e\e\b\d\g\4\5\p\f\d\b\1\l\z\w\q\k\b\f\o\5\y\a\k\c\u\s\z\k\g\6\3\2\6\s\m\v\x\e\8\y\z\3\w\l\0\6\5\7\d\x\j\0\s\i\k\q\y\7\u\o\j\l\1\1\9\a\t\k\l\k\5\k\5\n ]] 00:41:56.776 12:44:20 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@85 -- # for flag_ro in "${flags_ro[@]}" 00:41:56.776 12:44:20 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@86 -- # gen_bytes 512 00:41:56.776 12:44:20 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/common.sh@98 -- # xtrace_disable 00:41:56.776 12:44:20 spdk_dd.spdk_dd_posix.dd_flags_misc -- common/autotest_common.sh@10 -- # set +x 00:41:56.776 12:44:20 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:41:56.776 12:44:20 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=direct 00:41:56.776 [2024-06-07 12:44:20.222713] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:56.776 [2024-06-07 12:44:20.223090] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233834 ] 00:41:56.776 [2024-06-07 12:44:20.369957] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:57.032 [2024-06-07 12:44:20.469797] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:57.598  Copying: 512/512 [B] (average 500 kBps) 00:41:57.598 00:41:57.598 12:44:21 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ wb244vb41qeawvpp6wpiozsfj538ae3sj526mxzfnp0k9zhdwpt13qf4ghhmfowt6dbmmv5hssvx8djhe1ewvizxjir7h0x4kdbc31wltjm6f8spu1sxghe8vakuoaqg799qfyr4m5q4oxnfvv79t0kjpeyo2fby4qvgq84w32j6mywr06rq0gxofhnsidz6vqevoqhczhhjzpgrhqdeck44t700jnz4975b1dmnn48w1o0acemqg9k71j41cswg51sa05kpglsp43yw46wn9xyicuwvy4cmvxu0rygyctondn6kbtdqioq93802em98q7id9tmc98tipjlium5nc5r1jjvynvtezhbibnwfy2xsrl63fxzkzts09plk8qzekzonjkc4nnjdifnqj3rkhbm6euai65f59875hwt7vpuqf9dhifojtp2nykx7csj0ou6rs2n83c6q01ntf60o3cboyt0244vdp844yeiar3tpldih4y69gk19595rcz7a == \w\b\2\4\4\v\b\4\1\q\e\a\w\v\p\p\6\w\p\i\o\z\s\f\j\5\3\8\a\e\3\s\j\5\2\6\m\x\z\f\n\p\0\k\9\z\h\d\w\p\t\1\3\q\f\4\g\h\h\m\f\o\w\t\6\d\b\m\m\v\5\h\s\s\v\x\8\d\j\h\e\1\e\w\v\i\z\x\j\i\r\7\h\0\x\4\k\d\b\c\3\1\w\l\t\j\m\6\f\8\s\p\u\1\s\x\g\h\e\8\v\a\k\u\o\a\q\g\7\9\9\q\f\y\r\4\m\5\q\4\o\x\n\f\v\v\7\9\t\0\k\j\p\e\y\o\2\f\b\y\4\q\v\g\q\8\4\w\3\2\j\6\m\y\w\r\0\6\r\q\0\g\x\o\f\h\n\s\i\d\z\6\v\q\e\v\o\q\h\c\z\h\h\j\z\p\g\r\h\q\d\e\c\k\4\4\t\7\0\0\j\n\z\4\9\7\5\b\1\d\m\n\n\4\8\w\1\o\0\a\c\e\m\q\g\9\k\7\1\j\4\1\c\s\w\g\5\1\s\a\0\5\k\p\g\l\s\p\4\3\y\w\4\6\w\n\9\x\y\i\c\u\w\v\y\4\c\m\v\x\u\0\r\y\g\y\c\t\o\n\d\n\6\k\b\t\d\q\i\o\q\9\3\8\0\2\e\m\9\8\q\7\i\d\9\t\m\c\9\8\t\i\p\j\l\i\u\m\5\n\c\5\r\1\j\j\v\y\n\v\t\e\z\h\b\i\b\n\w\f\y\2\x\s\r\l\6\3\f\x\z\k\z\t\s\0\9\p\l\k\8\q\z\e\k\z\o\n\j\k\c\4\n\n\j\d\i\f\n\q\j\3\r\k\h\b\m\6\e\u\a\i\6\5\f\5\9\8\7\5\h\w\t\7\v\p\u\q\f\9\d\h\i\f\o\j\t\p\2\n\y\k\x\7\c\s\j\0\o\u\6\r\s\2\n\8\3\c\6\q\0\1\n\t\f\6\0\o\3\c\b\o\y\t\0\2\4\4\v\d\p\8\4\4\y\e\i\a\r\3\t\p\l\d\i\h\4\y\6\9\g\k\1\9\5\9\5\r\c\z\7\a ]] 00:41:57.598 12:44:21 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:41:57.598 12:44:21 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=nonblock 00:41:57.598 [2024-06-07 12:44:21.047672] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:57.598 [2024-06-07 12:44:21.047967] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233851 ] 00:41:57.598 [2024-06-07 12:44:21.198009] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:57.856 [2024-06-07 12:44:21.297387] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:58.421  Copying: 512/512 [B] (average 500 kBps) 00:41:58.421 00:41:58.422 12:44:21 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ wb244vb41qeawvpp6wpiozsfj538ae3sj526mxzfnp0k9zhdwpt13qf4ghhmfowt6dbmmv5hssvx8djhe1ewvizxjir7h0x4kdbc31wltjm6f8spu1sxghe8vakuoaqg799qfyr4m5q4oxnfvv79t0kjpeyo2fby4qvgq84w32j6mywr06rq0gxofhnsidz6vqevoqhczhhjzpgrhqdeck44t700jnz4975b1dmnn48w1o0acemqg9k71j41cswg51sa05kpglsp43yw46wn9xyicuwvy4cmvxu0rygyctondn6kbtdqioq93802em98q7id9tmc98tipjlium5nc5r1jjvynvtezhbibnwfy2xsrl63fxzkzts09plk8qzekzonjkc4nnjdifnqj3rkhbm6euai65f59875hwt7vpuqf9dhifojtp2nykx7csj0ou6rs2n83c6q01ntf60o3cboyt0244vdp844yeiar3tpldih4y69gk19595rcz7a == \w\b\2\4\4\v\b\4\1\q\e\a\w\v\p\p\6\w\p\i\o\z\s\f\j\5\3\8\a\e\3\s\j\5\2\6\m\x\z\f\n\p\0\k\9\z\h\d\w\p\t\1\3\q\f\4\g\h\h\m\f\o\w\t\6\d\b\m\m\v\5\h\s\s\v\x\8\d\j\h\e\1\e\w\v\i\z\x\j\i\r\7\h\0\x\4\k\d\b\c\3\1\w\l\t\j\m\6\f\8\s\p\u\1\s\x\g\h\e\8\v\a\k\u\o\a\q\g\7\9\9\q\f\y\r\4\m\5\q\4\o\x\n\f\v\v\7\9\t\0\k\j\p\e\y\o\2\f\b\y\4\q\v\g\q\8\4\w\3\2\j\6\m\y\w\r\0\6\r\q\0\g\x\o\f\h\n\s\i\d\z\6\v\q\e\v\o\q\h\c\z\h\h\j\z\p\g\r\h\q\d\e\c\k\4\4\t\7\0\0\j\n\z\4\9\7\5\b\1\d\m\n\n\4\8\w\1\o\0\a\c\e\m\q\g\9\k\7\1\j\4\1\c\s\w\g\5\1\s\a\0\5\k\p\g\l\s\p\4\3\y\w\4\6\w\n\9\x\y\i\c\u\w\v\y\4\c\m\v\x\u\0\r\y\g\y\c\t\o\n\d\n\6\k\b\t\d\q\i\o\q\9\3\8\0\2\e\m\9\8\q\7\i\d\9\t\m\c\9\8\t\i\p\j\l\i\u\m\5\n\c\5\r\1\j\j\v\y\n\v\t\e\z\h\b\i\b\n\w\f\y\2\x\s\r\l\6\3\f\x\z\k\z\t\s\0\9\p\l\k\8\q\z\e\k\z\o\n\j\k\c\4\n\n\j\d\i\f\n\q\j\3\r\k\h\b\m\6\e\u\a\i\6\5\f\5\9\8\7\5\h\w\t\7\v\p\u\q\f\9\d\h\i\f\o\j\t\p\2\n\y\k\x\7\c\s\j\0\o\u\6\r\s\2\n\8\3\c\6\q\0\1\n\t\f\6\0\o\3\c\b\o\y\t\0\2\4\4\v\d\p\8\4\4\y\e\i\a\r\3\t\p\l\d\i\h\4\y\6\9\g\k\1\9\5\9\5\r\c\z\7\a ]] 00:41:58.422 12:44:21 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:41:58.422 12:44:21 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=sync 00:41:58.422 [2024-06-07 12:44:21.843858] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:58.422 [2024-06-07 12:44:21.844164] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233863 ] 00:41:58.422 [2024-06-07 12:44:21.993875] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:58.680 [2024-06-07 12:44:22.084301] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:58.938  Copying: 512/512 [B] (average 250 kBps) 00:41:58.938 00:41:58.939 12:44:22 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ wb244vb41qeawvpp6wpiozsfj538ae3sj526mxzfnp0k9zhdwpt13qf4ghhmfowt6dbmmv5hssvx8djhe1ewvizxjir7h0x4kdbc31wltjm6f8spu1sxghe8vakuoaqg799qfyr4m5q4oxnfvv79t0kjpeyo2fby4qvgq84w32j6mywr06rq0gxofhnsidz6vqevoqhczhhjzpgrhqdeck44t700jnz4975b1dmnn48w1o0acemqg9k71j41cswg51sa05kpglsp43yw46wn9xyicuwvy4cmvxu0rygyctondn6kbtdqioq93802em98q7id9tmc98tipjlium5nc5r1jjvynvtezhbibnwfy2xsrl63fxzkzts09plk8qzekzonjkc4nnjdifnqj3rkhbm6euai65f59875hwt7vpuqf9dhifojtp2nykx7csj0ou6rs2n83c6q01ntf60o3cboyt0244vdp844yeiar3tpldih4y69gk19595rcz7a == \w\b\2\4\4\v\b\4\1\q\e\a\w\v\p\p\6\w\p\i\o\z\s\f\j\5\3\8\a\e\3\s\j\5\2\6\m\x\z\f\n\p\0\k\9\z\h\d\w\p\t\1\3\q\f\4\g\h\h\m\f\o\w\t\6\d\b\m\m\v\5\h\s\s\v\x\8\d\j\h\e\1\e\w\v\i\z\x\j\i\r\7\h\0\x\4\k\d\b\c\3\1\w\l\t\j\m\6\f\8\s\p\u\1\s\x\g\h\e\8\v\a\k\u\o\a\q\g\7\9\9\q\f\y\r\4\m\5\q\4\o\x\n\f\v\v\7\9\t\0\k\j\p\e\y\o\2\f\b\y\4\q\v\g\q\8\4\w\3\2\j\6\m\y\w\r\0\6\r\q\0\g\x\o\f\h\n\s\i\d\z\6\v\q\e\v\o\q\h\c\z\h\h\j\z\p\g\r\h\q\d\e\c\k\4\4\t\7\0\0\j\n\z\4\9\7\5\b\1\d\m\n\n\4\8\w\1\o\0\a\c\e\m\q\g\9\k\7\1\j\4\1\c\s\w\g\5\1\s\a\0\5\k\p\g\l\s\p\4\3\y\w\4\6\w\n\9\x\y\i\c\u\w\v\y\4\c\m\v\x\u\0\r\y\g\y\c\t\o\n\d\n\6\k\b\t\d\q\i\o\q\9\3\8\0\2\e\m\9\8\q\7\i\d\9\t\m\c\9\8\t\i\p\j\l\i\u\m\5\n\c\5\r\1\j\j\v\y\n\v\t\e\z\h\b\i\b\n\w\f\y\2\x\s\r\l\6\3\f\x\z\k\z\t\s\0\9\p\l\k\8\q\z\e\k\z\o\n\j\k\c\4\n\n\j\d\i\f\n\q\j\3\r\k\h\b\m\6\e\u\a\i\6\5\f\5\9\8\7\5\h\w\t\7\v\p\u\q\f\9\d\h\i\f\o\j\t\p\2\n\y\k\x\7\c\s\j\0\o\u\6\r\s\2\n\8\3\c\6\q\0\1\n\t\f\6\0\o\3\c\b\o\y\t\0\2\4\4\v\d\p\8\4\4\y\e\i\a\r\3\t\p\l\d\i\h\4\y\6\9\g\k\1\9\5\9\5\r\c\z\7\a ]] 00:41:58.939 12:44:22 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:41:58.939 12:44:22 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=dsync 00:41:59.197 [2024-06-07 12:44:22.605087] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:41:59.197 [2024-06-07 12:44:22.605748] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233873 ] 00:41:59.197 [2024-06-07 12:44:22.740856] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:59.197 [2024-06-07 12:44:22.830024] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:41:59.713  Copying: 512/512 [B] (average 250 kBps) 00:41:59.713 00:41:59.713 12:44:23 spdk_dd.spdk_dd_posix.dd_flags_misc -- dd/posix.sh@93 -- # [[ wb244vb41qeawvpp6wpiozsfj538ae3sj526mxzfnp0k9zhdwpt13qf4ghhmfowt6dbmmv5hssvx8djhe1ewvizxjir7h0x4kdbc31wltjm6f8spu1sxghe8vakuoaqg799qfyr4m5q4oxnfvv79t0kjpeyo2fby4qvgq84w32j6mywr06rq0gxofhnsidz6vqevoqhczhhjzpgrhqdeck44t700jnz4975b1dmnn48w1o0acemqg9k71j41cswg51sa05kpglsp43yw46wn9xyicuwvy4cmvxu0rygyctondn6kbtdqioq93802em98q7id9tmc98tipjlium5nc5r1jjvynvtezhbibnwfy2xsrl63fxzkzts09plk8qzekzonjkc4nnjdifnqj3rkhbm6euai65f59875hwt7vpuqf9dhifojtp2nykx7csj0ou6rs2n83c6q01ntf60o3cboyt0244vdp844yeiar3tpldih4y69gk19595rcz7a == \w\b\2\4\4\v\b\4\1\q\e\a\w\v\p\p\6\w\p\i\o\z\s\f\j\5\3\8\a\e\3\s\j\5\2\6\m\x\z\f\n\p\0\k\9\z\h\d\w\p\t\1\3\q\f\4\g\h\h\m\f\o\w\t\6\d\b\m\m\v\5\h\s\s\v\x\8\d\j\h\e\1\e\w\v\i\z\x\j\i\r\7\h\0\x\4\k\d\b\c\3\1\w\l\t\j\m\6\f\8\s\p\u\1\s\x\g\h\e\8\v\a\k\u\o\a\q\g\7\9\9\q\f\y\r\4\m\5\q\4\o\x\n\f\v\v\7\9\t\0\k\j\p\e\y\o\2\f\b\y\4\q\v\g\q\8\4\w\3\2\j\6\m\y\w\r\0\6\r\q\0\g\x\o\f\h\n\s\i\d\z\6\v\q\e\v\o\q\h\c\z\h\h\j\z\p\g\r\h\q\d\e\c\k\4\4\t\7\0\0\j\n\z\4\9\7\5\b\1\d\m\n\n\4\8\w\1\o\0\a\c\e\m\q\g\9\k\7\1\j\4\1\c\s\w\g\5\1\s\a\0\5\k\p\g\l\s\p\4\3\y\w\4\6\w\n\9\x\y\i\c\u\w\v\y\4\c\m\v\x\u\0\r\y\g\y\c\t\o\n\d\n\6\k\b\t\d\q\i\o\q\9\3\8\0\2\e\m\9\8\q\7\i\d\9\t\m\c\9\8\t\i\p\j\l\i\u\m\5\n\c\5\r\1\j\j\v\y\n\v\t\e\z\h\b\i\b\n\w\f\y\2\x\s\r\l\6\3\f\x\z\k\z\t\s\0\9\p\l\k\8\q\z\e\k\z\o\n\j\k\c\4\n\n\j\d\i\f\n\q\j\3\r\k\h\b\m\6\e\u\a\i\6\5\f\5\9\8\7\5\h\w\t\7\v\p\u\q\f\9\d\h\i\f\o\j\t\p\2\n\y\k\x\7\c\s\j\0\o\u\6\r\s\2\n\8\3\c\6\q\0\1\n\t\f\6\0\o\3\c\b\o\y\t\0\2\4\4\v\d\p\8\4\4\y\e\i\a\r\3\t\p\l\d\i\h\4\y\6\9\g\k\1\9\5\9\5\r\c\z\7\a ]] 00:41:59.713 ************************************ 00:41:59.713 00:41:59.713 real 0m6.313s 00:41:59.713 user 0m3.314s 00:41:59.713 sys 0m2.006s 00:41:59.713 12:44:23 spdk_dd.spdk_dd_posix.dd_flags_misc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:41:59.713 12:44:23 spdk_dd.spdk_dd_posix.dd_flags_misc -- common/autotest_common.sh@10 -- # set +x 00:41:59.713 END TEST dd_flags_misc 00:41:59.713 ************************************ 00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix -- dd/posix.sh@131 -- # tests_forced_aio 00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix -- dd/posix.sh@110 -- # printf '* Second test run%s\n' ', using AIO' 00:41:59.971 * Second test run, using AIO 00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix -- dd/posix.sh@113 -- # DD_APP+=("--aio") 00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix -- dd/posix.sh@114 -- # run_test dd_flag_append_forced_aio append 00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable 00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:41:59.971 ************************************ 00:41:59.971 START TEST dd_flag_append_forced_aio 00:41:59.971 ************************************ 00:41:59.971 12:44:23 
spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- common/autotest_common.sh@1124 -- # append
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@16 -- # local dump0
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@17 -- # local dump1
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@19 -- # gen_bytes 32
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/common.sh@98 -- # xtrace_disable
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- common/autotest_common.sh@10 -- # set +x
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@19 -- # dump0=cclo6me9wxm133ujmdn8zopgqqcchss4
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@20 -- # gen_bytes 32
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/common.sh@98 -- # xtrace_disable
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- common/autotest_common.sh@10 -- # set +x
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@20 -- # dump1=akqzl1c5c4xy3nvy5c2os1h5qb15qfor
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@22 -- # printf %s cclo6me9wxm133ujmdn8zopgqqcchss4
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@23 -- # printf %s akqzl1c5c4xy3nvy5c2os1h5qb15qfor
00:41:59.971 12:44:23 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@25 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=append
00:41:59.971 [2024-06-07 12:44:23.440684] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization...
00:41:59.971 [2024-06-07 12:44:23.440978] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233912 ]
[2024-06-07 12:44:23.590500] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:42:00.246 [2024-06-07 12:44:23.685690] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:42:00.836  Copying: 32/32 [B] (average 31 kBps)
00:42:00.836
00:42:00.836 12:44:24 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- dd/posix.sh@27 -- # [[ akqzl1c5c4xy3nvy5c2os1h5qb15qforcclo6me9wxm133ujmdn8zopgqqcchss4 == \a\k\q\z\l\1\c\5\c\4\x\y\3\n\v\y\5\c\2\o\s\1\h\5\q\b\1\5\q\f\o\r\c\c\l\o\6\m\e\9\w\x\m\1\3\3\u\j\m\d\n\8\z\o\p\g\q\q\c\c\h\s\s\4 ]]
00:42:00.836
00:42:00.837 real 0m0.810s
00:42:00.837 user 0m0.436s
00:42:00.837 sys 0m0.254s
00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- common/autotest_common.sh@1125 -- # xtrace_disable
00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix.dd_flag_append_forced_aio -- common/autotest_common.sh@10 -- # set +x
00:42:00.837 ************************************
00:42:00.837 END TEST dd_flag_append_forced_aio
00:42:00.837 ************************************
00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix -- dd/posix.sh@115 -- # run_test dd_flag_directory_forced_aio directory
00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable
00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x
00:42:00.837 ************************************
00:42:00.837 START TEST dd_flag_directory_forced_aio
00:42:00.837 ************************************
00:42:00.837 12:44:24
spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:00.837 12:44:24 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=directory --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:42:00.837 [2024-06-07 12:44:24.304668] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:00.837 [2024-06-07 12:44:24.305553] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233947 ] 00:42:00.837 [2024-06-07 12:44:24.452074] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:01.095 [2024-06-07 12:44:24.565970] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:01.095 [2024-06-07 12:44:24.714219] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:42:01.095 [2024-06-07 12:44:24.714371] spdk_dd.c:1083:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:42:01.095 [2024-06-07 12:44:24.714433] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:01.354 [2024-06-07 12:44:24.909944] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@652 -- # es=236 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@661 -- # es=108 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@662 -- # case "$es" in 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@669 -- # es=1 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- dd/posix.sh@32 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=directory 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@649 -- # local es=0 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=directory 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:01.613 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=directory 00:42:01.614 [2024-06-07 12:44:25.118956] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:01.614 [2024-06-07 12:44:25.119217] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233968 ] 00:42:01.872 [2024-06-07 12:44:25.261339] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:01.872 [2024-06-07 12:44:25.359822] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:01.872 [2024-06-07 12:44:25.488249] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:42:01.872 [2024-06-07 12:44:25.488388] spdk_dd.c:1132:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0: Not a directory 00:42:01.872 [2024-06-07 12:44:25.488445] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:02.131 [2024-06-07 12:44:25.692455] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@652 -- # es=236 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@661 -- # es=108 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@662 -- # case "$es" in 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@669 -- # es=1 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:02.389 00:42:02.389 real 0m1.596s 00:42:02.389 user 0m0.854s 00:42:02.389 sys 0m0.529s 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_directory_forced_aio -- common/autotest_common.sh@10 -- # set +x 00:42:02.389 ************************************ 00:42:02.389 END 
TEST dd_flag_directory_forced_aio 00:42:02.389 ************************************ 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix -- dd/posix.sh@116 -- # run_test dd_flag_nofollow_forced_aio nofollow 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:42:02.389 ************************************ 00:42:02.389 START TEST dd_flag_nofollow_forced_aio 00:42:02.389 ************************************ 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@1124 -- # nofollow 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@36 -- # local test_file0_link=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@37 -- # local test_file1_link=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@39 -- # ln -fs /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@40 -- # ln -fs /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@42 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --iflag=nofollow --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@649 -- # local es=0 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --iflag=nofollow --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:02.389 12:44:25 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@652 -- # 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --iflag=nofollow --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:42:02.390 [2024-06-07 12:44:25.969039] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:02.390 [2024-06-07 12:44:25.969386] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234001 ] 00:42:02.649 [2024-06-07 12:44:26.115625] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:02.649 [2024-06-07 12:44:26.226986] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:02.907 [2024-06-07 12:44:26.348711] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link: Too many levels of symbolic links 00:42:02.907 [2024-06-07 12:44:26.348831] spdk_dd.c:1083:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link: Too many levels of symbolic links 00:42:02.907 [2024-06-07 12:44:26.348884] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:02.907 [2024-06-07 12:44:26.541611] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@652 -- # es=216 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@661 -- # es=88 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@662 -- # case "$es" in 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@669 -- # es=1 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@43 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link --oflag=nofollow 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@649 -- # local es=0 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link --oflag=nofollow 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 
00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:03.165 12:44:26 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link --oflag=nofollow 00:42:03.165 [2024-06-07 12:44:26.750291] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:03.165 [2024-06-07 12:44:26.750607] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234013 ] 00:42:03.423 [2024-06-07 12:44:26.897267] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:03.423 [2024-06-07 12:44:26.990566] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:03.681 [2024-06-07 12:44:27.136535] spdk_dd.c: 894:dd_open_file: *ERROR*: Could not open file /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link: Too many levels of symbolic links 00:42:03.681 [2024-06-07 12:44:27.136707] spdk_dd.c:1132:dd_run: *ERROR*: /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link: Too many levels of symbolic links 00:42:03.681 [2024-06-07 12:44:27.136797] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:03.938 [2024-06-07 12:44:27.345173] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@652 -- # es=216 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@661 -- # es=88 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@662 -- # case "$es" in 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@669 -- # es=1 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@46 -- # gen_bytes 512 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/common.sh@98 -- # xtrace_disable 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@10 -- # set +x 00:42:03.938 12:44:27 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@48 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:42:03.938 [2024-06-07 12:44:27.556334] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:03.938 [2024-06-07 12:44:27.556634] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234029 ] 00:42:04.195 [2024-06-07 12:44:27.705015] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:04.195 [2024-06-07 12:44:27.802899] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:04.712  Copying: 512/512 [B] (average 500 kBps) 00:42:04.712 00:42:04.712 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- dd/posix.sh@49 -- # [[ 6lwwyybrczth9jfkq80x7a3nc5bsp1cwmabtwehp5745dlwp2r6anbrvyt7oditeshbq6b4biihe91mj7zg7vb5o0opsepof67kzxbegm60ctdihppmwnb8ecfxb0crxm5ggqo7t53leri3ennex2l3kgteoahtbbur6zdyp43ym5ukmwt4kfzxcoms27h70j51svaehy5yf8o42h4r66led6y5nab81qbvcgk8c4eznbyg81g83trda4z3fgu6uzeyest73qe49py7m4wmp0rlzjhi015vja1sj84wc27a3lpfy19af9jzgqg985wbxm0oasb2co6yw0911g5ppt653cb6s7zxkq9y5yuuibd8miytbetr0f5e2xtfw6b8ns5k19kfodk4mmobby245bgws2lk6js3n2tue29q31pkr6ysr2j215o6yfmsu392woqq9sldf6qmx6mdg1bp8s09jdr3bou13f4v3j85ked9s82g2yrwnmg9xj9r84nny == \6\l\w\w\y\y\b\r\c\z\t\h\9\j\f\k\q\8\0\x\7\a\3\n\c\5\b\s\p\1\c\w\m\a\b\t\w\e\h\p\5\7\4\5\d\l\w\p\2\r\6\a\n\b\r\v\y\t\7\o\d\i\t\e\s\h\b\q\6\b\4\b\i\i\h\e\9\1\m\j\7\z\g\7\v\b\5\o\0\o\p\s\e\p\o\f\6\7\k\z\x\b\e\g\m\6\0\c\t\d\i\h\p\p\m\w\n\b\8\e\c\f\x\b\0\c\r\x\m\5\g\g\q\o\7\t\5\3\l\e\r\i\3\e\n\n\e\x\2\l\3\k\g\t\e\o\a\h\t\b\b\u\r\6\z\d\y\p\4\3\y\m\5\u\k\m\w\t\4\k\f\z\x\c\o\m\s\2\7\h\7\0\j\5\1\s\v\a\e\h\y\5\y\f\8\o\4\2\h\4\r\6\6\l\e\d\6\y\5\n\a\b\8\1\q\b\v\c\g\k\8\c\4\e\z\n\b\y\g\8\1\g\8\3\t\r\d\a\4\z\3\f\g\u\6\u\z\e\y\e\s\t\7\3\q\e\4\9\p\y\7\m\4\w\m\p\0\r\l\z\j\h\i\0\1\5\v\j\a\1\s\j\8\4\w\c\2\7\a\3\l\p\f\y\1\9\a\f\9\j\z\g\q\g\9\8\5\w\b\x\m\0\o\a\s\b\2\c\o\6\y\w\0\9\1\1\g\5\p\p\t\6\5\3\c\b\6\s\7\z\x\k\q\9\y\5\y\u\u\i\b\d\8\m\i\y\t\b\e\t\r\0\f\5\e\2\x\t\f\w\6\b\8\n\s\5\k\1\9\k\f\o\d\k\4\m\m\o\b\b\y\2\4\5\b\g\w\s\2\l\k\6\j\s\3\n\2\t\u\e\2\9\q\3\1\p\k\r\6\y\s\r\2\j\2\1\5\o\6\y\f\m\s\u\3\9\2\w\o\q\q\9\s\l\d\f\6\q\m\x\6\m\d\g\1\b\p\8\s\0\9\j\d\r\3\b\o\u\1\3\f\4\v\3\j\8\5\k\e\d\9\s\8\2\g\2\y\r\w\n\m\g\9\x\j\9\r\8\4\n\n\y ]] 00:42:04.712 00:42:04.712 real 0m2.383s 00:42:04.712 user 0m1.243s 00:42:04.712 sys 0m0.805s 00:42:04.712 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:04.712 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_nofollow_forced_aio -- common/autotest_common.sh@10 -- # set +x 00:42:04.712 ************************************ 00:42:04.712 END TEST dd_flag_nofollow_forced_aio 00:42:04.712 ************************************ 00:42:04.712 12:44:28 spdk_dd.spdk_dd_posix -- dd/posix.sh@117 -- # run_test dd_flag_noatime_forced_aio noatime 00:42:04.712 12:44:28 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:04.712 12:44:28 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:04.712 12:44:28 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:42:04.971 ************************************ 00:42:04.971 START TEST dd_flag_noatime_forced_aio 00:42:04.971 ************************************ 00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- common/autotest_common.sh@1124 -- # noatime 00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@53 -- # local atime_if 00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- 
dd/posix.sh@54 -- # local atime_of
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@58 -- # gen_bytes 512
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/common.sh@98 -- # xtrace_disable
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- common/autotest_common.sh@10 -- # set +x
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@60 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@60 -- # atime_if=1717764267
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@61 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@61 -- # atime_of=1717764268
00:42:04.971 12:44:28 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@66 -- # sleep 1
00:42:05.906 12:44:29 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@68 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=noatime --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:42:05.907 [2024-06-07 12:44:29.438212] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization...
[2024-06-07 12:44:29.438533] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234082 ]
[2024-06-07 12:44:29.582086] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-06-07 12:44:29.666857] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:42:06.730  Copying: 512/512 [B] (average 500 kBps)
00:42:06.730
00:42:06.731 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@69 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0
00:42:06.731 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@69 -- # (( atime_if == 1717764267 ))
00:42:06.731 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@70 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:42:06.731 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@70 -- # (( atime_of == 1717764268 ))
00:42:06.731 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@72 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1
00:42:06.731 [2024-06-07 12:44:30.201492] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization...
00:42:06.731 [2024-06-07 12:44:30.202191] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234097 ]
[2024-06-07 12:44:30.346022] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:42:06.988 [2024-06-07 12:44:30.443490] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:42:07.593  Copying: 512/512 [B] (average 500 kBps)
00:42:07.593
00:42:07.593 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@73 -- # stat --printf=%X /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0
00:42:07.593 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- dd/posix.sh@73 -- # (( atime_if < 1717764270 ))
00:42:07.593
00:42:07.593 real 0m1.593s
00:42:07.593 user 0m0.841s
00:42:07.593 sys 0m0.511s
00:42:07.593 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- common/autotest_common.sh@1125 -- # xtrace_disable
00:42:07.593 12:44:30 spdk_dd.spdk_dd_posix.dd_flag_noatime_forced_aio -- common/autotest_common.sh@10 -- # set +x
00:42:07.593 ************************************
00:42:07.594 END TEST dd_flag_noatime_forced_aio
00:42:07.594 ************************************
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix -- dd/posix.sh@118 -- # run_test dd_flags_misc_forced_aio io
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1106 -- # xtrace_disable
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x
00:42:07.594 ************************************
00:42:07.594 START TEST dd_flags_misc_forced_aio
00:42:07.594 ************************************
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- common/autotest_common.sh@1124 -- # io
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@77 -- # local flags_ro flags_rw flag_ro flag_rw
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@81 -- # flags_ro=(direct nonblock)
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@82 -- # flags_rw=("${flags_ro[@]}" sync dsync)
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@85 -- # for flag_ro in "${flags_ro[@]}"
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@86 -- # gen_bytes 512
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/common.sh@98 -- # xtrace_disable
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- common/autotest_common.sh@10 -- # set +x
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}"
00:42:07.594 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=direct
00:42:07.594 [2024-06-07 12:44:31.076428] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization...
00:42:07.594 [2024-06-07 12:44:31.076796] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234132 ] 00:42:07.594 [2024-06-07 12:44:31.235131] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:07.852 [2024-06-07 12:44:31.334123] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:08.418  Copying: 512/512 [B] (average 500 kBps) 00:42:08.418 00:42:08.418 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 274699refzvgnwp4z58f5a4mfs9t6msy7qho120fqos5wncoqk42cia4bc6ao97ht78ukw56hddketehmy64cdtgdq4qyx53m15wrydwidxp2jfngvd6xzvrh9okxq3eom3mydt6y134oxtdzzqfuk146vq47lqb8srfhu4fv3u29eh242o4k1tumt4zey7e9km94bvovyv724w5s4gue147se7kcpdslhonc3klsv52n131eyl0k726887wq7iorek976qkqhsro2b45ckc4acd8a4uk08gny0jujn9rxw09jhhvq9hbh30j34j37kfmssybb1va1wef92x61r39rif7orex43p49cvzcpgwza2tpeu3mththrnpddjzupsg1mlzyq3mrgq76yigaxzx8tzhs3ndywu7r5c318kybwnvyjdpwq6nyxutvpy0fy0uscc35x5ounzorae7st399tyki85pym9ko7ydf47a2u4igde202f1q4944m7c641 == \2\7\4\6\9\9\r\e\f\z\v\g\n\w\p\4\z\5\8\f\5\a\4\m\f\s\9\t\6\m\s\y\7\q\h\o\1\2\0\f\q\o\s\5\w\n\c\o\q\k\4\2\c\i\a\4\b\c\6\a\o\9\7\h\t\7\8\u\k\w\5\6\h\d\d\k\e\t\e\h\m\y\6\4\c\d\t\g\d\q\4\q\y\x\5\3\m\1\5\w\r\y\d\w\i\d\x\p\2\j\f\n\g\v\d\6\x\z\v\r\h\9\o\k\x\q\3\e\o\m\3\m\y\d\t\6\y\1\3\4\o\x\t\d\z\z\q\f\u\k\1\4\6\v\q\4\7\l\q\b\8\s\r\f\h\u\4\f\v\3\u\2\9\e\h\2\4\2\o\4\k\1\t\u\m\t\4\z\e\y\7\e\9\k\m\9\4\b\v\o\v\y\v\7\2\4\w\5\s\4\g\u\e\1\4\7\s\e\7\k\c\p\d\s\l\h\o\n\c\3\k\l\s\v\5\2\n\1\3\1\e\y\l\0\k\7\2\6\8\8\7\w\q\7\i\o\r\e\k\9\7\6\q\k\q\h\s\r\o\2\b\4\5\c\k\c\4\a\c\d\8\a\4\u\k\0\8\g\n\y\0\j\u\j\n\9\r\x\w\0\9\j\h\h\v\q\9\h\b\h\3\0\j\3\4\j\3\7\k\f\m\s\s\y\b\b\1\v\a\1\w\e\f\9\2\x\6\1\r\3\9\r\i\f\7\o\r\e\x\4\3\p\4\9\c\v\z\c\p\g\w\z\a\2\t\p\e\u\3\m\t\h\t\h\r\n\p\d\d\j\z\u\p\s\g\1\m\l\z\y\q\3\m\r\g\q\7\6\y\i\g\a\x\z\x\8\t\z\h\s\3\n\d\y\w\u\7\r\5\c\3\1\8\k\y\b\w\n\v\y\j\d\p\w\q\6\n\y\x\u\t\v\p\y\0\f\y\0\u\s\c\c\3\5\x\5\o\u\n\z\o\r\a\e\7\s\t\3\9\9\t\y\k\i\8\5\p\y\m\9\k\o\7\y\d\f\4\7\a\2\u\4\i\g\d\e\2\0\2\f\1\q\4\9\4\4\m\7\c\6\4\1 ]] 00:42:08.418 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:42:08.418 12:44:31 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=nonblock 00:42:08.418 [2024-06-07 12:44:31.879388] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:08.418 [2024-06-07 12:44:31.879687] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234143 ] 00:42:08.418 [2024-06-07 12:44:32.018474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:08.676 [2024-06-07 12:44:32.113500] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:09.242  Copying: 512/512 [B] (average 500 kBps) 00:42:09.242 00:42:09.242 12:44:32 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 274699refzvgnwp4z58f5a4mfs9t6msy7qho120fqos5wncoqk42cia4bc6ao97ht78ukw56hddketehmy64cdtgdq4qyx53m15wrydwidxp2jfngvd6xzvrh9okxq3eom3mydt6y134oxtdzzqfuk146vq47lqb8srfhu4fv3u29eh242o4k1tumt4zey7e9km94bvovyv724w5s4gue147se7kcpdslhonc3klsv52n131eyl0k726887wq7iorek976qkqhsro2b45ckc4acd8a4uk08gny0jujn9rxw09jhhvq9hbh30j34j37kfmssybb1va1wef92x61r39rif7orex43p49cvzcpgwza2tpeu3mththrnpddjzupsg1mlzyq3mrgq76yigaxzx8tzhs3ndywu7r5c318kybwnvyjdpwq6nyxutvpy0fy0uscc35x5ounzorae7st399tyki85pym9ko7ydf47a2u4igde202f1q4944m7c641 == \2\7\4\6\9\9\r\e\f\z\v\g\n\w\p\4\z\5\8\f\5\a\4\m\f\s\9\t\6\m\s\y\7\q\h\o\1\2\0\f\q\o\s\5\w\n\c\o\q\k\4\2\c\i\a\4\b\c\6\a\o\9\7\h\t\7\8\u\k\w\5\6\h\d\d\k\e\t\e\h\m\y\6\4\c\d\t\g\d\q\4\q\y\x\5\3\m\1\5\w\r\y\d\w\i\d\x\p\2\j\f\n\g\v\d\6\x\z\v\r\h\9\o\k\x\q\3\e\o\m\3\m\y\d\t\6\y\1\3\4\o\x\t\d\z\z\q\f\u\k\1\4\6\v\q\4\7\l\q\b\8\s\r\f\h\u\4\f\v\3\u\2\9\e\h\2\4\2\o\4\k\1\t\u\m\t\4\z\e\y\7\e\9\k\m\9\4\b\v\o\v\y\v\7\2\4\w\5\s\4\g\u\e\1\4\7\s\e\7\k\c\p\d\s\l\h\o\n\c\3\k\l\s\v\5\2\n\1\3\1\e\y\l\0\k\7\2\6\8\8\7\w\q\7\i\o\r\e\k\9\7\6\q\k\q\h\s\r\o\2\b\4\5\c\k\c\4\a\c\d\8\a\4\u\k\0\8\g\n\y\0\j\u\j\n\9\r\x\w\0\9\j\h\h\v\q\9\h\b\h\3\0\j\3\4\j\3\7\k\f\m\s\s\y\b\b\1\v\a\1\w\e\f\9\2\x\6\1\r\3\9\r\i\f\7\o\r\e\x\4\3\p\4\9\c\v\z\c\p\g\w\z\a\2\t\p\e\u\3\m\t\h\t\h\r\n\p\d\d\j\z\u\p\s\g\1\m\l\z\y\q\3\m\r\g\q\7\6\y\i\g\a\x\z\x\8\t\z\h\s\3\n\d\y\w\u\7\r\5\c\3\1\8\k\y\b\w\n\v\y\j\d\p\w\q\6\n\y\x\u\t\v\p\y\0\f\y\0\u\s\c\c\3\5\x\5\o\u\n\z\o\r\a\e\7\s\t\3\9\9\t\y\k\i\8\5\p\y\m\9\k\o\7\y\d\f\4\7\a\2\u\4\i\g\d\e\2\0\2\f\1\q\4\9\4\4\m\7\c\6\4\1 ]] 00:42:09.242 12:44:32 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:42:09.242 12:44:32 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=sync 00:42:09.242 [2024-06-07 12:44:32.659493] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:09.242 [2024-06-07 12:44:32.659879] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234158 ] 00:42:09.242 [2024-06-07 12:44:32.805546] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:09.501 [2024-06-07 12:44:32.898914] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:09.759  Copying: 512/512 [B] (average 100 kBps) 00:42:09.759 00:42:10.018 12:44:33 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 274699refzvgnwp4z58f5a4mfs9t6msy7qho120fqos5wncoqk42cia4bc6ao97ht78ukw56hddketehmy64cdtgdq4qyx53m15wrydwidxp2jfngvd6xzvrh9okxq3eom3mydt6y134oxtdzzqfuk146vq47lqb8srfhu4fv3u29eh242o4k1tumt4zey7e9km94bvovyv724w5s4gue147se7kcpdslhonc3klsv52n131eyl0k726887wq7iorek976qkqhsro2b45ckc4acd8a4uk08gny0jujn9rxw09jhhvq9hbh30j34j37kfmssybb1va1wef92x61r39rif7orex43p49cvzcpgwza2tpeu3mththrnpddjzupsg1mlzyq3mrgq76yigaxzx8tzhs3ndywu7r5c318kybwnvyjdpwq6nyxutvpy0fy0uscc35x5ounzorae7st399tyki85pym9ko7ydf47a2u4igde202f1q4944m7c641 == \2\7\4\6\9\9\r\e\f\z\v\g\n\w\p\4\z\5\8\f\5\a\4\m\f\s\9\t\6\m\s\y\7\q\h\o\1\2\0\f\q\o\s\5\w\n\c\o\q\k\4\2\c\i\a\4\b\c\6\a\o\9\7\h\t\7\8\u\k\w\5\6\h\d\d\k\e\t\e\h\m\y\6\4\c\d\t\g\d\q\4\q\y\x\5\3\m\1\5\w\r\y\d\w\i\d\x\p\2\j\f\n\g\v\d\6\x\z\v\r\h\9\o\k\x\q\3\e\o\m\3\m\y\d\t\6\y\1\3\4\o\x\t\d\z\z\q\f\u\k\1\4\6\v\q\4\7\l\q\b\8\s\r\f\h\u\4\f\v\3\u\2\9\e\h\2\4\2\o\4\k\1\t\u\m\t\4\z\e\y\7\e\9\k\m\9\4\b\v\o\v\y\v\7\2\4\w\5\s\4\g\u\e\1\4\7\s\e\7\k\c\p\d\s\l\h\o\n\c\3\k\l\s\v\5\2\n\1\3\1\e\y\l\0\k\7\2\6\8\8\7\w\q\7\i\o\r\e\k\9\7\6\q\k\q\h\s\r\o\2\b\4\5\c\k\c\4\a\c\d\8\a\4\u\k\0\8\g\n\y\0\j\u\j\n\9\r\x\w\0\9\j\h\h\v\q\9\h\b\h\3\0\j\3\4\j\3\7\k\f\m\s\s\y\b\b\1\v\a\1\w\e\f\9\2\x\6\1\r\3\9\r\i\f\7\o\r\e\x\4\3\p\4\9\c\v\z\c\p\g\w\z\a\2\t\p\e\u\3\m\t\h\t\h\r\n\p\d\d\j\z\u\p\s\g\1\m\l\z\y\q\3\m\r\g\q\7\6\y\i\g\a\x\z\x\8\t\z\h\s\3\n\d\y\w\u\7\r\5\c\3\1\8\k\y\b\w\n\v\y\j\d\p\w\q\6\n\y\x\u\t\v\p\y\0\f\y\0\u\s\c\c\3\5\x\5\o\u\n\z\o\r\a\e\7\s\t\3\9\9\t\y\k\i\8\5\p\y\m\9\k\o\7\y\d\f\4\7\a\2\u\4\i\g\d\e\2\0\2\f\1\q\4\9\4\4\m\7\c\6\4\1 ]] 00:42:10.018 12:44:33 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:42:10.018 12:44:33 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=direct --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=dsync 00:42:10.018 [2024-06-07 12:44:33.438042] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:10.018 [2024-06-07 12:44:33.438335] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234175 ] 00:42:10.018 [2024-06-07 12:44:33.578105] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:10.276 [2024-06-07 12:44:33.673295] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:10.842  Copying: 512/512 [B] (average 250 kBps) 00:42:10.842 00:42:10.842 12:44:34 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 274699refzvgnwp4z58f5a4mfs9t6msy7qho120fqos5wncoqk42cia4bc6ao97ht78ukw56hddketehmy64cdtgdq4qyx53m15wrydwidxp2jfngvd6xzvrh9okxq3eom3mydt6y134oxtdzzqfuk146vq47lqb8srfhu4fv3u29eh242o4k1tumt4zey7e9km94bvovyv724w5s4gue147se7kcpdslhonc3klsv52n131eyl0k726887wq7iorek976qkqhsro2b45ckc4acd8a4uk08gny0jujn9rxw09jhhvq9hbh30j34j37kfmssybb1va1wef92x61r39rif7orex43p49cvzcpgwza2tpeu3mththrnpddjzupsg1mlzyq3mrgq76yigaxzx8tzhs3ndywu7r5c318kybwnvyjdpwq6nyxutvpy0fy0uscc35x5ounzorae7st399tyki85pym9ko7ydf47a2u4igde202f1q4944m7c641 == \2\7\4\6\9\9\r\e\f\z\v\g\n\w\p\4\z\5\8\f\5\a\4\m\f\s\9\t\6\m\s\y\7\q\h\o\1\2\0\f\q\o\s\5\w\n\c\o\q\k\4\2\c\i\a\4\b\c\6\a\o\9\7\h\t\7\8\u\k\w\5\6\h\d\d\k\e\t\e\h\m\y\6\4\c\d\t\g\d\q\4\q\y\x\5\3\m\1\5\w\r\y\d\w\i\d\x\p\2\j\f\n\g\v\d\6\x\z\v\r\h\9\o\k\x\q\3\e\o\m\3\m\y\d\t\6\y\1\3\4\o\x\t\d\z\z\q\f\u\k\1\4\6\v\q\4\7\l\q\b\8\s\r\f\h\u\4\f\v\3\u\2\9\e\h\2\4\2\o\4\k\1\t\u\m\t\4\z\e\y\7\e\9\k\m\9\4\b\v\o\v\y\v\7\2\4\w\5\s\4\g\u\e\1\4\7\s\e\7\k\c\p\d\s\l\h\o\n\c\3\k\l\s\v\5\2\n\1\3\1\e\y\l\0\k\7\2\6\8\8\7\w\q\7\i\o\r\e\k\9\7\6\q\k\q\h\s\r\o\2\b\4\5\c\k\c\4\a\c\d\8\a\4\u\k\0\8\g\n\y\0\j\u\j\n\9\r\x\w\0\9\j\h\h\v\q\9\h\b\h\3\0\j\3\4\j\3\7\k\f\m\s\s\y\b\b\1\v\a\1\w\e\f\9\2\x\6\1\r\3\9\r\i\f\7\o\r\e\x\4\3\p\4\9\c\v\z\c\p\g\w\z\a\2\t\p\e\u\3\m\t\h\t\h\r\n\p\d\d\j\z\u\p\s\g\1\m\l\z\y\q\3\m\r\g\q\7\6\y\i\g\a\x\z\x\8\t\z\h\s\3\n\d\y\w\u\7\r\5\c\3\1\8\k\y\b\w\n\v\y\j\d\p\w\q\6\n\y\x\u\t\v\p\y\0\f\y\0\u\s\c\c\3\5\x\5\o\u\n\z\o\r\a\e\7\s\t\3\9\9\t\y\k\i\8\5\p\y\m\9\k\o\7\y\d\f\4\7\a\2\u\4\i\g\d\e\2\0\2\f\1\q\4\9\4\4\m\7\c\6\4\1 ]] 00:42:10.842 12:44:34 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@85 -- # for flag_ro in "${flags_ro[@]}" 00:42:10.842 12:44:34 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@86 -- # gen_bytes 512 00:42:10.842 12:44:34 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/common.sh@98 -- # xtrace_disable 00:42:10.842 12:44:34 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- common/autotest_common.sh@10 -- # set +x 00:42:10.842 12:44:34 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:42:10.842 12:44:34 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=direct 00:42:10.842 [2024-06-07 12:44:34.250662] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:10.842 [2024-06-07 12:44:34.251043] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234187 ] 00:42:10.842 [2024-06-07 12:44:34.408958] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:11.100 [2024-06-07 12:44:34.501818] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:11.667  Copying: 512/512 [B] (average 500 kBps) 00:42:11.667 00:42:11.667 12:44:35 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 7tehu0laiz5jodn34qcupkfxo1saqnt8k5wnzv0dgp6msv6ykphlv9rex324ygn5381ognizaexh1999atmoharn2nffp7uziaq4ogy53h454eqseayls6mahaql1tzvem98icaftg788943eagan2f3cebqsm06t8tk6mdcbk0dp1fckernczncsaqbp9n3dad4rl4g34jiqromorgmq69qjwhbgw8tnfb63trrm5oz25br32liyvg2hk3c2ppy31riqadvxfaq7xwvalyknj48an8fsmqjzjfyqibz825rk3qwl7nnxj7eyp8lzdjco4d4tl9swgt3b6rvtwlzrnto7ps8tkjbxe5fp18i8zlx8vslxakhmndqu0hnnsynb8gbpmixwlxxz7wi3xdcbtqv728enr6oz0jr6ivu0jmjbm6z3xg0i227ftq1h68vq6me9576fc6bguhtouq4ejj1n265vlg84vlkluin8y4aq0ivyp4jbhznfl72jltm == \7\t\e\h\u\0\l\a\i\z\5\j\o\d\n\3\4\q\c\u\p\k\f\x\o\1\s\a\q\n\t\8\k\5\w\n\z\v\0\d\g\p\6\m\s\v\6\y\k\p\h\l\v\9\r\e\x\3\2\4\y\g\n\5\3\8\1\o\g\n\i\z\a\e\x\h\1\9\9\9\a\t\m\o\h\a\r\n\2\n\f\f\p\7\u\z\i\a\q\4\o\g\y\5\3\h\4\5\4\e\q\s\e\a\y\l\s\6\m\a\h\a\q\l\1\t\z\v\e\m\9\8\i\c\a\f\t\g\7\8\8\9\4\3\e\a\g\a\n\2\f\3\c\e\b\q\s\m\0\6\t\8\t\k\6\m\d\c\b\k\0\d\p\1\f\c\k\e\r\n\c\z\n\c\s\a\q\b\p\9\n\3\d\a\d\4\r\l\4\g\3\4\j\i\q\r\o\m\o\r\g\m\q\6\9\q\j\w\h\b\g\w\8\t\n\f\b\6\3\t\r\r\m\5\o\z\2\5\b\r\3\2\l\i\y\v\g\2\h\k\3\c\2\p\p\y\3\1\r\i\q\a\d\v\x\f\a\q\7\x\w\v\a\l\y\k\n\j\4\8\a\n\8\f\s\m\q\j\z\j\f\y\q\i\b\z\8\2\5\r\k\3\q\w\l\7\n\n\x\j\7\e\y\p\8\l\z\d\j\c\o\4\d\4\t\l\9\s\w\g\t\3\b\6\r\v\t\w\l\z\r\n\t\o\7\p\s\8\t\k\j\b\x\e\5\f\p\1\8\i\8\z\l\x\8\v\s\l\x\a\k\h\m\n\d\q\u\0\h\n\n\s\y\n\b\8\g\b\p\m\i\x\w\l\x\x\z\7\w\i\3\x\d\c\b\t\q\v\7\2\8\e\n\r\6\o\z\0\j\r\6\i\v\u\0\j\m\j\b\m\6\z\3\x\g\0\i\2\2\7\f\t\q\1\h\6\8\v\q\6\m\e\9\5\7\6\f\c\6\b\g\u\h\t\o\u\q\4\e\j\j\1\n\2\6\5\v\l\g\8\4\v\l\k\l\u\i\n\8\y\4\a\q\0\i\v\y\p\4\j\b\h\z\n\f\l\7\2\j\l\t\m ]] 00:42:11.667 12:44:35 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:42:11.667 12:44:35 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=nonblock 00:42:11.667 [2024-06-07 12:44:35.054955] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:11.667 [2024-06-07 12:44:35.055370] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234204 ] 00:42:11.667 [2024-06-07 12:44:35.199111] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:11.667 [2024-06-07 12:44:35.277061] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:12.184  Copying: 512/512 [B] (average 500 kBps) 00:42:12.184 00:42:12.184 12:44:35 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 7tehu0laiz5jodn34qcupkfxo1saqnt8k5wnzv0dgp6msv6ykphlv9rex324ygn5381ognizaexh1999atmoharn2nffp7uziaq4ogy53h454eqseayls6mahaql1tzvem98icaftg788943eagan2f3cebqsm06t8tk6mdcbk0dp1fckernczncsaqbp9n3dad4rl4g34jiqromorgmq69qjwhbgw8tnfb63trrm5oz25br32liyvg2hk3c2ppy31riqadvxfaq7xwvalyknj48an8fsmqjzjfyqibz825rk3qwl7nnxj7eyp8lzdjco4d4tl9swgt3b6rvtwlzrnto7ps8tkjbxe5fp18i8zlx8vslxakhmndqu0hnnsynb8gbpmixwlxxz7wi3xdcbtqv728enr6oz0jr6ivu0jmjbm6z3xg0i227ftq1h68vq6me9576fc6bguhtouq4ejj1n265vlg84vlkluin8y4aq0ivyp4jbhznfl72jltm == \7\t\e\h\u\0\l\a\i\z\5\j\o\d\n\3\4\q\c\u\p\k\f\x\o\1\s\a\q\n\t\8\k\5\w\n\z\v\0\d\g\p\6\m\s\v\6\y\k\p\h\l\v\9\r\e\x\3\2\4\y\g\n\5\3\8\1\o\g\n\i\z\a\e\x\h\1\9\9\9\a\t\m\o\h\a\r\n\2\n\f\f\p\7\u\z\i\a\q\4\o\g\y\5\3\h\4\5\4\e\q\s\e\a\y\l\s\6\m\a\h\a\q\l\1\t\z\v\e\m\9\8\i\c\a\f\t\g\7\8\8\9\4\3\e\a\g\a\n\2\f\3\c\e\b\q\s\m\0\6\t\8\t\k\6\m\d\c\b\k\0\d\p\1\f\c\k\e\r\n\c\z\n\c\s\a\q\b\p\9\n\3\d\a\d\4\r\l\4\g\3\4\j\i\q\r\o\m\o\r\g\m\q\6\9\q\j\w\h\b\g\w\8\t\n\f\b\6\3\t\r\r\m\5\o\z\2\5\b\r\3\2\l\i\y\v\g\2\h\k\3\c\2\p\p\y\3\1\r\i\q\a\d\v\x\f\a\q\7\x\w\v\a\l\y\k\n\j\4\8\a\n\8\f\s\m\q\j\z\j\f\y\q\i\b\z\8\2\5\r\k\3\q\w\l\7\n\n\x\j\7\e\y\p\8\l\z\d\j\c\o\4\d\4\t\l\9\s\w\g\t\3\b\6\r\v\t\w\l\z\r\n\t\o\7\p\s\8\t\k\j\b\x\e\5\f\p\1\8\i\8\z\l\x\8\v\s\l\x\a\k\h\m\n\d\q\u\0\h\n\n\s\y\n\b\8\g\b\p\m\i\x\w\l\x\x\z\7\w\i\3\x\d\c\b\t\q\v\7\2\8\e\n\r\6\o\z\0\j\r\6\i\v\u\0\j\m\j\b\m\6\z\3\x\g\0\i\2\2\7\f\t\q\1\h\6\8\v\q\6\m\e\9\5\7\6\f\c\6\b\g\u\h\t\o\u\q\4\e\j\j\1\n\2\6\5\v\l\g\8\4\v\l\k\l\u\i\n\8\y\4\a\q\0\i\v\y\p\4\j\b\h\z\n\f\l\7\2\j\l\t\m ]] 00:42:12.184 12:44:35 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:42:12.184 12:44:35 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=sync 00:42:12.184 [2024-06-07 12:44:35.813156] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:12.184 [2024-06-07 12:44:35.813434] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234213 ] 00:42:12.443 [2024-06-07 12:44:35.962901] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:12.443 [2024-06-07 12:44:36.062797] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:12.960  Copying: 512/512 [B] (average 166 kBps) 00:42:12.960 00:42:12.960 12:44:36 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 7tehu0laiz5jodn34qcupkfxo1saqnt8k5wnzv0dgp6msv6ykphlv9rex324ygn5381ognizaexh1999atmoharn2nffp7uziaq4ogy53h454eqseayls6mahaql1tzvem98icaftg788943eagan2f3cebqsm06t8tk6mdcbk0dp1fckernczncsaqbp9n3dad4rl4g34jiqromorgmq69qjwhbgw8tnfb63trrm5oz25br32liyvg2hk3c2ppy31riqadvxfaq7xwvalyknj48an8fsmqjzjfyqibz825rk3qwl7nnxj7eyp8lzdjco4d4tl9swgt3b6rvtwlzrnto7ps8tkjbxe5fp18i8zlx8vslxakhmndqu0hnnsynb8gbpmixwlxxz7wi3xdcbtqv728enr6oz0jr6ivu0jmjbm6z3xg0i227ftq1h68vq6me9576fc6bguhtouq4ejj1n265vlg84vlkluin8y4aq0ivyp4jbhznfl72jltm == \7\t\e\h\u\0\l\a\i\z\5\j\o\d\n\3\4\q\c\u\p\k\f\x\o\1\s\a\q\n\t\8\k\5\w\n\z\v\0\d\g\p\6\m\s\v\6\y\k\p\h\l\v\9\r\e\x\3\2\4\y\g\n\5\3\8\1\o\g\n\i\z\a\e\x\h\1\9\9\9\a\t\m\o\h\a\r\n\2\n\f\f\p\7\u\z\i\a\q\4\o\g\y\5\3\h\4\5\4\e\q\s\e\a\y\l\s\6\m\a\h\a\q\l\1\t\z\v\e\m\9\8\i\c\a\f\t\g\7\8\8\9\4\3\e\a\g\a\n\2\f\3\c\e\b\q\s\m\0\6\t\8\t\k\6\m\d\c\b\k\0\d\p\1\f\c\k\e\r\n\c\z\n\c\s\a\q\b\p\9\n\3\d\a\d\4\r\l\4\g\3\4\j\i\q\r\o\m\o\r\g\m\q\6\9\q\j\w\h\b\g\w\8\t\n\f\b\6\3\t\r\r\m\5\o\z\2\5\b\r\3\2\l\i\y\v\g\2\h\k\3\c\2\p\p\y\3\1\r\i\q\a\d\v\x\f\a\q\7\x\w\v\a\l\y\k\n\j\4\8\a\n\8\f\s\m\q\j\z\j\f\y\q\i\b\z\8\2\5\r\k\3\q\w\l\7\n\n\x\j\7\e\y\p\8\l\z\d\j\c\o\4\d\4\t\l\9\s\w\g\t\3\b\6\r\v\t\w\l\z\r\n\t\o\7\p\s\8\t\k\j\b\x\e\5\f\p\1\8\i\8\z\l\x\8\v\s\l\x\a\k\h\m\n\d\q\u\0\h\n\n\s\y\n\b\8\g\b\p\m\i\x\w\l\x\x\z\7\w\i\3\x\d\c\b\t\q\v\7\2\8\e\n\r\6\o\z\0\j\r\6\i\v\u\0\j\m\j\b\m\6\z\3\x\g\0\i\2\2\7\f\t\q\1\h\6\8\v\q\6\m\e\9\5\7\6\f\c\6\b\g\u\h\t\o\u\q\4\e\j\j\1\n\2\6\5\v\l\g\8\4\v\l\k\l\u\i\n\8\y\4\a\q\0\i\v\y\p\4\j\b\h\z\n\f\l\7\2\j\l\t\m ]] 00:42:12.960 12:44:36 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@87 -- # for flag_rw in "${flags_rw[@]}" 00:42:12.960 12:44:36 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@89 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --aio --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --iflag=nonblock --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=dsync 00:42:12.960 [2024-06-07 12:44:36.599291] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:12.960 [2024-06-07 12:44:36.599613] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234226 ] 00:42:13.218 [2024-06-07 12:44:36.751573] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:13.218 [2024-06-07 12:44:36.845164] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:13.735  Copying: 512/512 [B] (average 250 kBps) 00:42:13.735 00:42:13.735 12:44:37 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- dd/posix.sh@93 -- # [[ 7tehu0laiz5jodn34qcupkfxo1saqnt8k5wnzv0dgp6msv6ykphlv9rex324ygn5381ognizaexh1999atmoharn2nffp7uziaq4ogy53h454eqseayls6mahaql1tzvem98icaftg788943eagan2f3cebqsm06t8tk6mdcbk0dp1fckernczncsaqbp9n3dad4rl4g34jiqromorgmq69qjwhbgw8tnfb63trrm5oz25br32liyvg2hk3c2ppy31riqadvxfaq7xwvalyknj48an8fsmqjzjfyqibz825rk3qwl7nnxj7eyp8lzdjco4d4tl9swgt3b6rvtwlzrnto7ps8tkjbxe5fp18i8zlx8vslxakhmndqu0hnnsynb8gbpmixwlxxz7wi3xdcbtqv728enr6oz0jr6ivu0jmjbm6z3xg0i227ftq1h68vq6me9576fc6bguhtouq4ejj1n265vlg84vlkluin8y4aq0ivyp4jbhznfl72jltm == \7\t\e\h\u\0\l\a\i\z\5\j\o\d\n\3\4\q\c\u\p\k\f\x\o\1\s\a\q\n\t\8\k\5\w\n\z\v\0\d\g\p\6\m\s\v\6\y\k\p\h\l\v\9\r\e\x\3\2\4\y\g\n\5\3\8\1\o\g\n\i\z\a\e\x\h\1\9\9\9\a\t\m\o\h\a\r\n\2\n\f\f\p\7\u\z\i\a\q\4\o\g\y\5\3\h\4\5\4\e\q\s\e\a\y\l\s\6\m\a\h\a\q\l\1\t\z\v\e\m\9\8\i\c\a\f\t\g\7\8\8\9\4\3\e\a\g\a\n\2\f\3\c\e\b\q\s\m\0\6\t\8\t\k\6\m\d\c\b\k\0\d\p\1\f\c\k\e\r\n\c\z\n\c\s\a\q\b\p\9\n\3\d\a\d\4\r\l\4\g\3\4\j\i\q\r\o\m\o\r\g\m\q\6\9\q\j\w\h\b\g\w\8\t\n\f\b\6\3\t\r\r\m\5\o\z\2\5\b\r\3\2\l\i\y\v\g\2\h\k\3\c\2\p\p\y\3\1\r\i\q\a\d\v\x\f\a\q\7\x\w\v\a\l\y\k\n\j\4\8\a\n\8\f\s\m\q\j\z\j\f\y\q\i\b\z\8\2\5\r\k\3\q\w\l\7\n\n\x\j\7\e\y\p\8\l\z\d\j\c\o\4\d\4\t\l\9\s\w\g\t\3\b\6\r\v\t\w\l\z\r\n\t\o\7\p\s\8\t\k\j\b\x\e\5\f\p\1\8\i\8\z\l\x\8\v\s\l\x\a\k\h\m\n\d\q\u\0\h\n\n\s\y\n\b\8\g\b\p\m\i\x\w\l\x\x\z\7\w\i\3\x\d\c\b\t\q\v\7\2\8\e\n\r\6\o\z\0\j\r\6\i\v\u\0\j\m\j\b\m\6\z\3\x\g\0\i\2\2\7\f\t\q\1\h\6\8\v\q\6\m\e\9\5\7\6\f\c\6\b\g\u\h\t\o\u\q\4\e\j\j\1\n\2\6\5\v\l\g\8\4\v\l\k\l\u\i\n\8\y\4\a\q\0\i\v\y\p\4\j\b\h\z\n\f\l\7\2\j\l\t\m ]] 00:42:13.735 00:42:13.735 real 0m6.332s 00:42:13.735 user 0m3.320s 00:42:13.735 sys 0m2.032s 00:42:13.735 12:44:37 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:13.735 12:44:37 spdk_dd.spdk_dd_posix.dd_flags_misc_forced_aio -- common/autotest_common.sh@10 -- # set +x 00:42:13.735 ************************************ 00:42:13.735 END TEST dd_flags_misc_forced_aio 00:42:13.735 ************************************ 00:42:13.993 12:44:37 spdk_dd.spdk_dd_posix -- dd/posix.sh@1 -- # cleanup 00:42:13.993 12:44:37 spdk_dd.spdk_dd_posix -- dd/posix.sh@11 -- # rm -f /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0.link 00:42:13.993 12:44:37 spdk_dd.spdk_dd_posix -- dd/posix.sh@12 -- # rm -f /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1.link 00:42:13.993 00:42:13.993 real 0m28.180s 00:42:13.993 user 0m13.678s 00:42:13.993 sys 0m8.711s 00:42:13.993 12:44:37 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:13.993 12:44:37 spdk_dd.spdk_dd_posix -- common/autotest_common.sh@10 -- # set +x 00:42:13.993 ************************************ 00:42:13.993 END TEST spdk_dd_posix 00:42:13.993 ************************************ 00:42:13.993 12:44:37 spdk_dd -- 
dd/dd.sh@22 -- # run_test spdk_dd_malloc /home/vagrant/spdk_repo/spdk/test/dd/malloc.sh 00:42:13.993 12:44:37 spdk_dd -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:13.993 12:44:37 spdk_dd -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:13.993 12:44:37 spdk_dd -- common/autotest_common.sh@10 -- # set +x 00:42:13.993 ************************************ 00:42:13.993 START TEST spdk_dd_malloc 00:42:13.993 ************************************ 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dd/malloc.sh 00:42:13.993 * Looking for test storage... 00:42:13.993 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dd 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- paths/export.sh@5 -- # export PATH 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- dd/malloc.sh@38 -- # run_test dd_malloc_copy malloc_copy 00:42:13.993 
12:44:37 spdk_dd.spdk_dd_malloc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc -- common/autotest_common.sh@10 -- # set +x 00:42:13.993 ************************************ 00:42:13.993 START TEST dd_malloc_copy 00:42:13.993 ************************************ 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- common/autotest_common.sh@1124 -- # malloc_copy 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@12 -- # local mbdev0=malloc0 mbdev0_b=1048576 mbdev0_bs=512 00:42:13.993 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@13 -- # local mbdev1=malloc1 mbdev1_b=1048576 mbdev1_bs=512 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@15 -- # method_bdev_malloc_create_0=(['name']='malloc0' ['num_blocks']='1048576' ['block_size']='512') 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@15 -- # local -A method_bdev_malloc_create_0 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@21 -- # method_bdev_malloc_create_1=(['name']='malloc1' ['num_blocks']='1048576' ['block_size']='512') 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@21 -- # local -A method_bdev_malloc_create_1 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@28 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=malloc0 --ob=malloc1 --json /dev/fd/62 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@28 -- # gen_conf 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/common.sh@31 -- # xtrace_disable 00:42:13.994 12:44:37 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- common/autotest_common.sh@10 -- # set +x 00:42:13.994 [2024-06-07 12:44:37.635633] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:13.994 [2024-06-07 12:44:37.635945] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234312 ] 00:42:14.252 { 00:42:14.252 "subsystems": [ 00:42:14.252 { 00:42:14.252 "subsystem": "bdev", 00:42:14.252 "config": [ 00:42:14.252 { 00:42:14.252 "params": { 00:42:14.252 "block_size": 512, 00:42:14.252 "num_blocks": 1048576, 00:42:14.252 "name": "malloc0" 00:42:14.252 }, 00:42:14.252 "method": "bdev_malloc_create" 00:42:14.252 }, 00:42:14.252 { 00:42:14.252 "params": { 00:42:14.252 "block_size": 512, 00:42:14.252 "num_blocks": 1048576, 00:42:14.252 "name": "malloc1" 00:42:14.252 }, 00:42:14.252 "method": "bdev_malloc_create" 00:42:14.252 }, 00:42:14.252 { 00:42:14.252 "method": "bdev_wait_for_examine" 00:42:14.252 } 00:42:14.252 ] 00:42:14.252 } 00:42:14.252 ] 00:42:14.252 } 00:42:14.252 [2024-06-07 12:44:37.782363] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:14.510 [2024-06-07 12:44:37.907205] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:17.076  Copying: 426/512 [MB] (426 MBps) Copying: 512/512 [MB] (average 421 MBps) 00:42:17.076 00:42:17.076 12:44:40 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@33 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=malloc1 --ob=malloc0 --json /dev/fd/62 00:42:17.076 12:44:40 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/malloc.sh@33 -- # gen_conf 00:42:17.076 12:44:40 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- dd/common.sh@31 -- # xtrace_disable 00:42:17.076 12:44:40 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- common/autotest_common.sh@10 -- # set +x 00:42:17.076 [2024-06-07 12:44:40.714794] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:17.076 [2024-06-07 12:44:40.715118] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234357 ] 00:42:17.335 { 00:42:17.335 "subsystems": [ 00:42:17.335 { 00:42:17.335 "subsystem": "bdev", 00:42:17.335 "config": [ 00:42:17.335 { 00:42:17.335 "params": { 00:42:17.335 "block_size": 512, 00:42:17.335 "num_blocks": 1048576, 00:42:17.335 "name": "malloc0" 00:42:17.335 }, 00:42:17.335 "method": "bdev_malloc_create" 00:42:17.335 }, 00:42:17.335 { 00:42:17.335 "params": { 00:42:17.335 "block_size": 512, 00:42:17.335 "num_blocks": 1048576, 00:42:17.335 "name": "malloc1" 00:42:17.335 }, 00:42:17.335 "method": "bdev_malloc_create" 00:42:17.335 }, 00:42:17.335 { 00:42:17.335 "method": "bdev_wait_for_examine" 00:42:17.335 } 00:42:17.335 ] 00:42:17.335 } 00:42:17.335 ] 00:42:17.335 } 00:42:17.335 [2024-06-07 12:44:40.855461] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:17.335 [2024-06-07 12:44:40.947606] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:20.187  Copying: 457/512 [MB] (457 MBps) Copying: 512/512 [MB] (average 456 MBps) 00:42:20.187 00:42:20.187 00:42:20.187 real 0m6.024s 00:42:20.187 user 0m4.450s 00:42:20.187 sys 0m1.413s 00:42:20.187 12:44:43 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:20.187 12:44:43 spdk_dd.spdk_dd_malloc.dd_malloc_copy -- common/autotest_common.sh@10 -- # set +x 00:42:20.187 ************************************ 00:42:20.187 END TEST dd_malloc_copy 00:42:20.187 ************************************ 00:42:20.187 00:42:20.187 real 0m6.186s 00:42:20.187 user 0m4.521s 00:42:20.187 sys 0m1.513s 00:42:20.187 12:44:43 spdk_dd.spdk_dd_malloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:20.188 12:44:43 spdk_dd.spdk_dd_malloc -- common/autotest_common.sh@10 -- # set +x 00:42:20.188 ************************************ 00:42:20.188 END TEST spdk_dd_malloc 00:42:20.188 ************************************ 00:42:20.188 12:44:43 spdk_dd -- dd/dd.sh@23 -- # run_test spdk_dd_bdev_to_bdev /home/vagrant/spdk_repo/spdk/test/dd/bdev_to_bdev.sh 0000:00:10.0 00:42:20.188 12:44:43 spdk_dd -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:42:20.188 12:44:43 spdk_dd -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:20.188 12:44:43 spdk_dd -- common/autotest_common.sh@10 -- # set +x 00:42:20.188 ************************************ 00:42:20.188 START TEST spdk_dd_bdev_to_bdev 00:42:20.188 ************************************ 00:42:20.188 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dd/bdev_to_bdev.sh 0000:00:10.0 00:42:20.188 * Looking for test storage... 
00:42:20.447 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dd 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- paths/export.sh@5 -- # export PATH 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@10 -- # nvmes=("$@") 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@47 -- # trap cleanup EXIT 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@49 -- # bs=1048576 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@51 -- # (( 1 > 1 )) 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@67 -- # nvme0=Nvme0 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@67 -- # bdev0=Nvme0n1 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@67 -- # nvme0_pci=0000:00:10.0 00:42:20.447 12:44:43 
spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@68 -- # aio1=/home/vagrant/spdk_repo/spdk/test/dd/aio1 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@68 -- # bdev1=aio1 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@70 -- # method_bdev_nvme_attach_controller_1=(['name']='Nvme0' ['traddr']='0000:00:10.0' ['trtype']='pcie') 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@70 -- # declare -A method_bdev_nvme_attach_controller_1 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@75 -- # method_bdev_aio_create_0=(['name']='aio1' ['filename']='/home/vagrant/spdk_repo/spdk/test/dd/aio1' ['block_size']='4096') 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@75 -- # declare -A method_bdev_aio_create_0 00:42:20.447 12:44:43 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@83 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --of=/home/vagrant/spdk_repo/spdk/test/dd/aio1 --bs=1048576 --count=256 00:42:20.447 [2024-06-07 12:44:43.894932] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:20.447 [2024-06-07 12:44:43.895282] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234457 ] 00:42:20.447 [2024-06-07 12:44:44.050214] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:20.706 [2024-06-07 12:44:44.168644] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:21.530  Copying: 256/256 [MB] (average 1094 MBps) 00:42:21.530 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@89 -- # test_file0=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@90 -- # test_file1=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@92 -- # magic='This Is Our Magic, find it' 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@93 -- # echo 'This Is Our Magic, find it' 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@96 -- # run_test dd_inflate_file /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=append --bs=1048576 --count=64 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:21.530 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:21.530 ************************************ 00:42:21.531 START TEST dd_inflate_file 00:42:21.531 ************************************ 00:42:21.531 12:44:44 spdk_dd.spdk_dd_bdev_to_bdev.dd_inflate_file -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --oflag=append --bs=1048576 --count=64 00:42:21.531 [2024-06-07 12:44:44.946979] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:21.531 [2024-06-07 12:44:44.947293] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234480 ] 00:42:21.531 [2024-06-07 12:44:45.090521] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:21.788 [2024-06-07 12:44:45.209812] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:22.354  Copying: 64/64 [MB] (average 842 MBps) 00:42:22.354 00:42:22.354 00:42:22.354 real 0m0.900s 00:42:22.354 user 0m0.440s 00:42:22.354 sys 0m0.331s 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev.dd_inflate_file -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev.dd_inflate_file -- common/autotest_common.sh@10 -- # set +x 00:42:22.354 ************************************ 00:42:22.354 END TEST dd_inflate_file 00:42:22.354 ************************************ 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@104 -- # wc -c 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@104 -- # test_file0_size=67108891 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@107 -- # gen_conf 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@107 -- # run_test dd_copy_to_out_bdev /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --json /dev/fd/62 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@31 -- # xtrace_disable 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:22.354 ************************************ 00:42:22.354 START TEST dd_copy_to_out_bdev 00:42:22.354 ************************************ 00:42:22.354 12:44:45 spdk_dd.spdk_dd_bdev_to_bdev.dd_copy_to_out_bdev -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ob=Nvme0n1 --json /dev/fd/62 00:42:22.354 { 00:42:22.354 "subsystems": [ 00:42:22.354 { 00:42:22.354 "subsystem": "bdev", 00:42:22.354 "config": [ 00:42:22.354 { 00:42:22.354 "params": { 00:42:22.354 "block_size": 4096, 00:42:22.354 "filename": "/home/vagrant/spdk_repo/spdk/test/dd/aio1", 00:42:22.354 "name": "aio1" 00:42:22.354 }, 00:42:22.354 "method": "bdev_aio_create" 00:42:22.354 }, 00:42:22.354 { 00:42:22.354 "params": { 00:42:22.354 "trtype": "pcie", 00:42:22.354 "traddr": "0000:00:10.0", 00:42:22.354 "name": "Nvme0" 00:42:22.354 }, 00:42:22.354 "method": "bdev_nvme_attach_controller" 00:42:22.354 }, 00:42:22.354 { 00:42:22.354 "method": "bdev_wait_for_examine" 00:42:22.354 } 00:42:22.354 ] 00:42:22.354 } 00:42:22.354 ] 00:42:22.354 } 00:42:22.354 [2024-06-07 12:44:45.921462] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:22.354 [2024-06-07 12:44:45.921846] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234528 ] 00:42:22.613 [2024-06-07 12:44:46.074836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:22.613 [2024-06-07 12:44:46.196032] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:24.113  Copying: 64/64 [MB] (average 83 MBps) 00:42:24.113 00:42:24.113 00:42:24.113 real 0m1.729s 00:42:24.113 user 0m1.279s 00:42:24.113 sys 0m0.363s 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_copy_to_out_bdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_copy_to_out_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:24.113 ************************************ 00:42:24.113 END TEST dd_copy_to_out_bdev 00:42:24.113 ************************************ 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@113 -- # count=65 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@115 -- # run_test dd_offset_magic offset_magic 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:24.113 ************************************ 00:42:24.113 START TEST dd_offset_magic 00:42:24.113 ************************************ 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- common/autotest_common.sh@1124 -- # offset_magic 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@13 -- # local magic_check 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@14 -- # local offsets offset 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@16 -- # offsets=(16 64) 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@18 -- # for offset in "${offsets[@]}" 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@20 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --ob=aio1 --count=65 --seek=16 --bs=1048576 --json /dev/fd/62 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@20 -- # gen_conf 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/common.sh@31 -- # xtrace_disable 00:42:24.113 12:44:47 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- common/autotest_common.sh@10 -- # set +x 00:42:24.113 [2024-06-07 12:44:47.720063] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:24.113 [2024-06-07 12:44:47.720617] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234573 ] 00:42:24.113 { 00:42:24.113 "subsystems": [ 00:42:24.113 { 00:42:24.113 "subsystem": "bdev", 00:42:24.113 "config": [ 00:42:24.113 { 00:42:24.113 "params": { 00:42:24.113 "block_size": 4096, 00:42:24.113 "filename": "/home/vagrant/spdk_repo/spdk/test/dd/aio1", 00:42:24.113 "name": "aio1" 00:42:24.113 }, 00:42:24.113 "method": "bdev_aio_create" 00:42:24.113 }, 00:42:24.113 { 00:42:24.113 "params": { 00:42:24.113 "trtype": "pcie", 00:42:24.113 "traddr": "0000:00:10.0", 00:42:24.113 "name": "Nvme0" 00:42:24.113 }, 00:42:24.113 "method": "bdev_nvme_attach_controller" 00:42:24.113 }, 00:42:24.113 { 00:42:24.113 "method": "bdev_wait_for_examine" 00:42:24.113 } 00:42:24.113 ] 00:42:24.113 } 00:42:24.113 ] 00:42:24.113 } 00:42:24.371 [2024-06-07 12:44:47.893266] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:24.371 [2024-06-07 12:44:47.988312] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:25.194  Copying: 65/65 [MB] (average 308 MBps) 00:42:25.194 00:42:25.194 12:44:48 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@28 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=aio1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --count=1 --skip=16 --bs=1048576 --json /dev/fd/62 00:42:25.194 12:44:48 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@28 -- # gen_conf 00:42:25.194 12:44:48 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/common.sh@31 -- # xtrace_disable 00:42:25.194 12:44:48 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- common/autotest_common.sh@10 -- # set +x 00:42:25.452 [2024-06-07 12:44:48.847267] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:25.452 [2024-06-07 12:44:48.847657] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234599 ] 00:42:25.452 { 00:42:25.453 "subsystems": [ 00:42:25.453 { 00:42:25.453 "subsystem": "bdev", 00:42:25.453 "config": [ 00:42:25.453 { 00:42:25.453 "params": { 00:42:25.453 "block_size": 4096, 00:42:25.453 "filename": "/home/vagrant/spdk_repo/spdk/test/dd/aio1", 00:42:25.453 "name": "aio1" 00:42:25.453 }, 00:42:25.453 "method": "bdev_aio_create" 00:42:25.453 }, 00:42:25.453 { 00:42:25.453 "params": { 00:42:25.453 "trtype": "pcie", 00:42:25.453 "traddr": "0000:00:10.0", 00:42:25.453 "name": "Nvme0" 00:42:25.453 }, 00:42:25.453 "method": "bdev_nvme_attach_controller" 00:42:25.453 }, 00:42:25.453 { 00:42:25.453 "method": "bdev_wait_for_examine" 00:42:25.453 } 00:42:25.453 ] 00:42:25.453 } 00:42:25.453 ] 00:42:25.453 } 00:42:25.453 [2024-06-07 12:44:49.005859] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:25.710 [2024-06-07 12:44:49.105307] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:26.275  Copying: 1024/1024 [kB] (average 500 MBps) 00:42:26.275 00:42:26.275 12:44:49 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@35 -- # read -rn26 magic_check 00:42:26.275 12:44:49 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@36 -- # [[ This Is Our Magic, find it == \T\h\i\s\ \I\s\ \O\u\r\ \M\a\g\i\c\,\ \f\i\n\d\ \i\t ]] 00:42:26.275 12:44:49 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@18 -- # for offset in "${offsets[@]}" 00:42:26.275 12:44:49 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@20 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=Nvme0n1 --ob=aio1 --count=65 --seek=64 --bs=1048576 --json /dev/fd/62 00:42:26.275 12:44:49 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@20 -- # gen_conf 00:42:26.275 12:44:49 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/common.sh@31 -- # xtrace_disable 00:42:26.275 12:44:49 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- common/autotest_common.sh@10 -- # set +x 00:42:26.275 [2024-06-07 12:44:49.782282] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:26.275 [2024-06-07 12:44:49.782563] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234616 ] 00:42:26.275 { 00:42:26.275 "subsystems": [ 00:42:26.275 { 00:42:26.275 "subsystem": "bdev", 00:42:26.275 "config": [ 00:42:26.275 { 00:42:26.275 "params": { 00:42:26.275 "block_size": 4096, 00:42:26.275 "filename": "/home/vagrant/spdk_repo/spdk/test/dd/aio1", 00:42:26.275 "name": "aio1" 00:42:26.275 }, 00:42:26.275 "method": "bdev_aio_create" 00:42:26.275 }, 00:42:26.275 { 00:42:26.275 "params": { 00:42:26.275 "trtype": "pcie", 00:42:26.275 "traddr": "0000:00:10.0", 00:42:26.275 "name": "Nvme0" 00:42:26.275 }, 00:42:26.275 "method": "bdev_nvme_attach_controller" 00:42:26.275 }, 00:42:26.275 { 00:42:26.275 "method": "bdev_wait_for_examine" 00:42:26.275 } 00:42:26.275 ] 00:42:26.275 } 00:42:26.275 ] 00:42:26.275 } 00:42:26.533 [2024-06-07 12:44:49.926949] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:26.533 [2024-06-07 12:44:50.040881] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:27.357  Copying: 65/65 [MB] (average 436 MBps) 00:42:27.357 00:42:27.357 12:44:50 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@28 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=aio1 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --count=1 --skip=64 --bs=1048576 --json /dev/fd/62 00:42:27.357 12:44:50 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@28 -- # gen_conf 00:42:27.357 12:44:50 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/common.sh@31 -- # xtrace_disable 00:42:27.357 12:44:50 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- common/autotest_common.sh@10 -- # set +x 00:42:27.357 [2024-06-07 12:44:50.849658] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:27.357 [2024-06-07 12:44:50.850524] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234638 ] 00:42:27.357 { 00:42:27.357 "subsystems": [ 00:42:27.357 { 00:42:27.357 "subsystem": "bdev", 00:42:27.357 "config": [ 00:42:27.357 { 00:42:27.357 "params": { 00:42:27.357 "block_size": 4096, 00:42:27.357 "filename": "/home/vagrant/spdk_repo/spdk/test/dd/aio1", 00:42:27.357 "name": "aio1" 00:42:27.357 }, 00:42:27.357 "method": "bdev_aio_create" 00:42:27.357 }, 00:42:27.357 { 00:42:27.357 "params": { 00:42:27.357 "trtype": "pcie", 00:42:27.357 "traddr": "0000:00:10.0", 00:42:27.357 "name": "Nvme0" 00:42:27.357 }, 00:42:27.357 "method": "bdev_nvme_attach_controller" 00:42:27.357 }, 00:42:27.357 { 00:42:27.357 "method": "bdev_wait_for_examine" 00:42:27.357 } 00:42:27.357 ] 00:42:27.357 } 00:42:27.357 ] 00:42:27.357 } 00:42:27.357 [2024-06-07 12:44:50.998501] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:27.615 [2024-06-07 12:44:51.088389] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:28.131  Copying: 1024/1024 [kB] (average 1000 MBps) 00:42:28.131 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@35 -- # read -rn26 magic_check 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- dd/bdev_to_bdev.sh@36 -- # [[ This Is Our Magic, find it == \T\h\i\s\ \I\s\ \O\u\r\ \M\a\g\i\c\,\ \f\i\n\d\ \i\t ]] 00:42:28.389 00:42:28.389 real 0m4.116s 00:42:28.389 user 0m2.235s 00:42:28.389 sys 0m1.225s 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev.dd_offset_magic -- common/autotest_common.sh@10 -- # set +x 00:42:28.389 ************************************ 00:42:28.389 END TEST dd_offset_magic 00:42:28.389 ************************************ 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@1 -- # cleanup 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@42 -- # clear_nvme Nvme0n1 '' 4194330 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@10 -- # local bdev=Nvme0n1 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@11 -- # local nvme_ref= 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@12 -- # local size=4194330 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@14 -- # local bs=1048576 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@15 -- # local count=5 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@18 -- # gen_conf 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=Nvme0n1 --count=5 --json /dev/fd/62 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@31 -- # xtrace_disable 00:42:28.389 12:44:51 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:28.389 [2024-06-07 12:44:51.854693] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:28.389 [2024-06-07 12:44:51.855384] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234675 ] 00:42:28.389 { 00:42:28.389 "subsystems": [ 00:42:28.389 { 00:42:28.389 "subsystem": "bdev", 00:42:28.389 "config": [ 00:42:28.389 { 00:42:28.389 "params": { 00:42:28.389 "block_size": 4096, 00:42:28.389 "filename": "/home/vagrant/spdk_repo/spdk/test/dd/aio1", 00:42:28.389 "name": "aio1" 00:42:28.389 }, 00:42:28.389 "method": "bdev_aio_create" 00:42:28.389 }, 00:42:28.389 { 00:42:28.389 "params": { 00:42:28.389 "trtype": "pcie", 00:42:28.389 "traddr": "0000:00:10.0", 00:42:28.389 "name": "Nvme0" 00:42:28.389 }, 00:42:28.389 "method": "bdev_nvme_attach_controller" 00:42:28.389 }, 00:42:28.389 { 00:42:28.389 "method": "bdev_wait_for_examine" 00:42:28.389 } 00:42:28.389 ] 00:42:28.389 } 00:42:28.389 ] 00:42:28.389 } 00:42:28.389 [2024-06-07 12:44:51.996876] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:28.648 [2024-06-07 12:44:52.095333] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:29.165  Copying: 5120/5120 [kB] (average 1250 MBps) 00:42:29.165 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@43 -- # clear_nvme aio1 '' 4194330 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@10 -- # local bdev=aio1 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@11 -- # local nvme_ref= 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@12 -- # local size=4194330 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@14 -- # local bs=1048576 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@15 -- # local count=5 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/zero --bs=1048576 --ob=aio1 --count=5 --json /dev/fd/62 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@18 -- # gen_conf 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- dd/common.sh@31 -- # xtrace_disable 00:42:29.165 12:44:52 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:29.165 [2024-06-07 12:44:52.769424] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
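The clear_nvme helper zeroes the device in 1 MiB units; with size=4194330 and bs=1048576 it needs five blocks, since 4*1048576 = 4194304 falls 26 bytes short. The count seen above is just a ceiling division:

    size=4194330; bs=1048576
    count=$(( (size + bs - 1) / bs ))   # -> 5, matching 'local count=5' in the trace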
00:42:29.165 [2024-06-07 12:44:52.769750] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234696 ] 00:42:29.423 { 00:42:29.423 "subsystems": [ 00:42:29.423 { 00:42:29.423 "subsystem": "bdev", 00:42:29.423 "config": [ 00:42:29.423 { 00:42:29.423 "params": { 00:42:29.423 "block_size": 4096, 00:42:29.423 "filename": "/home/vagrant/spdk_repo/spdk/test/dd/aio1", 00:42:29.423 "name": "aio1" 00:42:29.423 }, 00:42:29.423 "method": "bdev_aio_create" 00:42:29.423 }, 00:42:29.423 { 00:42:29.423 "params": { 00:42:29.423 "trtype": "pcie", 00:42:29.423 "traddr": "0000:00:10.0", 00:42:29.423 "name": "Nvme0" 00:42:29.423 }, 00:42:29.423 "method": "bdev_nvme_attach_controller" 00:42:29.423 }, 00:42:29.423 { 00:42:29.423 "method": "bdev_wait_for_examine" 00:42:29.423 } 00:42:29.423 ] 00:42:29.423 } 00:42:29.423 ] 00:42:29.423 } 00:42:29.423 [2024-06-07 12:44:52.920851] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:29.423 [2024-06-07 12:44:53.018402] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:30.245  Copying: 5120/5120 [kB] (average 238 MBps) 00:42:30.245 00:42:30.245 12:44:53 spdk_dd.spdk_dd_bdev_to_bdev -- dd/bdev_to_bdev.sh@44 -- # rm -f /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 /home/vagrant/spdk_repo/spdk/test/dd/aio1 00:42:30.245 00:42:30.245 real 0m9.988s 00:42:30.245 user 0m5.631s 00:42:30.245 sys 0m3.217s 00:42:30.245 12:44:53 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:30.245 12:44:53 spdk_dd.spdk_dd_bdev_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:30.245 ************************************ 00:42:30.245 END TEST spdk_dd_bdev_to_bdev 00:42:30.245 ************************************ 00:42:30.245 12:44:53 spdk_dd -- dd/dd.sh@24 -- # (( SPDK_TEST_URING == 1 )) 00:42:30.245 12:44:53 spdk_dd -- dd/dd.sh@27 -- # run_test spdk_dd_sparse /home/vagrant/spdk_repo/spdk/test/dd/sparse.sh 00:42:30.245 12:44:53 spdk_dd -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:30.245 12:44:53 spdk_dd -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:30.245 12:44:53 spdk_dd -- common/autotest_common.sh@10 -- # set +x 00:42:30.245 ************************************ 00:42:30.245 START TEST spdk_dd_sparse 00:42:30.245 ************************************ 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dd/sparse.sh 00:42:30.245 * Looking for test storage... 
00:42:30.245 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dd 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:30.245 12:44:53 spdk_dd.spdk_dd_sparse -- paths/export.sh@5 -- # export PATH 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@108 -- # aio_disk=dd_sparse_aio_disk 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@109 -- # aio_bdev=dd_aio 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@110 -- # file1=file_zero1 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@111 -- # file2=file_zero2 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@112 -- # file3=file_zero3 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@113 -- # lvstore=dd_lvstore 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@114 -- # lvol=dd_lvol 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@116 -- # trap cleanup EXIT 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- 
dd/sparse.sh@118 -- # prepare 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@18 -- # truncate dd_sparse_aio_disk --size 104857600 00:42:30.246 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@20 -- # dd if=/dev/zero of=file_zero1 bs=4M count=1 00:42:30.504 1+0 records in 00:42:30.504 1+0 records out 00:42:30.504 4194304 bytes (4.2 MB, 4.0 MiB) copied, 0.00447274 s, 938 MB/s 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@21 -- # dd if=/dev/zero of=file_zero1 bs=4M count=1 seek=4 00:42:30.504 1+0 records in 00:42:30.504 1+0 records out 00:42:30.504 4194304 bytes (4.2 MB, 4.0 MiB) copied, 0.00596021 s, 704 MB/s 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@22 -- # dd if=/dev/zero of=file_zero1 bs=4M count=1 seek=8 00:42:30.504 1+0 records in 00:42:30.504 1+0 records out 00:42:30.504 4194304 bytes (4.2 MB, 4.0 MiB) copied, 0.00542676 s, 773 MB/s 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@120 -- # run_test dd_sparse_file_to_file file_to_file 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@10 -- # set +x 00:42:30.504 ************************************ 00:42:30.504 START TEST dd_sparse_file_to_file 00:42:30.504 ************************************ 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- common/autotest_common.sh@1124 -- # file_to_file 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@26 -- # local stat1_s stat1_b 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@27 -- # local stat2_s stat2_b 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@29 -- # method_bdev_aio_create_0=(['filename']='dd_sparse_aio_disk' ['name']='dd_aio' ['block_size']='4096') 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@29 -- # local -A method_bdev_aio_create_0 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@35 -- # method_bdev_lvol_create_lvstore_1=(['bdev_name']='dd_aio' ['lvs_name']='dd_lvstore') 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@35 -- # local -A method_bdev_lvol_create_lvstore_1 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@41 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=file_zero1 --of=file_zero2 --bs=12582912 --sparse --json /dev/fd/62 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@41 -- # gen_conf 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/common.sh@31 -- # xtrace_disable 00:42:30.504 12:44:53 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- common/autotest_common.sh@10 -- # set +x 00:42:30.504 [2024-06-07 12:44:53.985932] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
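The prepare step above is compact but worth unpacking: the 100 MB truncate creates the AIO backing file, and the three dd writes lay data extents into file_zero1 at 4 MiB granularity (seek counts are in units of bs=4M), leaving holes between them:

    truncate dd_sparse_aio_disk --size 104857600          # 100 MB backing file for the AIO bdev
    dd if=/dev/zero of=file_zero1 bs=4M count=1           # extent at 0-4 MiB
    dd if=/dev/zero of=file_zero1 bs=4M count=1 seek=4    # extent at 16-20 MiB
    dd if=/dev/zero of=file_zero1 bs=4M count=1 seek=8    # extent at 32-36 MiB

Because dd truncates at the end of the last write, file_zero1 ends up with a 36 MiB apparent size (37748736 B) holding only 3 x 4 MiB = 12 MiB of allocated data, which is exactly what the stat checks below confirm (24576 blocks x 512 B = 12 MiB).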
00:42:30.504 [2024-06-07 12:44:53.986744] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234768 ] 00:42:30.504 { 00:42:30.504 "subsystems": [ 00:42:30.504 { 00:42:30.504 "subsystem": "bdev", 00:42:30.504 "config": [ 00:42:30.504 { 00:42:30.504 "params": { 00:42:30.504 "block_size": 4096, 00:42:30.504 "filename": "dd_sparse_aio_disk", 00:42:30.504 "name": "dd_aio" 00:42:30.504 }, 00:42:30.504 "method": "bdev_aio_create" 00:42:30.504 }, 00:42:30.504 { 00:42:30.504 "params": { 00:42:30.504 "lvs_name": "dd_lvstore", 00:42:30.504 "bdev_name": "dd_aio" 00:42:30.504 }, 00:42:30.504 "method": "bdev_lvol_create_lvstore" 00:42:30.504 }, 00:42:30.504 { 00:42:30.504 "method": "bdev_wait_for_examine" 00:42:30.504 } 00:42:30.504 ] 00:42:30.504 } 00:42:30.504 ] 00:42:30.504 } 00:42:30.504 [2024-06-07 12:44:54.134557] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:30.761 [2024-06-07 12:44:54.225680] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:31.299  Copying: 12/36 [MB] (average 800 MBps) 00:42:31.299 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@47 -- # stat --printf=%s file_zero1 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@47 -- # stat1_s=37748736 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@48 -- # stat --printf=%s file_zero2 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@48 -- # stat2_s=37748736 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@50 -- # [[ 37748736 == \3\7\7\4\8\7\3\6 ]] 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@52 -- # stat --printf=%b file_zero1 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@52 -- # stat1_b=24576 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@53 -- # stat --printf=%b file_zero2 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@53 -- # stat2_b=24576 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- dd/sparse.sh@55 -- # [[ 24576 == \2\4\5\7\6 ]] 00:42:31.299 00:42:31.299 real 0m0.931s 00:42:31.299 user 0m0.502s 00:42:31.299 sys 0m0.322s 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_file -- common/autotest_common.sh@10 -- # set +x 00:42:31.299 ************************************ 00:42:31.299 END TEST dd_sparse_file_to_file 00:42:31.299 ************************************ 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@121 -- # run_test dd_sparse_file_to_bdev file_to_bdev 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:31.299 12:44:54 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@10 -- # set +x 00:42:31.557 ************************************ 00:42:31.557 START TEST dd_sparse_file_to_bdev 00:42:31.557 ************************************ 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- common/autotest_common.sh@1124 -- # file_to_bdev 00:42:31.557 
12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- dd/sparse.sh@59 -- # method_bdev_aio_create_0=(['filename']='dd_sparse_aio_disk' ['name']='dd_aio' ['block_size']='4096') 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- dd/sparse.sh@59 -- # local -A method_bdev_aio_create_0 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- dd/sparse.sh@65 -- # method_bdev_lvol_create_1=(['lvs_name']='dd_lvstore' ['lvol_name']='dd_lvol' ['size_in_mib']='36' ['thin_provision']='true') 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- dd/sparse.sh@65 -- # local -A method_bdev_lvol_create_1 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- dd/sparse.sh@73 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=file_zero2 --ob=dd_lvstore/dd_lvol --bs=12582912 --sparse --json /dev/fd/62 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- dd/sparse.sh@73 -- # gen_conf 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- dd/common.sh@31 -- # xtrace_disable 00:42:31.557 12:44:54 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:31.557 [2024-06-07 12:44:54.988045] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:31.557 [2024-06-07 12:44:54.988521] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234821 ] 00:42:31.557 { 00:42:31.557 "subsystems": [ 00:42:31.557 { 00:42:31.557 "subsystem": "bdev", 00:42:31.557 "config": [ 00:42:31.557 { 00:42:31.557 "params": { 00:42:31.557 "block_size": 4096, 00:42:31.557 "filename": "dd_sparse_aio_disk", 00:42:31.557 "name": "dd_aio" 00:42:31.557 }, 00:42:31.557 "method": "bdev_aio_create" 00:42:31.557 }, 00:42:31.557 { 00:42:31.557 "params": { 00:42:31.557 "lvs_name": "dd_lvstore", 00:42:31.557 "lvol_name": "dd_lvol", 00:42:31.557 "size_in_mib": 36, 00:42:31.557 "thin_provision": true 00:42:31.557 }, 00:42:31.557 "method": "bdev_lvol_create" 00:42:31.557 }, 00:42:31.557 { 00:42:31.557 "method": "bdev_wait_for_examine" 00:42:31.557 } 00:42:31.557 ] 00:42:31.557 } 00:42:31.557 ] 00:42:31.557 } 00:42:31.557 [2024-06-07 12:44:55.123988] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:31.814 [2024-06-07 12:44:55.220891] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:32.380  Copying: 12/36 [MB] (average 545 MBps) 00:42:32.380 00:42:32.380 00:42:32.380 real 0m0.871s 00:42:32.380 user 0m0.518s 00:42:32.380 sys 0m0.285s 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_file_to_bdev -- common/autotest_common.sh@10 -- # set +x 00:42:32.380 ************************************ 00:42:32.380 END TEST dd_sparse_file_to_bdev 00:42:32.380 ************************************ 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@122 -- # run_test dd_sparse_bdev_to_file bdev_to_file 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@10 -- # set +x 00:42:32.380 ************************************ 00:42:32.380 START 
TEST dd_sparse_bdev_to_file 00:42:32.380 ************************************ 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- common/autotest_common.sh@1124 -- # bdev_to_file 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@81 -- # local stat2_s stat2_b 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@82 -- # local stat3_s stat3_b 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@84 -- # method_bdev_aio_create_0=(['filename']='dd_sparse_aio_disk' ['name']='dd_aio' ['block_size']='4096') 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@84 -- # local -A method_bdev_aio_create_0 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@91 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=dd_lvstore/dd_lvol --of=file_zero3 --bs=12582912 --sparse --json /dev/fd/62 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@91 -- # gen_conf 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/common.sh@31 -- # xtrace_disable 00:42:32.380 12:44:55 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- common/autotest_common.sh@10 -- # set +x 00:42:32.380 [2024-06-07 12:44:55.932679] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:32.380 [2024-06-07 12:44:55.933306] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid234868 ] 00:42:32.380 { 00:42:32.380 "subsystems": [ 00:42:32.380 { 00:42:32.380 "subsystem": "bdev", 00:42:32.380 "config": [ 00:42:32.380 { 00:42:32.380 "params": { 00:42:32.380 "block_size": 4096, 00:42:32.380 "filename": "dd_sparse_aio_disk", 00:42:32.380 "name": "dd_aio" 00:42:32.380 }, 00:42:32.380 "method": "bdev_aio_create" 00:42:32.380 }, 00:42:32.380 { 00:42:32.380 "method": "bdev_wait_for_examine" 00:42:32.380 } 00:42:32.380 ] 00:42:32.380 } 00:42:32.380 ] 00:42:32.380 } 00:42:32.639 [2024-06-07 12:44:56.085602] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:32.639 [2024-06-07 12:44:56.200680] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:33.154  Copying: 12/36 [MB] (average 1090 MBps) 00:42:33.154 00:42:33.154 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@97 -- # stat --printf=%s file_zero2 00:42:33.154 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@97 -- # stat2_s=37748736 00:42:33.155 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@98 -- # stat --printf=%s file_zero3 00:42:33.155 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@98 -- # stat3_s=37748736 00:42:33.155 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@100 -- # [[ 37748736 == \3\7\7\4\8\7\3\6 ]] 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@102 -- # stat --printf=%b file_zero2 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@102 -- # stat2_b=24576 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@103 -- # stat --printf=%b file_zero3 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@103 -- # stat3_b=24576 00:42:33.413 12:44:56 
spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- dd/sparse.sh@105 -- # [[ 24576 == \2\4\5\7\6 ]] 00:42:33.413 00:42:33.413 real 0m0.927s 00:42:33.413 user 0m0.501s 00:42:33.413 sys 0m0.336s 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse.dd_sparse_bdev_to_file -- common/autotest_common.sh@10 -- # set +x 00:42:33.413 ************************************ 00:42:33.413 END TEST dd_sparse_bdev_to_file 00:42:33.413 ************************************ 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@1 -- # cleanup 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@11 -- # rm dd_sparse_aio_disk 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@12 -- # rm file_zero1 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@13 -- # rm file_zero2 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse -- dd/sparse.sh@14 -- # rm file_zero3 00:42:33.413 00:42:33.413 real 0m3.105s 00:42:33.413 user 0m1.639s 00:42:33.413 sys 0m1.193s 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:33.413 12:44:56 spdk_dd.spdk_dd_sparse -- common/autotest_common.sh@10 -- # set +x 00:42:33.413 ************************************ 00:42:33.413 END TEST spdk_dd_sparse 00:42:33.413 ************************************ 00:42:33.413 12:44:56 spdk_dd -- dd/dd.sh@28 -- # run_test spdk_dd_negative /home/vagrant/spdk_repo/spdk/test/dd/negative_dd.sh 00:42:33.413 12:44:56 spdk_dd -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:33.413 12:44:56 spdk_dd -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:33.413 12:44:56 spdk_dd -- common/autotest_common.sh@10 -- # set +x 00:42:33.413 ************************************ 00:42:33.413 START TEST spdk_dd_negative 00:42:33.413 ************************************ 00:42:33.413 12:44:56 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1124 -- # /home/vagrant/spdk_repo/spdk/test/dd/negative_dd.sh 00:42:33.413 * Looking for test storage... 
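The pass criteria for all three sparse tests are the paired stat assertions seen above: apparent size (%s) and allocated 512-byte block count (%b) must both match between source and destination, proving the copy reproduced the data without filling in the holes (37748736 B = 36 MiB apparent versus 24576 x 512 B = 12 MiB allocated). Reduced to its core, with a hypothetical helper name:

    same() { [[ $(stat --printf="$1" "$2") == $(stat --printf="$1" "$3") ]]; }
    same %s file_zero2 file_zero3 || exit 1    # apparent sizes equal
    same %b file_zero2 file_zero3 || exit 1    # allocated blocks equal -> holes preserved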
00:42:33.413 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dd 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- paths/export.sh@2 -- # PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- paths/export.sh@3 -- # PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- paths/export.sh@5 -- # export PATH 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@101 -- # test_file0=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@102 -- # test_file1=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@104 -- # touch /home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:42:33.413 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@105 -- # touch /home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@107 -- # run_test dd_invalid_arguments invalid_arguments 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 
']' 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:33.671 ************************************ 00:42:33.671 START TEST dd_invalid_arguments 00:42:33.671 ************************************ 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@1124 -- # invalid_arguments 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- dd/negative_dd.sh@12 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ii= --ob= 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@649 -- # local es=0 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ii= --ob= 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.671 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ii= --ob= 00:42:33.672 /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd: unrecognized option '--ii=' 00:42:33.672 /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd [options] 00:42:33.672 00:42:33.672 CPU options: 00:42:33.672 -m, --cpumask core mask (like 0xF) or core list of '[]' embraced for DPDK 00:42:33.672 (like [0,1,10]) 00:42:33.672 --lcores lcore to CPU mapping list. The list is in the format: 00:42:33.672 [<,lcores[@CPUs]>...] 00:42:33.672 lcores and cpus list are grouped by '(' and ')', e.g '--lcores "(5-7)@(10-12)"' 00:42:33.672 Within the group, '-' is used for range separator, 00:42:33.672 ',' is used for single number separator. 00:42:33.672 '( )' can be omitted for single element group, 00:42:33.672 '@' can be omitted if cpus and lcores have the same value 00:42:33.672 --disable-cpumask-locks Disable CPU core lock files. 
00:42:33.672 --interrupt-mode set app to interrupt mode (Warning: CPU usage will be reduced only if all 00:42:33.672 pollers in the app support interrupt mode) 00:42:33.672 -p, --main-core main (primary) core for DPDK 00:42:33.672 00:42:33.672 Configuration options: 00:42:33.672 -c, --config, --json JSON config file 00:42:33.672 -r, --rpc-socket RPC listen address (default /var/tmp/spdk.sock) 00:42:33.672 --no-rpc-server skip RPC server initialization. This option ignores '--rpc-socket' value. 00:42:33.672 --wait-for-rpc wait for RPCs to initialize subsystems 00:42:33.672 --rpcs-allowed comma-separated list of permitted RPCS 00:42:33.672 --json-ignore-init-errors don't exit on invalid config entry 00:42:33.672 00:42:33.672 Memory options: 00:42:33.672 --iova-mode set IOVA mode ('pa' for IOVA_PA and 'va' for IOVA_VA) 00:42:33.672 --base-virtaddr the base virtual address for DPDK (default: 0x200000000000) 00:42:33.672 --huge-dir use a specific hugetlbfs mount to reserve memory from 00:42:33.672 -R, --huge-unlink unlink huge files after initialization 00:42:33.672 -n, --mem-channels number of memory channels used for DPDK 00:42:33.672 -s, --mem-size memory size in MB for DPDK (default: 0MB) 00:42:33.672 --msg-mempool-size global message memory pool size in count (default: 262143) 00:42:33.672 --no-huge run without using hugepages 00:42:33.672 -i, --shm-id shared memory ID (optional) 00:42:33.672 -g, --single-file-segments force creating just one hugetlbfs file 00:42:33.672 00:42:33.672 PCI options: 00:42:33.672 -A, --pci-allowed pci addr to allow (-B and -A cannot be used at the same time) 00:42:33.672 -B, --pci-blocked pci addr to block (can be used more than once) 00:42:33.672 -u, --no-pci disable PCI access 00:42:33.672 --vfio-vf-token VF token (UUID) shared between SR-IOV PF and VFs for vfio_pci driver 00:42:33.672 00:42:33.672 Log options: 00:42:33.672 -L, --logflag enable log flag (all, accel, accel_dsa, accel_iaa, accel_ioat, aio, 00:42:33.672 app_config, app_rpc, bdev, bdev_concat, bdev_ftl, bdev_malloc, 00:42:33.672 bdev_null, bdev_nvme, bdev_raid, bdev_raid0, bdev_raid1, bdev_raid_sb, 00:42:33.672 blob, blob_esnap, blob_rw, blobfs, blobfs_bdev, blobfs_bdev_rpc, 00:42:33.672 blobfs_rw, ftl_core, ftl_init, gpt_parse, idxd, ioat, iscsi_init, 00:42:33.672 json_util, keyring, log_rpc, lvol, lvol_rpc, notify_rpc, nvme, 00:42:33.672 nvme_auth, nvme_cuse, opal, reactor, rpc, rpc_client, sock, sock_posix, 00:42:33.672 thread, trace, vbdev_delay, vbdev_gpt, vbdev_lvol, vbdev_opal, 00:42:33.672 vbdev_passthru, vbdev_split, vbdev_zone_block, vfio_pci, vfio_user, 00:42:33.672 virtio, virtio_blk, virtio_dev, virtio_pci, virtio_user, 00:42:33.672 virtio_vfio_user, vmd) 00:42:33.672 --silence-noticelog disable notice level logging to stderr 00:42:33.672 00:42:33.672 Trace options: 00:42:33.672 --num-trace-entries number of trace entries for each core, must be power of 2, 00:42:33.672 setting 0 to disable trace (default 32768) 00:42:33.672 Tracepoints vary in size and can use more than one trace entry. 00:42:33.672 -e, --tpoint-group [:] 00:42:33.672 group_name - tracepoint group name for spdk trace buffers (bdev, ftl, 00:42:33.672 blob[2024-06-07 12:44:57.114353] spdk_dd.c:1480:main: *ERROR*: Invalid arguments 00:42:33.672 fs, dsa, thread, nvme_pcie, iaa, nvme_tcp, bdev_nvme, sock, all). 00:42:33.672 tpoint_mask - tracepoint mask for enabling individual tpoints inside 00:42:33.672 a tracepoint group. First tpoint inside a group can be enabled by 00:42:33.672 setting tpoint_mask to 1 (e.g. 
bdev:0x1). Groups and masks can be 00:42:33.672 combined (e.g. thread,bdev:0x1). All available tpoints can be found 00:42:33.672 in /include/spdk_internal/trace_defs.h 00:42:33.672 00:42:33.672 Other options: 00:42:33.672 -h, --help show this usage 00:42:33.672 -v, --version print SPDK version 00:42:33.672 -d, --limit-coredump do not set max coredump size to RLIM_INFINITY 00:42:33.672 --env-context Opaque context for use of the env implementation 00:42:33.672 00:42:33.672 Application specific: 00:42:33.672 [--------- DD Options ---------] 00:42:33.672 --if Input file. Must specify either --if or --ib. 00:42:33.672 --ib Input bdev. Must specifier either --if or --ib 00:42:33.672 --of Output file. Must specify either --of or --ob. 00:42:33.672 --ob Output bdev. Must specify either --of or --ob. 00:42:33.672 --iflag Input file flags. 00:42:33.672 --oflag Output file flags. 00:42:33.672 --bs I/O unit size (default: 4096) 00:42:33.672 --qd Queue depth (default: 2) 00:42:33.672 --count I/O unit count. The number of I/O units to copy. (default: all) 00:42:33.672 --skip Skip this many I/O units at start of input. (default: 0) 00:42:33.672 --seek Skip this many I/O units at start of output. (default: 0) 00:42:33.672 --aio Force usage of AIO. (by default io_uring is used if available) 00:42:33.672 --sparse Enable hole skipping in input target 00:42:33.672 Available iflag and oflag values: 00:42:33.672 append - append mode 00:42:33.672 direct - use direct I/O for data 00:42:33.672 directory - fail unless a directory 00:42:33.672 dsync - use synchronized I/O for data 00:42:33.672 noatime - do not update access time 00:42:33.672 noctty - do not assign controlling terminal from file 00:42:33.672 nofollow - do not follow symlinks 00:42:33.672 nonblock - use non-blocking I/O 00:42:33.672 sync - use synchronized I/O for data and metadata 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@652 -- # es=2 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:33.672 00:42:33.672 real 0m0.080s 00:42:33.672 user 0m0.041s 00:42:33.672 sys 0m0.037s 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_invalid_arguments -- common/autotest_common.sh@10 -- # set +x 00:42:33.672 ************************************ 00:42:33.672 END TEST dd_invalid_arguments 00:42:33.672 ************************************ 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@108 -- # run_test dd_double_input double_input 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:33.672 ************************************ 00:42:33.672 START TEST dd_double_input 00:42:33.672 ************************************ 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@1124 -- # double_input 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- dd/negative_dd.sh@19 -- # NOT 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ib= --ob= 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@649 -- # local es=0 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ib= --ob= 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.672 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --ib= --ob= 00:42:33.673 [2024-06-07 12:44:57.247467] spdk_dd.c:1487:main: *ERROR*: You may specify either --if or --ib, but not both. 
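Every case in this negative suite leans on the same wrapper: run spdk_dd with deliberately invalid flags, require a non-zero exit, and convert that expected failure into a test pass (the es= values captured in the trace are the recorded exit statuses). A stripped-down sketch of the idiom follows; the real NOT helper lives in autotest_common.sh and is more elaborate, so this is only an illustration:

    NOT() {
      if "$@"; then
        return 1        # command unexpectedly succeeded -> test fails
      fi
      return 0          # command failed as required -> test passes
    }
    NOT spdk_dd --if=dd.dump0 --ib=somebdev --ob=somebdev2   # must be rejected: --if and --ib together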
00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@652 -- # es=22 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:33.673 00:42:33.673 real 0m0.070s 00:42:33.673 user 0m0.038s 00:42:33.673 sys 0m0.032s 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:33.673 ************************************ 00:42:33.673 END TEST dd_double_input 00:42:33.673 ************************************ 00:42:33.673 12:44:57 spdk_dd.spdk_dd_negative.dd_double_input -- common/autotest_common.sh@10 -- # set +x 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@109 -- # run_test dd_double_output double_output 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:33.932 ************************************ 00:42:33.932 START TEST dd_double_output 00:42:33.932 ************************************ 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@1124 -- # double_output 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- dd/negative_dd.sh@27 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --ob= 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@649 -- # local es=0 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --ob= 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.932 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 
--if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --ob= 00:42:33.933 [2024-06-07 12:44:57.380782] spdk_dd.c:1493:main: *ERROR*: You may specify either --of or --ob, but not both. 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@652 -- # es=22 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:33.933 00:42:33.933 real 0m0.071s 00:42:33.933 user 0m0.031s 00:42:33.933 sys 0m0.040s 00:42:33.933 ************************************ 00:42:33.933 END TEST dd_double_output 00:42:33.933 ************************************ 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_double_output -- common/autotest_common.sh@10 -- # set +x 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@110 -- # run_test dd_no_input no_input 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:33.933 ************************************ 00:42:33.933 START TEST dd_no_input 00:42:33.933 ************************************ 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@1124 -- # no_input 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- dd/negative_dd.sh@35 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ob= 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@649 -- # local es=0 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ob= 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ob= 00:42:33.933 [2024-06-07 12:44:57.526995] spdk_dd.c:1499:main: 
*ERROR*: You must specify either --if or --ib 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@652 -- # es=22 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:33.933 00:42:33.933 real 0m0.078s 00:42:33.933 user 0m0.040s 00:42:33.933 sys 0m0.035s 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:33.933 12:44:57 spdk_dd.spdk_dd_negative.dd_no_input -- common/autotest_common.sh@10 -- # set +x 00:42:33.933 ************************************ 00:42:33.933 END TEST dd_no_input 00:42:33.933 ************************************ 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@111 -- # run_test dd_no_output no_output 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:34.197 ************************************ 00:42:34.197 START TEST dd_no_output 00:42:34.197 ************************************ 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@1124 -- # no_output 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- dd/negative_dd.sh@41 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@649 -- # local es=0 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 00:42:34.197 [2024-06-07 12:44:57.675202] spdk_dd.c:1505:main: *ERROR*: You must specify either --of or --ob 00:42:34.197 12:44:57 
spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@652 -- # es=22 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:34.197 00:42:34.197 real 0m0.074s 00:42:34.197 user 0m0.035s 00:42:34.197 sys 0m0.038s 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_no_output -- common/autotest_common.sh@10 -- # set +x 00:42:34.197 ************************************ 00:42:34.197 END TEST dd_no_output 00:42:34.197 ************************************ 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@112 -- # run_test dd_wrong_blocksize wrong_blocksize 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:34.197 ************************************ 00:42:34.197 START TEST dd_wrong_blocksize 00:42:34.197 ************************************ 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@1124 -- # wrong_blocksize 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- dd/negative_dd.sh@47 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=0 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@649 -- # local es=0 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=0 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.197 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 
--if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=0 00:42:34.198 [2024-06-07 12:44:57.808111] spdk_dd.c:1511:main: *ERROR*: Invalid --bs value 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@652 -- # es=22 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:34.198 00:42:34.198 real 0m0.062s 00:42:34.198 user 0m0.033s 00:42:34.198 sys 0m0.029s 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:34.198 12:44:57 spdk_dd.spdk_dd_negative.dd_wrong_blocksize -- common/autotest_common.sh@10 -- # set +x 00:42:34.198 ************************************ 00:42:34.198 END TEST dd_wrong_blocksize 00:42:34.198 ************************************ 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@113 -- # run_test dd_smaller_blocksize smaller_blocksize 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:34.457 ************************************ 00:42:34.457 START TEST dd_smaller_blocksize 00:42:34.457 ************************************ 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@1124 -- # smaller_blocksize 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- dd/negative_dd.sh@55 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=99999999999999 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@649 -- # local es=0 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=99999999999999 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:34.457 
12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:34.457 12:44:57 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --bs=99999999999999 00:42:34.457 [2024-06-07 12:44:57.953054] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:34.457 [2024-06-07 12:44:57.953326] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid235122 ] 00:42:34.457 [2024-06-07 12:44:58.096184] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:34.715 [2024-06-07 12:44:58.204389] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:34.973 EAL: eal_memalloc_alloc_seg_bulk(): couldn't find suitable memseg_list 00:42:34.973 [2024-06-07 12:44:58.438208] spdk_dd.c:1184:dd_run: *ERROR*: Cannot allocate memory - try smaller block size value 00:42:34.973 [2024-06-07 12:44:58.438380] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:35.232 [2024-06-07 12:44:58.633088] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@652 -- # es=244 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@661 -- # es=116 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@662 -- # case "$es" in 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@669 -- # es=1 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:35.232 00:42:35.232 real 0m0.886s 00:42:35.232 user 0m0.444s 00:42:35.232 sys 0m0.335s 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_smaller_blocksize -- common/autotest_common.sh@10 -- # set +x 00:42:35.232 ************************************ 00:42:35.232 END TEST dd_smaller_blocksize 00:42:35.232 ************************************ 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@114 -- # run_test dd_invalid_count invalid_count 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:35.232 ************************************ 00:42:35.232 START TEST dd_invalid_count 00:42:35.232 ************************************ 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@1124 -- # invalid_count 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- dd/negative_dd.sh@63 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 
--count=-9 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@649 -- # local es=0 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --count=-9 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:35.232 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --count=-9 00:42:35.491 [2024-06-07 12:44:58.900404] spdk_dd.c:1517:main: *ERROR*: Invalid --count value 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@652 -- # es=22 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:35.491 00:42:35.491 real 0m0.072s 00:42:35.491 user 0m0.043s 00:42:35.491 sys 0m0.029s 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative.dd_invalid_count -- common/autotest_common.sh@10 -- # set +x 00:42:35.491 ************************************ 00:42:35.491 END TEST dd_invalid_count 00:42:35.491 ************************************ 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@115 -- # run_test dd_invalid_oflag invalid_oflag 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:35.491 12:44:58 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:35.491 ************************************ 00:42:35.491 START TEST dd_invalid_oflag 00:42:35.491 ************************************ 00:42:35.491 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@1124 -- # invalid_oflag 00:42:35.491 12:44:59 
spdk_dd.spdk_dd_negative.dd_invalid_oflag -- dd/negative_dd.sh@71 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib= --ob= --oflag=0 00:42:35.491 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@649 -- # local es=0 00:42:35.491 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib= --ob= --oflag=0 00:42:35.491 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib= --ob= --oflag=0 00:42:35.492 [2024-06-07 12:44:59.048066] spdk_dd.c:1523:main: *ERROR*: --oflags may be used only with --of 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@652 -- # es=22 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:35.492 00:42:35.492 real 0m0.078s 00:42:35.492 user 0m0.035s 00:42:35.492 sys 0m0.043s 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_oflag -- common/autotest_common.sh@10 -- # set +x 00:42:35.492 ************************************ 00:42:35.492 END TEST dd_invalid_oflag 00:42:35.492 ************************************ 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@116 -- # run_test dd_invalid_iflag invalid_iflag 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:35.492 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:35.750 ************************************ 00:42:35.750 START TEST dd_invalid_iflag 00:42:35.750 ************************************ 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@1124 -- # invalid_iflag 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- 
dd/negative_dd.sh@79 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib= --ob= --iflag=0 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@649 -- # local es=0 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib= --ob= --iflag=0 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib= --ob= --iflag=0 00:42:35.750 [2024-06-07 12:44:59.188938] spdk_dd.c:1529:main: *ERROR*: --iflags may be used only with --if 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@652 -- # es=22 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:35.750 00:42:35.750 real 0m0.076s 00:42:35.750 user 0m0.038s 00:42:35.750 sys 0m0.037s 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_iflag -- common/autotest_common.sh@10 -- # set +x 00:42:35.750 ************************************ 00:42:35.750 END TEST dd_invalid_iflag 00:42:35.750 ************************************ 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@117 -- # run_test dd_unknown_flag unknown_flag 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:35.750 ************************************ 00:42:35.750 START TEST dd_unknown_flag 00:42:35.750 ************************************ 00:42:35.750 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@1124 -- # unknown_flag 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- dd/negative_dd.sh@87 -- # NOT 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=-1 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@649 -- # local es=0 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=-1 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:35.751 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --oflag=-1 00:42:35.751 [2024-06-07 12:44:59.346367] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 
00:42:35.751 [2024-06-07 12:44:59.346618] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid235231 ] 00:42:36.010 [2024-06-07 12:44:59.494039] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:36.010 [2024-06-07 12:44:59.554201] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:36.010 [2024-06-07 12:44:59.625139] spdk_dd.c: 986:parse_flags: *ERROR*: Unknown file flag: -1 00:42:36.010 [2024-06-07 12:44:59.625266] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:36.010  Copying: 0/0 [B] (average 0 Bps)[2024-06-07 12:44:59.625476] app.c:1040:app_stop: *NOTICE*: spdk_app_stop called twice 00:42:36.268 [2024-06-07 12:44:59.744079] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:42:36.268 00:42:36.268 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@652 -- # es=234 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@661 -- # es=106 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@662 -- # case "$es" in 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@669 -- # es=1 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:36.268 00:42:36.268 real 0m0.575s 00:42:36.268 user 0m0.273s 00:42:36.268 sys 0m0.183s 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:36.268 12:44:59 spdk_dd.spdk_dd_negative.dd_unknown_flag -- common/autotest_common.sh@10 -- # set +x 00:42:36.268 ************************************ 00:42:36.268 END TEST dd_unknown_flag 00:42:36.268 ************************************ 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative -- dd/negative_dd.sh@118 -- # run_test dd_invalid_json invalid_json 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1106 -- # xtrace_disable 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:36.526 ************************************ 00:42:36.526 START TEST dd_invalid_json 00:42:36.526 ************************************ 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@1124 -- # invalid_json 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- dd/negative_dd.sh@95 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --json /dev/fd/62 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@649 -- # local es=0 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- dd/negative_dd.sh@95 -- # : 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@651 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --json /dev/fd/62 00:42:36.526 12:44:59 
spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@637 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@641 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@643 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@643 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@643 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd ]] 00:42:36.526 12:44:59 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@652 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump0 --of=/home/vagrant/spdk_repo/spdk/test/dd/dd.dump1 --json /dev/fd/62 00:42:36.526 [2024-06-07 12:44:59.988350] Starting SPDK v24.09-pre git sha1 e55c9a812 / DPDK 22.11.4 initialization... 00:42:36.526 [2024-06-07 12:44:59.989281] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid235273 ] 00:42:36.526 [2024-06-07 12:45:00.134052] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:36.803 [2024-06-07 12:45:00.190702] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:42:36.803 [2024-06-07 12:45:00.190799] json_config.c: 535:parse_json: *ERROR*: JSON data cannot be empty 00:42:36.803 [2024-06-07 12:45:00.190843] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:42:36.803 [2024-06-07 12:45:00.190865] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:36.803 [2024-06-07 12:45:00.190951] spdk_dd.c:1536:main: *ERROR*: Error occurred while performing copy 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@652 -- # es=234 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@661 -- # es=106 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@662 -- # case "$es" in 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@669 -- # es=1 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:42:36.803 00:42:36.803 real 0m0.366s 00:42:36.803 user 0m0.161s 00:42:36.803 sys 0m0.104s 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative.dd_invalid_json -- common/autotest_common.sh@10 -- # set +x 00:42:36.803 ************************************ 00:42:36.803 END TEST dd_invalid_json 
00:42:36.803 ************************************ 00:42:36.803 00:42:36.803 real 0m3.410s 00:42:36.803 user 0m1.502s 00:42:36.803 sys 0m1.564s 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:36.803 12:45:00 spdk_dd.spdk_dd_negative -- common/autotest_common.sh@10 -- # set +x 00:42:36.803 ************************************ 00:42:36.803 END TEST spdk_dd_negative 00:42:36.803 ************************************ 00:42:36.803 00:42:36.803 real 1m16.206s 00:42:36.803 user 0m43.124s 00:42:36.803 sys 0m23.761s 00:42:36.803 12:45:00 spdk_dd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:42:36.803 12:45:00 spdk_dd -- common/autotest_common.sh@10 -- # set +x 00:42:36.803 ************************************ 00:42:36.803 END TEST spdk_dd 00:42:36.803 ************************************ 00:42:37.093 12:45:00 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@260 -- # timing_exit lib 00:42:37.093 12:45:00 -- common/autotest_common.sh@729 -- # xtrace_disable 00:42:37.093 12:45:00 -- common/autotest_common.sh@10 -- # set +x 00:42:37.093 12:45:00 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@347 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@356 -- # '[' 0 -eq 1 ']' 00:42:37.093 12:45:00 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:42:37.093 12:45:00 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:42:37.093 12:45:00 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:42:37.093 12:45:00 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:42:37.093 12:45:00 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:42:37.093 12:45:00 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:42:37.093 12:45:00 -- common/autotest_common.sh@723 -- # xtrace_disable 00:42:37.093 12:45:00 -- common/autotest_common.sh@10 -- # set +x 00:42:37.093 12:45:00 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:42:37.093 12:45:00 -- common/autotest_common.sh@1391 -- # local autotest_es=0 00:42:37.093 12:45:00 -- common/autotest_common.sh@1392 -- # xtrace_disable 00:42:37.093 12:45:00 -- common/autotest_common.sh@10 -- # set +x 00:42:38.995 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:42:38.995 Waiting for block devices as requested 00:42:38.995 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:42:39.252 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda1,mount@vda:vda2,mount@vda:vda5, so not binding PCI dev 00:42:39.252 Cleaning 00:42:39.252 Removing: /var/run/dpdk/spdk0/config 00:42:39.252 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:42:39.252 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:42:39.252 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:42:39.252 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:42:39.252 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:42:39.252 Removing: /var/run/dpdk/spdk0/hugepage_info 00:42:39.252 Removing: /dev/shm/spdk_tgt_trace.pid190675 00:42:39.252 Removing: /var/run/dpdk/spdk0 00:42:39.252 Removing: /var/run/dpdk/spdk_pid190481 00:42:39.252 Removing: /var/run/dpdk/spdk_pid190675 00:42:39.252 Removing: /var/run/dpdk/spdk_pid190903 00:42:39.252 Removing: /var/run/dpdk/spdk_pid190996 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191036 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191158 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191181 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191320 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191569 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191739 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191821 00:42:39.252 Removing: /var/run/dpdk/spdk_pid191906 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192008 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192101 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192148 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192186 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192264 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192375 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192853 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192913 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192973 00:42:39.252 Removing: /var/run/dpdk/spdk_pid192985 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193073 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193094 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193180 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193201 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193258 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193281 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193326 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193349 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193495 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193533 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193581 00:42:39.252 Removing: /var/run/dpdk/spdk_pid193661 00:42:39.511 Removing: /var/run/dpdk/spdk_pid193738 00:42:39.511 Removing: /var/run/dpdk/spdk_pid193762 00:42:39.511 Removing: /var/run/dpdk/spdk_pid193845 00:42:39.511 Removing: /var/run/dpdk/spdk_pid193896 00:42:39.511 Removing: /var/run/dpdk/spdk_pid193937 00:42:39.511 Removing: /var/run/dpdk/spdk_pid193987 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194033 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194079 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194129 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194174 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194225 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194264 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194313 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194359 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194403 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194454 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194500 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194544 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194590 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194644 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194686 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194737 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194776 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194856 00:42:39.511 Removing: /var/run/dpdk/spdk_pid194948 00:42:39.511 Removing: /var/run/dpdk/spdk_pid195114 00:42:39.511 Removing: 
/var/run/dpdk/spdk_pid195158 00:42:39.511 Removing: /var/run/dpdk/spdk_pid195191 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196151 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196345 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196531 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196637 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196749 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196798 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196829 00:42:39.511 Removing: /var/run/dpdk/spdk_pid196858 00:42:39.511 Removing: /var/run/dpdk/spdk_pid197317 00:42:39.511 Removing: /var/run/dpdk/spdk_pid197390 00:42:39.511 Removing: /var/run/dpdk/spdk_pid197498 00:42:39.511 Removing: /var/run/dpdk/spdk_pid197535 00:42:39.511 Removing: /var/run/dpdk/spdk_pid198778 00:42:39.511 Removing: /var/run/dpdk/spdk_pid199137 00:42:39.511 Removing: /var/run/dpdk/spdk_pid199321 00:42:39.511 Removing: /var/run/dpdk/spdk_pid200254 00:42:39.511 Removing: /var/run/dpdk/spdk_pid200612 00:42:39.511 Removing: /var/run/dpdk/spdk_pid200788 00:42:39.511 Removing: /var/run/dpdk/spdk_pid201735 00:42:39.511 Removing: /var/run/dpdk/spdk_pid202275 00:42:39.511 Removing: /var/run/dpdk/spdk_pid202458 00:42:39.511 Removing: /var/run/dpdk/spdk_pid204676 00:42:39.511 Removing: /var/run/dpdk/spdk_pid205161 00:42:39.511 Removing: /var/run/dpdk/spdk_pid205353 00:42:39.511 Removing: /var/run/dpdk/spdk_pid207513 00:42:39.511 Removing: /var/run/dpdk/spdk_pid208007 00:42:39.511 Removing: /var/run/dpdk/spdk_pid208198 00:42:39.511 Removing: /var/run/dpdk/spdk_pid210371 00:42:39.511 Removing: /var/run/dpdk/spdk_pid211107 00:42:39.511 Removing: /var/run/dpdk/spdk_pid211306 00:42:39.511 Removing: /var/run/dpdk/spdk_pid213744 00:42:39.511 Removing: /var/run/dpdk/spdk_pid214280 00:42:39.511 Removing: /var/run/dpdk/spdk_pid214483 00:42:39.511 Removing: /var/run/dpdk/spdk_pid216879 00:42:39.511 Removing: /var/run/dpdk/spdk_pid217421 00:42:39.511 Removing: /var/run/dpdk/spdk_pid217615 00:42:39.511 Removing: /var/run/dpdk/spdk_pid220008 00:42:39.511 Removing: /var/run/dpdk/spdk_pid220846 00:42:39.511 Removing: /var/run/dpdk/spdk_pid221040 00:42:39.511 Removing: /var/run/dpdk/spdk_pid221234 00:42:39.511 Removing: /var/run/dpdk/spdk_pid221750 00:42:39.511 Removing: /var/run/dpdk/spdk_pid222667 00:42:39.511 Removing: /var/run/dpdk/spdk_pid223136 00:42:39.770 Removing: /var/run/dpdk/spdk_pid224016 00:42:39.770 Removing: /var/run/dpdk/spdk_pid224527 00:42:39.770 Removing: /var/run/dpdk/spdk_pid225446 00:42:39.770 Removing: /var/run/dpdk/spdk_pid225918 00:42:39.770 Removing: /var/run/dpdk/spdk_pid227178 00:42:39.770 Removing: /var/run/dpdk/spdk_pid227696 00:42:39.770 Removing: /var/run/dpdk/spdk_pid228934 00:42:39.770 Removing: /var/run/dpdk/spdk_pid229460 00:42:39.770 Removing: /var/run/dpdk/spdk_pid230727 00:42:39.770 Removing: /var/run/dpdk/spdk_pid231256 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232121 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232162 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232201 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232247 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232372 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232519 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232734 00:42:39.770 Removing: /var/run/dpdk/spdk_pid232995 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233010 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233049 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233073 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233094 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233121 00:42:39.770 Removing: 
/var/run/dpdk/spdk_pid233139 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233150 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233177 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233197 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233217 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233245 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233257 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233274 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233301 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233321 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233341 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233361 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233377 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233398 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233439 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233457 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233491 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233568 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233594 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233615 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233648 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233669 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233683 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233729 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233743 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233779 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233799 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233812 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233821 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233834 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233851 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233863 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233873 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233912 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233947 00:42:39.770 Removing: /var/run/dpdk/spdk_pid233968 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234001 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234013 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234029 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234082 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234097 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234132 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234143 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234158 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234175 00:42:39.770 Removing: /var/run/dpdk/spdk_pid234187 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234204 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234213 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234226 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234312 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234357 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234457 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234480 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234528 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234573 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234599 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234616 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234638 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234675 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234696 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234768 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234821 00:42:40.029 Removing: /var/run/dpdk/spdk_pid234868 00:42:40.029 Removing: /var/run/dpdk/spdk_pid235122 00:42:40.029 Removing: /var/run/dpdk/spdk_pid235231 00:42:40.029 Removing: /var/run/dpdk/spdk_pid235273 00:42:40.029 Clean 00:42:40.029 12:45:03 -- common/autotest_common.sh@1450 -- # return 0 00:42:40.029 12:45:03 -- spdk/autotest.sh@384 -- # 
timing_exit post_cleanup 00:42:40.029 12:45:03 -- common/autotest_common.sh@729 -- # xtrace_disable 00:42:40.029 12:45:03 -- common/autotest_common.sh@10 -- # set +x 00:42:40.029 12:45:03 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:42:40.029 12:45:03 -- common/autotest_common.sh@729 -- # xtrace_disable 00:42:40.029 12:45:03 -- common/autotest_common.sh@10 -- # set +x 00:42:40.029 12:45:03 -- spdk/autotest.sh@387 -- # chmod a+r /home/vagrant/spdk_repo/spdk/../output/timing.txt 00:42:40.029 12:45:03 -- spdk/autotest.sh@389 -- # [[ -f /home/vagrant/spdk_repo/spdk/../output/udev.log ]] 00:42:40.029 12:45:03 -- spdk/autotest.sh@389 -- # rm -f /home/vagrant/spdk_repo/spdk/../output/udev.log 00:42:40.287 12:45:03 -- spdk/autotest.sh@391 -- # hash lcov 00:42:40.287 12:45:03 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:42:40.287 12:45:03 -- spdk/autotest.sh@393 -- # hostname 00:42:40.287 12:45:03 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /home/vagrant/spdk_repo/spdk -t rocky9-cloud-1711172311-2200 -o /home/vagrant/spdk_repo/spdk/../output/cov_test.info 00:42:40.544 geninfo: WARNING: invalid characters removed from testname! 00:43:27.208 12:45:50 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /home/vagrant/spdk_repo/spdk/../output/cov_base.info -a /home/vagrant/spdk_repo/spdk/../output/cov_test.info -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info 00:43:32.473 12:45:55 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '*/dpdk/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info 00:43:35.762 12:45:58 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '/usr/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info 00:43:38.288 12:46:01 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '*/examples/vmd/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info 00:43:41.566 12:46:04 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info 00:43:44.095 12:46:07 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info 
'*/app/spdk_top/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info 00:43:47.407 12:46:10 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:43:47.407 12:46:10 -- common/autobuild_common.sh@15 -- $ source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:43:47.407 12:46:10 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:43:47.407 12:46:10 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:43:47.407 12:46:10 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:43:47.407 12:46:10 -- paths/export.sh@2 -- $ PATH=/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:43:47.407 12:46:10 -- paths/export.sh@3 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:43:47.407 12:46:10 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:43:47.407 12:46:10 -- paths/export.sh@5 -- $ export PATH 00:43:47.407 12:46:10 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/opt/protoc/21.7/bin:/opt/golangci/1.54.2/bin:/opt/go/1.21.1/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/sbin:/bin:/usr/sbin:/usr/bin 00:43:47.407 12:46:10 -- common/autobuild_common.sh@436 -- $ out=/home/vagrant/spdk_repo/spdk/../output 00:43:47.407 12:46:10 -- common/autobuild_common.sh@437 -- $ date +%s 00:43:47.407 12:46:10 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1717764370.XXXXXX 00:43:47.407 12:46:10 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1717764370.hbqMtB 00:43:47.407 12:46:10 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:43:47.407 12:46:10 -- common/autobuild_common.sh@443 -- $ '[' -n v22.11.4 ']' 00:43:47.407 12:46:10 -- common/autobuild_common.sh@444 -- $ dirname /home/vagrant/spdk_repo/dpdk/build 00:43:47.407 12:46:10 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude=' --exclude /home/vagrant/spdk_repo/dpdk' 00:43:47.407 12:46:10 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /home/vagrant/spdk_repo/spdk/xnvme --exclude /tmp' 00:43:47.407 12:46:10 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /home/vagrant/spdk_repo/spdk/../output/scan-build-tmp --exclude /home/vagrant/spdk_repo/dpdk --exclude /home/vagrant/spdk_repo/spdk/xnvme --exclude /tmp --status-bugs' 00:43:47.407 12:46:10 -- common/autobuild_common.sh@453 -- $ get_config_params 00:43:47.407 12:46:10 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:43:47.407 12:46:10 -- common/autotest_common.sh@10 -- $ set +x 00:43:47.407 12:46:10 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug 
--enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --enable-asan --enable-coverage --with-dpdk=/home/vagrant/spdk_repo/dpdk/build' 00:43:47.407 12:46:10 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:43:47.407 12:46:10 -- pm/common@17 -- $ local monitor 00:43:47.407 12:46:10 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:43:47.407 12:46:10 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:43:47.407 12:46:10 -- pm/common@21 -- $ date +%s 00:43:47.407 12:46:10 -- pm/common@25 -- $ sleep 1 00:43:47.407 12:46:10 -- pm/common@21 -- $ date +%s 00:43:47.407 12:46:10 -- pm/common@21 -- $ /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-cpu-load -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autopackage.sh.1717764370 00:43:47.407 12:46:10 -- pm/common@21 -- $ /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-vmstat -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autopackage.sh.1717764370 00:43:47.408 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autopackage.sh.1717764370_collect-vmstat.pm.log 00:43:47.408 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autopackage.sh.1717764370_collect-cpu-load.pm.log 00:43:47.974 12:46:11 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:43:47.974 12:46:11 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j10 00:43:47.974 12:46:11 -- spdk/autopackage.sh@11 -- $ cd /home/vagrant/spdk_repo/spdk 00:43:47.974 12:46:11 -- spdk/autopackage.sh@13 -- $ [[ 1 -eq 1 ]] 00:43:47.974 12:46:11 -- spdk/autopackage.sh@14 -- $ build_packaging 00:43:47.974 12:46:11 -- common/autobuild_common.sh@433 -- $ run_test packaging /home/vagrant/spdk_repo/spdk/test/packaging/packaging.sh 00:43:47.974 12:46:11 -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:43:47.974 12:46:11 -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:43:47.974 12:46:11 -- common/autotest_common.sh@10 -- $ set +x 00:43:47.974 ************************************ 00:43:47.974 START TEST packaging 00:43:47.974 ************************************ 00:43:47.974 12:46:11 packaging -- common/autotest_common.sh@1124 -- $ /home/vagrant/spdk_repo/spdk/test/packaging/packaging.sh 00:43:48.231 * Looking for test storage... 00:43:48.231 * Found test storage at /home/vagrant/spdk_repo/spdk/test/packaging 00:43:48.231 12:46:11 packaging -- packaging/packaging.sh@11 -- $ run_test rpm_packaging /home/vagrant/spdk_repo/spdk/test/packaging/rpm/rpm.sh 00:43:48.231 12:46:11 packaging -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:43:48.231 12:46:11 packaging -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:43:48.231 12:46:11 packaging -- common/autotest_common.sh@10 -- $ set +x 00:43:48.231 ************************************ 00:43:48.231 START TEST rpm_packaging 00:43:48.231 ************************************ 00:43:48.231 12:46:11 packaging.rpm_packaging -- common/autotest_common.sh@1124 -- $ /home/vagrant/spdk_repo/spdk/test/packaging/rpm/rpm.sh 00:43:48.231 * Looking for test storage... 
00:43:48.231 * Found test storage at /home/vagrant/spdk_repo/spdk/test/packaging/rpm 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@11 -- $ builddir=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@14 -- $ unset -v LD_LIBRARY_PATH 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@17 -- $ BUILDDIR=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@18 -- $ DEPS=no 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@19 -- $ uname -m 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@19 -- $ arch=x86_64 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@21 -- $ export MAKEFLAGS BUILDDIR DEPS 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@193 -- $ trap cleanup EXIT 00:43:48.231 12:46:11 packaging.rpm_packaging -- rpm/rpm.sh@195 -- $ run_test build_shared_rpm build_shared_rpm 00:43:48.231 12:46:11 packaging.rpm_packaging -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:43:48.231 12:46:11 packaging.rpm_packaging -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:43:48.231 12:46:11 packaging.rpm_packaging -- common/autotest_common.sh@10 -- $ set +x 00:43:48.232 ************************************ 00:43:48.232 START TEST build_shared_rpm 00:43:48.232 ************************************ 00:43:48.232 12:46:11 packaging.rpm_packaging.build_shared_rpm -- common/autotest_common.sh@1124 -- $ build_shared_rpm 00:43:48.232 12:46:11 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@130 -- $ build_rpm --with-shared 00:43:48.232 12:46:11 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@116 -- $ GEN_SPEC=yes 00:43:48.232 12:46:11 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@116 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm.sh --with-shared 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 Name: spdk 00:43:48.524 Version: v24.09 00:43:48.524 Release: 1 00:43:48.524 Summary: Storage Performance Development Kit 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 00:43:48.524 Requires: glibc 00:43:48.524 Requires: libaio 00:43:48.524 Requires: libgcc 00:43:48.524 Requires: libstdc++ 00:43:48.524 Requires: libuuid 00:43:48.524 Requires: ncurses-libs 00:43:48.524 Requires: numactl-libs 00:43:48.524 Requires: openssl-libs 00:43:48.525 Requires: zlib 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 BuildRequires: python3-devel 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 License: BSD 00:43:48.525 URL: https://spdk.io 00:43:48.525 Source: spdk-v24.09.tar.gz 00:43:48.525 00:43:48.525 %description 00:43:48.525 00:43:48.525 The Storage Performance Development Kit (SPDK) provides a set of tools and libraries for 00:43:48.525 writing high performance, scalable, user-mode storage applications. It achieves high 00:43:48.525 performance by moving all of the necessary drivers into userspace and operating in a 00:43:48.525 polled mode instead of relying on interrupts, which avoids kernel context switches and 00:43:48.525 eliminates interrupt handling overhead. 
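(The spec sections that follow, %prep, %build, %files, %post and %changelog, are the standard rpmbuild stages; rpmbuild runs each one as a shell script, which is why the log later shows lines like "Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.1UnOhm". A rough, illustrative map of what each stage does in this particular spec; the comments are editorial annotations, not spec contents:

%prep        # prepare sources; here just "make clean" plus %setup
%build       # the real work: configure --with-shared, make, then a DESTDIR install into the buildroot
%files       # manifest of the paths the resulting binary RPM owns
%post        # scriptlet executed on the target system after install; here it runs ldconfig
%changelog   # package history)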
00:43:48.525 00:43:48.525 %prep 00:43:48.525 make clean -j10 &>/dev/null || : 00:43:48.525 %setup 00:43:48.525 00:43:48.525 %build 00:43:48.525 set +x 00:43:48.525 00:43:48.525 cfs() { 00:43:48.525 (($# > 1)) || return 0 00:43:48.525 00:43:48.525 local dst=$1 f 00:43:48.525 00:43:48.525 mkdir -p "$dst" 00:43:48.525 shift; for f; do [[ -e $f ]] && cp -a "$f" "$dst"; done 00:43:48.525 } 00:43:48.525 00:43:48.525 cl() { 00:43:48.525 [[ -e $2 ]] || return 0 00:43:48.525 00:43:48.525 cfs "$1" $(find "$2" -name '*.so*' -type f -o -type l | grep -v .symbols) 00:43:48.525 } 00:43:48.525 00:43:48.525 00:43:48.525 # Rely mainly on CONFIG 00:43:48.525 git submodule update --init 00:43:48.525 ./configure --disable-unit-tests --disable-tests --with-shared 00:43:48.525 make -j10 00:43:48.525 make DESTDIR=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 install -j10 00:43:48.525 # DPDK always builds both static and shared, so we need to remove one or the other 00:43:48.525 # SPDK always builds static, so remove it if we want shared. 00:43:48.525 rm -f /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/lib*.a 00:43:48.525 # DPDK also installs some python scripts to bin that we do not want to package here 00:43:48.525 rm -f /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/dpdk-*.py 00:43:48.525 # DPDK examples do not need to be packaged in our RPMs 00:43:48.525 rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk 00:43:48.525 # In case sphinx-build is available, DPDK will leave some files we don't need 00:43:48.525 rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/doc/dpdk 00:43:48.525 00:43:48.525 # The ISA-L install may have installed some binaries that we do not want to package 00:43:48.525 rm -f /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/igzip 00:43:48.525 rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/man 00:43:48.525 00:43:48.525 # Include libvfio-user libs in case --with-vfio-user is in use together with --with-shared 00:43:48.525 00:43:48.525 # And some useful setup scripts SPDK uses 00:43:48.525 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk 00:43:48.525 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/bash_completion.d 00:43:48.525 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/profile.d 00:43:48.525 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/ld.so.conf.d 00:43:48.525 00:43:48.525 cat <<-EOF > /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/ld.so.conf.d/spdk.conf 00:43:48.525 /usr/local/lib 00:43:48.525 /usr/local/lib/dpdk 00:43:48.525 /usr/local/lib/libvfio-user 00:43:48.525 EOF 00:43:48.525 00:43:48.525 cat <<-'EOF' > /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/profile.d/spdk_path.sh 00:43:48.525 PATH=$PATH:/usr/libexec/spdk/scripts 00:43:48.525 PATH=$PATH:/usr/libexec/spdk/scripts/vagrant 00:43:48.525 PATH=$PATH:/usr/libexec/spdk/test/common/config 00:43:48.525 export PATH 00:43:48.525 EOF 
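(The two drop-ins written just above are what make the installed packages usable without manual setup: the ld.so.conf.d fragment teaches the dynamic loader about /usr/local/lib and its dpdk/libvfio-user subdirectories, and the profile.d script puts the packaged helper scripts on PATH for login shells. A hedged sketch of verifying both on a system where the RPMs are installed, using standard glibc and shell tooling that is not itself part of the spec:

ldconfig                        # %post runs this; it rebuilds /etc/ld.so.cache from /etc/ld.so.conf.d/*.conf
ldconfig -p | grep libspdk      # the shared objects under /usr/local/lib should now resolve
. /etc/profile.d/spdk_path.sh   # login shells source this automatically; sourcing it by hand works too
command -v setup.sh             # scripts copied under /usr/libexec/spdk/scripts are now on PATH)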
00:43:48.525 00:43:48.525 cfs /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk scripts 00:43:48.525 ln -s /usr/libexec/spdk/scripts/bash-completion/spdk /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/bash_completion.d/ 00:43:48.525 00:43:48.525 # We need to take into the account the fact that most of the scripts depend on being 00:43:48.525 # run directly from the repo. To workaround it, create common root space under dir 00:43:48.525 # like /usr/libexec/spdk and link all potential relative paths the script may try 00:43:48.525 # to reference. 00:43:48.525 00:43:48.525 # setup.sh uses pci_ids.h 00:43:48.525 ln -s /usr/local/include /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk 00:43:48.525 00:43:48.525 %files 00:43:48.525 /usr/local/bin/* 00:43:48.525 /usr/local/lib/python3.9/site-packages/spdk*/* 00:43:48.525 00:43:48.525 %package devel 00:43:48.525 00:43:48.525 Summary: SPDK development libraries and headers 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 %description devel 00:43:48.525 00:43:48.525 SPDK development libraries and header 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 00:43:48.525 %files devel 00:43:48.525 /usr/local/include/* 00:43:48.525 /usr/local/lib/pkgconfig/*.pc 00:43:48.525 /usr/local/lib/*.la 00:43:48.525 /usr/local/lib/*.so* 00:43:48.525 /etc/ld.so.conf.d/spdk.conf 00:43:48.525 /usr/local/lib/dpdk 00:43:48.525 00:43:48.525 %post devel 00:43:48.525 ldconfig 00:43:48.525 00:43:48.525 %package scripts 00:43:48.525 Summary: SPDK scripts and utilities 00:43:48.525 00:43:48.525 %description scripts 00:43:48.525 SPDK scripts and utilities 00:43:48.525 00:43:48.525 %files scripts 00:43:48.525 /usr/libexec/spdk/* 00:43:48.525 /etc/profile.d/* 00:43:48.525 /etc/bash_completion.d/* 00:43:48.525 00:43:48.525 %post scripts 00:43:48.525 ldconfig 00:43:48.525 00:43:48.525 %changelog 00:43:48.525 * Tue Feb 16 2021 Michal Berger 00:43:48.525 - Initial RPM .spec for the SPDK 00:43:48.525 12:46:12 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@118 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm.sh --with-shared 00:43:48.784 * Starting rpmbuild... 00:43:48.784 setting SOURCE_DATE_EPOCH=1613433600 00:43:48.784 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.1UnOhm 00:43:48.784 + umask 022 00:43:48.784 + cd /home/vagrant/spdk_repo/spdk 00:43:48.784 + make clean -j10 00:43:58.755 + RPM_EC=0 00:43:58.756 ++ jobs -p 00:43:58.756 + exit 0 00:43:58.756 Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.LskHZF 00:43:58.756 + umask 022 00:43:58.756 + cd /home/vagrant/spdk_repo/spdk 00:43:58.756 + set +x 00:43:58.756 Using default SPDK env in /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:43:58.756 Using default DPDK in /home/vagrant/spdk_repo/spdk/dpdk/build 00:44:14.274 Configuring ISA-L (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal.log)...done. 00:44:26.550 Configuring ISA-L-crypto (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal-crypto.log)...done. 00:44:26.550 Creating mk/config.mk...done. 00:44:26.550 Creating mk/cc.flags.mk...done. 00:44:26.550 Type 'make' to build. 00:44:26.550 make[1]: Nothing to be done for 'all'. 
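(The Meson output below comes from DPDK, not SPDK: the spec's %build stage ran "git submodule update --init" and a plain ./configure with no --with-dpdk, so this RPM test builds SPDK's bundled dpdk submodule from scratch, matching the earlier "Using default DPDK in /home/vagrant/spdk_repo/spdk/dpdk/build" line. Each "Compiler for C supports arguments -X: YES" record is a compile probe; a rough bash equivalent of what such a probe does, with a made-up helper name:

check_cc_flag() {
    # compile an empty program with the candidate flag; exit status 0 means the compiler accepts it
    echo 'int main(void){return 0;}' | cc -x c "$1" -o /dev/null - 2>/dev/null
}
check_cc_flag -mavx512f && echo "-mavx512f: YES" || echo "-mavx512f: NO")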
00:44:58.726 The Meson build system 00:44:58.726 Version: 1.4.0 00:44:58.726 Source dir: /home/vagrant/spdk_repo/spdk/dpdk 00:44:58.726 Build dir: /home/vagrant/spdk_repo/spdk/dpdk/build-tmp 00:44:58.726 Build type: native build 00:44:58.726 Program cat found: YES (/bin/cat) 00:44:58.726 Project name: DPDK 00:44:58.726 Project version: 24.03.0 00:44:58.726 C compiler for the host machine: cc (gcc 11.4.1 "cc (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)") 00:44:58.726 C linker for the host machine: cc ld.bfd 2.35.2-42 00:44:58.726 Host machine cpu family: x86_64 00:44:58.726 Host machine cpu: x86_64 00:44:58.726 Message: ## Building in Developer Mode ## 00:44:58.726 Program pkg-config found: YES (/bin/pkg-config) 00:44:58.726 Program check-symbols.sh found: YES (/home/vagrant/spdk_repo/spdk/dpdk/buildtools/check-symbols.sh) 00:44:58.726 Program options-ibverbs-static.sh found: YES (/home/vagrant/spdk_repo/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:44:58.726 Program python3 found: YES (/usr/bin/python3) 00:44:58.726 Program cat found: YES (/bin/cat) 00:44:58.726 Compiler for C supports arguments -march=native: YES 00:44:58.726 Checking for size of "void *" : 8 00:44:58.726 Checking for size of "void *" : 8 (cached) 00:44:58.726 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:44:58.726 Library m found: YES 00:44:58.726 Library numa found: YES 00:44:58.726 Has header "numaif.h" : YES 00:44:58.726 Library fdt found: NO 00:44:58.726 Library execinfo found: NO 00:44:58.726 Has header "execinfo.h" : YES 00:44:58.726 Found pkg-config: YES (/bin/pkg-config) 1.7.3 00:44:58.726 Run-time dependency libarchive found: NO (tried pkgconfig) 00:44:58.726 Run-time dependency libbsd found: NO (tried pkgconfig) 00:44:58.726 Run-time dependency jansson found: NO (tried pkgconfig) 00:44:58.726 Run-time dependency openssl found: YES 3.0.7 00:44:58.726 Run-time dependency libpcap found: NO (tried pkgconfig) 00:44:58.726 Library pcap found: NO 00:44:58.726 Compiler for C supports arguments -Wcast-qual: YES 00:44:58.726 Compiler for C supports arguments -Wdeprecated: YES 00:44:58.726 Compiler for C supports arguments -Wformat: YES 00:44:58.726 Compiler for C supports arguments -Wformat-nonliteral: YES 00:44:58.726 Compiler for C supports arguments -Wformat-security: YES 00:44:58.726 Compiler for C supports arguments -Wmissing-declarations: YES 00:44:58.726 Compiler for C supports arguments -Wmissing-prototypes: YES 00:44:58.726 Compiler for C supports arguments -Wnested-externs: YES 00:44:58.726 Compiler for C supports arguments -Wold-style-definition: YES 00:44:58.726 Compiler for C supports arguments -Wpointer-arith: YES 00:44:58.726 Compiler for C supports arguments -Wsign-compare: YES 00:44:58.726 Compiler for C supports arguments -Wstrict-prototypes: YES 00:44:58.726 Compiler for C supports arguments -Wundef: YES 00:44:58.726 Compiler for C supports arguments -Wwrite-strings: YES 00:44:58.726 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:44:58.726 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:44:58.726 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:44:58.726 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:44:58.726 Program objdump found: YES (/bin/objdump) 00:44:58.726 Compiler for C supports arguments -mavx512f: YES 00:44:58.726 Checking if "AVX512 checking" compiles: YES 00:44:58.726 Fetching value of define "__SSE4_2__" : 1 00:44:58.726 Fetching value of define "__AES__" : 1 00:44:58.726 
Fetching value of define "__AVX__" : 1 00:44:58.726 Fetching value of define "__AVX2__" : 1 00:44:58.726 Fetching value of define "__AVX512BW__" : 1 00:44:58.726 Fetching value of define "__AVX512CD__" : 1 00:44:58.726 Fetching value of define "__AVX512DQ__" : 1 00:44:58.726 Fetching value of define "__AVX512F__" : 1 00:44:58.726 Fetching value of define "__AVX512VL__" : 1 00:44:58.726 Fetching value of define "__PCLMUL__" : 1 00:44:58.726 Fetching value of define "__RDRND__" : 1 00:44:58.726 Fetching value of define "__RDSEED__" : 1 00:44:58.726 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:44:58.726 Fetching value of define "__znver1__" : (undefined) 00:44:58.726 Fetching value of define "__znver2__" : (undefined) 00:44:58.726 Fetching value of define "__znver3__" : (undefined) 00:44:58.726 Fetching value of define "__znver4__" : (undefined) 00:44:58.726 Compiler for C supports arguments -Wno-format-truncation: YES 00:44:58.726 Message: lib/log: Defining dependency "log" 00:44:58.726 Message: lib/kvargs: Defining dependency "kvargs" 00:44:58.726 Message: lib/telemetry: Defining dependency "telemetry" 00:44:58.726 Checking for function "getentropy" : NO 00:44:58.726 Message: lib/eal: Defining dependency "eal" 00:44:58.726 Message: lib/ring: Defining dependency "ring" 00:44:58.726 Message: lib/rcu: Defining dependency "rcu" 00:44:58.726 Message: lib/mempool: Defining dependency "mempool" 00:44:58.726 Message: lib/mbuf: Defining dependency "mbuf" 00:44:58.726 Fetching value of define "__PCLMUL__" : 1 (cached) 00:44:58.726 Fetching value of define "__AVX512F__" : 1 (cached) 00:44:58.726 Fetching value of define "__AVX512BW__" : 1 (cached) 00:44:58.726 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:44:58.726 Fetching value of define "__AVX512VL__" : 1 (cached) 00:44:58.726 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:44:58.726 Compiler for C supports arguments -mpclmul: YES 00:44:58.726 Compiler for C supports arguments -maes: YES 00:44:58.726 Compiler for C supports arguments -mavx512f: YES (cached) 00:44:58.726 Compiler for C supports arguments -mavx512bw: YES 00:44:58.726 Compiler for C supports arguments -mavx512dq: YES 00:44:58.726 Compiler for C supports arguments -mavx512vl: YES 00:44:58.726 Compiler for C supports arguments -mvpclmulqdq: YES 00:44:58.726 Compiler for C supports arguments -mavx2: YES 00:44:58.726 Compiler for C supports arguments -mavx: YES 00:44:58.726 Message: lib/net: Defining dependency "net" 00:44:58.726 Message: lib/meter: Defining dependency "meter" 00:44:58.726 Message: lib/ethdev: Defining dependency "ethdev" 00:44:58.726 Message: lib/pci: Defining dependency "pci" 00:44:58.726 Message: lib/cmdline: Defining dependency "cmdline" 00:44:58.726 Message: lib/hash: Defining dependency "hash" 00:44:58.726 Message: lib/timer: Defining dependency "timer" 00:44:58.727 Message: lib/compressdev: Defining dependency "compressdev" 00:44:58.727 Message: lib/cryptodev: Defining dependency "cryptodev" 00:44:58.727 Message: lib/dmadev: Defining dependency "dmadev" 00:44:58.727 Compiler for C supports arguments -Wno-cast-qual: YES 00:44:58.727 Message: lib/power: Defining dependency "power" 00:44:58.727 Message: lib/reorder: Defining dependency "reorder" 00:44:58.727 Message: lib/security: Defining dependency "security" 00:44:58.727 Has header "linux/userfaultfd.h" : YES 00:44:58.727 Has header "linux/vduse.h" : NO 00:44:58.727 Message: lib/vhost: Defining dependency "vhost" 00:44:58.727 Compiler for C supports arguments 
-Wno-format-truncation: YES (cached) 00:44:58.727 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:44:58.727 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:44:58.727 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:44:58.727 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:44:58.727 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:44:58.727 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:44:58.727 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:44:58.727 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:44:58.727 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:44:58.727 Program doxygen found: YES (/bin/doxygen) 00:44:58.727 Configuring doxy-api-html.conf using configuration 00:44:58.727 Configuring doxy-api-man.conf using configuration 00:44:58.727 Program mandb found: YES (/bin/mandb) 00:44:58.727 Program sphinx-build found: NO 00:44:58.727 Configuring rte_build_config.h using configuration 00:44:58.727 Message: 00:44:58.727 ================= 00:44:58.727 Applications Enabled 00:44:58.727 ================= 00:44:58.727 00:44:58.727 apps: 00:44:58.727 00:44:58.727 00:44:58.727 Message: 00:44:58.727 ================= 00:44:58.727 Libraries Enabled 00:44:58.727 ================= 00:44:58.727 00:44:58.727 libs: 00:44:58.727 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:44:58.727 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:44:58.727 cryptodev, dmadev, power, reorder, security, vhost, 00:44:58.727 00:44:58.727 Message: 00:44:58.727 =============== 00:44:58.727 Drivers Enabled 00:44:58.727 =============== 00:44:58.727 00:44:58.727 common: 00:44:58.727 00:44:58.727 bus: 00:44:58.727 pci, vdev, 00:44:58.727 mempool: 00:44:58.727 ring, 00:44:58.727 dma: 00:44:58.727 00:44:58.727 net: 00:44:58.727 00:44:58.727 crypto: 00:44:58.727 00:44:58.727 compress: 00:44:58.727 00:44:58.727 vdpa: 00:44:58.727 00:44:58.727 00:44:58.727 Message: 00:44:58.727 ================= 00:44:58.727 Content Skipped 00:44:58.727 ================= 00:44:58.727 00:44:58.727 apps: 00:44:58.727 dumpcap: explicitly disabled via build config 00:44:58.727 graph: explicitly disabled via build config 00:44:58.727 pdump: explicitly disabled via build config 00:44:58.727 proc-info: explicitly disabled via build config 00:44:58.727 test-acl: explicitly disabled via build config 00:44:58.727 test-bbdev: explicitly disabled via build config 00:44:58.727 test-cmdline: explicitly disabled via build config 00:44:58.727 test-compress-perf: explicitly disabled via build config 00:44:58.727 test-crypto-perf: explicitly disabled via build config 00:44:58.727 test-dma-perf: explicitly disabled via build config 00:44:58.727 test-eventdev: explicitly disabled via build config 00:44:58.727 test-fib: explicitly disabled via build config 00:44:58.727 test-flow-perf: explicitly disabled via build config 00:44:58.727 test-gpudev: explicitly disabled via build config 00:44:58.727 test-mldev: explicitly disabled via build config 00:44:58.727 test-pipeline: explicitly disabled via build config 00:44:58.727 test-pmd: explicitly disabled via build config 00:44:58.727 test-regex: explicitly disabled via build config 00:44:58.727 test-sad: explicitly disabled via build config 00:44:58.727 test-security-perf: explicitly disabled via build config 00:44:58.727 00:44:58.727 libs: 00:44:58.727 argparse: explicitly disabled via build 
config 00:44:58.727 metrics: explicitly disabled via build config 00:44:58.727 acl: explicitly disabled via build config 00:44:58.727 bbdev: explicitly disabled via build config 00:44:58.727 bitratestats: explicitly disabled via build config 00:44:58.727 bpf: explicitly disabled via build config 00:44:58.727 cfgfile: explicitly disabled via build config 00:44:58.727 distributor: explicitly disabled via build config 00:44:58.727 efd: explicitly disabled via build config 00:44:58.727 eventdev: explicitly disabled via build config 00:44:58.727 dispatcher: explicitly disabled via build config 00:44:58.727 gpudev: explicitly disabled via build config 00:44:58.727 gro: explicitly disabled via build config 00:44:58.727 gso: explicitly disabled via build config 00:44:58.727 ip_frag: explicitly disabled via build config 00:44:58.727 jobstats: explicitly disabled via build config 00:44:58.727 latencystats: explicitly disabled via build config 00:44:58.727 lpm: explicitly disabled via build config 00:44:58.727 member: explicitly disabled via build config 00:44:58.727 pcapng: explicitly disabled via build config 00:44:58.727 rawdev: explicitly disabled via build config 00:44:58.727 regexdev: explicitly disabled via build config 00:44:58.727 mldev: explicitly disabled via build config 00:44:58.727 rib: explicitly disabled via build config 00:44:58.727 sched: explicitly disabled via build config 00:44:58.727 stack: explicitly disabled via build config 00:44:58.727 ipsec: explicitly disabled via build config 00:44:58.727 pdcp: explicitly disabled via build config 00:44:58.727 fib: explicitly disabled via build config 00:44:58.727 port: explicitly disabled via build config 00:44:58.727 pdump: explicitly disabled via build config 00:44:58.727 table: explicitly disabled via build config 00:44:58.727 pipeline: explicitly disabled via build config 00:44:58.727 graph: explicitly disabled via build config 00:44:58.727 node: explicitly disabled via build config 00:44:58.727 00:44:58.727 drivers: 00:44:58.727 common/cpt: not in enabled drivers build config 00:44:58.727 common/dpaax: not in enabled drivers build config 00:44:58.727 common/iavf: not in enabled drivers build config 00:44:58.727 common/idpf: not in enabled drivers build config 00:44:58.727 common/ionic: not in enabled drivers build config 00:44:58.727 common/mvep: not in enabled drivers build config 00:44:58.727 common/octeontx: not in enabled drivers build config 00:44:58.727 bus/auxiliary: not in enabled drivers build config 00:44:58.727 bus/cdx: not in enabled drivers build config 00:44:58.727 bus/dpaa: not in enabled drivers build config 00:44:58.727 bus/fslmc: not in enabled drivers build config 00:44:58.727 bus/ifpga: not in enabled drivers build config 00:44:58.727 bus/platform: not in enabled drivers build config 00:44:58.727 bus/uacce: not in enabled drivers build config 00:44:58.727 bus/vmbus: not in enabled drivers build config 00:44:58.727 common/cnxk: not in enabled drivers build config 00:44:58.727 common/mlx5: not in enabled drivers build config 00:44:58.727 common/nfp: not in enabled drivers build config 00:44:58.727 common/nitrox: not in enabled drivers build config 00:44:58.727 common/qat: not in enabled drivers build config 00:44:58.727 common/sfc_efx: not in enabled drivers build config 00:44:58.727 mempool/bucket: not in enabled drivers build config 00:44:58.727 mempool/cnxk: not in enabled drivers build config 00:44:58.727 mempool/dpaa: not in enabled drivers build config 00:44:58.727 mempool/dpaa2: not in enabled drivers build 
config 00:44:58.727 mempool/octeontx: not in enabled drivers build config 00:44:58.727 mempool/stack: not in enabled drivers build config 00:44:58.727 dma/cnxk: not in enabled drivers build config 00:44:58.727 dma/dpaa: not in enabled drivers build config 00:44:58.727 dma/dpaa2: not in enabled drivers build config 00:44:58.727 dma/hisilicon: not in enabled drivers build config 00:44:58.727 dma/idxd: not in enabled drivers build config 00:44:58.727 dma/ioat: not in enabled drivers build config 00:44:58.727 dma/skeleton: not in enabled drivers build config 00:44:58.727 net/af_packet: not in enabled drivers build config 00:44:58.727 net/af_xdp: not in enabled drivers build config 00:44:58.727 net/ark: not in enabled drivers build config 00:44:58.727 net/atlantic: not in enabled drivers build config 00:44:58.727 net/avp: not in enabled drivers build config 00:44:58.727 net/axgbe: not in enabled drivers build config 00:44:58.727 net/bnx2x: not in enabled drivers build config 00:44:58.727 net/bnxt: not in enabled drivers build config 00:44:58.727 net/bonding: not in enabled drivers build config 00:44:58.727 net/cnxk: not in enabled drivers build config 00:44:58.727 net/cpfl: not in enabled drivers build config 00:44:58.727 net/cxgbe: not in enabled drivers build config 00:44:58.727 net/dpaa: not in enabled drivers build config 00:44:58.727 net/dpaa2: not in enabled drivers build config 00:44:58.727 net/e1000: not in enabled drivers build config 00:44:58.727 net/ena: not in enabled drivers build config 00:44:58.727 net/enetc: not in enabled drivers build config 00:44:58.727 net/enetfec: not in enabled drivers build config 00:44:58.727 net/enic: not in enabled drivers build config 00:44:58.727 net/failsafe: not in enabled drivers build config 00:44:58.727 net/fm10k: not in enabled drivers build config 00:44:58.727 net/gve: not in enabled drivers build config 00:44:58.727 net/hinic: not in enabled drivers build config 00:44:58.727 net/hns3: not in enabled drivers build config 00:44:58.727 net/i40e: not in enabled drivers build config 00:44:58.727 net/iavf: not in enabled drivers build config 00:44:58.727 net/ice: not in enabled drivers build config 00:44:58.727 net/idpf: not in enabled drivers build config 00:44:58.727 net/igc: not in enabled drivers build config 00:44:58.727 net/ionic: not in enabled drivers build config 00:44:58.727 net/ipn3ke: not in enabled drivers build config 00:44:58.727 net/ixgbe: not in enabled drivers build config 00:44:58.727 net/mana: not in enabled drivers build config 00:44:58.727 net/memif: not in enabled drivers build config 00:44:58.727 net/mlx4: not in enabled drivers build config 00:44:58.727 net/mlx5: not in enabled drivers build config 00:44:58.727 net/mvneta: not in enabled drivers build config 00:44:58.727 net/mvpp2: not in enabled drivers build config 00:44:58.727 net/netvsc: not in enabled drivers build config 00:44:58.728 net/nfb: not in enabled drivers build config 00:44:58.728 net/nfp: not in enabled drivers build config 00:44:58.728 net/ngbe: not in enabled drivers build config 00:44:58.728 net/null: not in enabled drivers build config 00:44:58.728 net/octeontx: not in enabled drivers build config 00:44:58.728 net/octeon_ep: not in enabled drivers build config 00:44:58.728 net/pcap: not in enabled drivers build config 00:44:58.728 net/pfe: not in enabled drivers build config 00:44:58.728 net/qede: not in enabled drivers build config 00:44:58.728 net/ring: not in enabled drivers build config 00:44:58.728 net/sfc: not in enabled drivers build config 
00:44:58.728 net/softnic: not in enabled drivers build config 00:44:58.728 net/tap: not in enabled drivers build config 00:44:58.728 net/thunderx: not in enabled drivers build config 00:44:58.728 net/txgbe: not in enabled drivers build config 00:44:58.728 net/vdev_netvsc: not in enabled drivers build config 00:44:58.728 net/vhost: not in enabled drivers build config 00:44:58.728 net/virtio: not in enabled drivers build config 00:44:58.728 net/vmxnet3: not in enabled drivers build config 00:44:58.728 raw/*: missing internal dependency, "rawdev" 00:44:58.728 crypto/armv8: not in enabled drivers build config 00:44:58.728 crypto/bcmfs: not in enabled drivers build config 00:44:58.728 crypto/caam_jr: not in enabled drivers build config 00:44:58.728 crypto/ccp: not in enabled drivers build config 00:44:58.728 crypto/cnxk: not in enabled drivers build config 00:44:58.728 crypto/dpaa_sec: not in enabled drivers build config 00:44:58.728 crypto/dpaa2_sec: not in enabled drivers build config 00:44:58.728 crypto/ipsec_mb: not in enabled drivers build config 00:44:58.728 crypto/mlx5: not in enabled drivers build config 00:44:58.728 crypto/mvsam: not in enabled drivers build config 00:44:58.728 crypto/nitrox: not in enabled drivers build config 00:44:58.728 crypto/null: not in enabled drivers build config 00:44:58.728 crypto/octeontx: not in enabled drivers build config 00:44:58.728 crypto/openssl: not in enabled drivers build config 00:44:58.728 crypto/scheduler: not in enabled drivers build config 00:44:58.728 crypto/uadk: not in enabled drivers build config 00:44:58.728 crypto/virtio: not in enabled drivers build config 00:44:58.728 compress/isal: not in enabled drivers build config 00:44:58.728 compress/mlx5: not in enabled drivers build config 00:44:58.728 compress/nitrox: not in enabled drivers build config 00:44:58.728 compress/octeontx: not in enabled drivers build config 00:44:58.728 compress/zlib: not in enabled drivers build config 00:44:58.728 regex/*: missing internal dependency, "regexdev" 00:44:58.728 ml/*: missing internal dependency, "mldev" 00:44:58.728 vdpa/ifc: not in enabled drivers build config 00:44:58.728 vdpa/mlx5: not in enabled drivers build config 00:44:58.728 vdpa/nfp: not in enabled drivers build config 00:44:58.728 vdpa/sfc: not in enabled drivers build config 00:44:58.728 event/*: missing internal dependency, "eventdev" 00:44:58.728 baseband/*: missing internal dependency, "bbdev" 00:44:58.728 gpu/*: missing internal dependency, "gpudev" 00:44:58.728 00:44:58.728 00:44:58.728 Build targets in project: 85 00:44:58.728 00:44:58.728 DPDK 24.03.0 00:44:58.728 00:44:58.728 User defined options 00:44:58.728 default_library : shared 00:44:58.728 libdir : lib 00:44:58.728 prefix : /home/vagrant/spdk_repo/spdk/dpdk/build 00:44:58.728 c_args : -Wno-stringop-overflow -fcommon -fPIC -Wno-error 00:44:58.728 c_link_args : 00:44:58.728 cpu_instruction_set: native 00:44:58.728 disable_apps : dumpcap,graph,pdump,proc-info,test-acl,test-bbdev,test-cmdline,test-compress-perf,test-crypto-perf,test-dma-perf,test-eventdev,test-fib,test-flow-perf,test-gpudev,test-mldev,test-pipeline,test-pmd,test-regex,test-sad,test-security-perf,test 00:44:58.728 disable_libs : acl,argparse,bbdev,bitratestats,bpf,cfgfile,dispatcher,distributor,efd,eventdev,fib,gpudev,graph,gro,gso,ip_frag,ipsec,jobstats,latencystats,lpm,member,metrics,mldev,node,pcapng,pdcp,pdump,pipeline,port,rawdev,regexdev,rib,sched,stack,table 00:44:58.728 enable_docs : false 00:44:58.728 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 
00:44:58.728 enable_kmods : false 00:44:58.728 tests : false 00:44:58.728 00:44:58.728 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:44:58.728 ninja: Entering directory `/home/vagrant/spdk_repo/spdk/dpdk/build-tmp' 00:44:58.728 [1/267] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:44:58.728 [2/267] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:44:58.728 [3/267] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:44:58.728 [4/267] Compiling C object lib/librte_log.a.p/log_log.c.o 00:44:58.728 [5/267] Linking static target lib/librte_kvargs.a 00:44:58.728 [6/267] Linking static target lib/librte_log.a 00:44:58.728 [7/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:44:58.728 [8/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:44:58.728 [9/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:44:58.728 [10/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:44:58.728 [11/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:44:58.728 [12/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:44:58.728 [13/267] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:44:58.728 [14/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:44:58.728 [15/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:44:58.728 [16/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:44:58.728 [17/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:44:58.728 [18/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:44:58.728 [19/267] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:44:58.728 [20/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:44:58.728 [21/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:44:58.728 [22/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:44:58.728 [23/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:44:58.728 [24/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:44:58.728 [25/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:44:58.728 [26/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:44:58.728 [27/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:44:58.728 [28/267] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:44:58.728 [29/267] Linking static target lib/librte_telemetry.a 00:44:58.987 [30/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:44:58.987 [31/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:44:58.987 [32/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:44:58.987 [33/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:44:59.244 [34/267] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:44:59.244 [35/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:44:59.244 [36/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:44:59.244 [37/267] Generating lib/log.sym_chk with a custom command (wrapped by meson to 
capture output) 00:44:59.244 [38/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:44:59.244 [39/267] Linking target lib/librte_log.so.24.1 00:44:59.244 [40/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:44:59.502 [41/267] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:44:59.502 [42/267] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:44:59.502 [43/267] Linking target lib/librte_kvargs.so.24.1 00:44:59.795 [44/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:44:59.795 [45/267] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:44:59.795 [46/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:44:59.795 [47/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:45:00.052 [48/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:45:00.052 [49/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:45:00.052 [50/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:45:00.310 [51/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:45:00.310 [52/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:45:00.310 [53/267] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:45:00.310 [54/267] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:45:00.310 [55/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:45:00.310 [56/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:45:00.310 [57/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:45:00.567 [58/267] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:45:00.567 [59/267] Linking target lib/librte_telemetry.so.24.1 00:45:00.567 [60/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:45:00.567 [61/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:45:00.567 [62/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:45:00.567 [63/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:45:00.567 [64/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:45:00.825 [65/267] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:45:00.825 [66/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:45:00.825 [67/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:45:00.825 [68/267] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:45:01.085 [69/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:45:01.085 [70/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:45:01.085 [71/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:45:01.085 [72/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:45:01.344 [73/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:45:01.344 [74/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:45:01.344 [75/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:45:01.344 [76/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:45:01.344 [77/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:45:01.602 [78/267] Compiling C 
object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:45:01.602 [79/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:45:01.602 [80/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:45:01.602 [81/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:45:01.860 [82/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:45:01.860 [83/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:45:01.860 [84/267] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:45:01.860 [85/267] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:45:01.860 [86/267] Linking static target lib/librte_ring.a 00:45:02.118 [87/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:45:02.118 [88/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:45:02.118 [89/267] Linking static target lib/librte_eal.a 00:45:02.375 [90/267] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:45:02.375 [91/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:45:02.375 [92/267] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:45:02.633 [93/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:45:02.633 [94/267] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:45:02.633 [95/267] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:45:02.633 [96/267] Linking static target lib/librte_rcu.a 00:45:02.633 [97/267] Linking static target lib/librte_mempool.a 00:45:02.633 [98/267] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:45:02.892 [99/267] Linking static target lib/net/libnet_crc_avx512_lib.a 00:45:02.892 [100/267] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:45:02.892 [101/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:45:02.892 [102/267] Linking static target lib/librte_mbuf.a 00:45:03.150 [103/267] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:45:03.150 [104/267] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:45:03.407 [105/267] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:45:03.407 [106/267] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:45:03.407 [107/267] Linking static target lib/librte_net.a 00:45:03.407 [108/267] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:45:03.407 [109/267] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:45:03.407 [110/267] Linking static target lib/librte_meter.a 00:45:03.974 [111/267] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:45:03.974 [112/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:45:03.974 [113/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:45:03.974 [114/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:45:04.232 [115/267] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:45:04.232 [116/267] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:45:04.490 [117/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:45:04.747 [118/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:45:05.312 [119/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:45:05.312 [120/267] Compiling C 
object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:45:05.876 [121/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:45:06.133 [122/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:45:06.133 [123/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:45:06.133 [124/267] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:45:06.390 [125/267] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:45:06.390 [126/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:45:06.390 [127/267] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:45:06.390 [128/267] Linking static target lib/librte_pci.a 00:45:06.647 [129/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:45:06.647 [130/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:45:06.904 [131/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:45:06.905 [132/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:45:06.905 [133/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:45:07.163 [134/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:45:07.163 [135/267] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:45:07.163 [136/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:45:07.163 [137/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:45:07.163 [138/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:45:07.163 [139/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:45:07.421 [140/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:45:07.421 [141/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:45:07.421 [142/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:45:07.421 [143/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:45:07.421 [144/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:45:07.421 [145/267] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:45:07.679 [146/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:45:07.679 [147/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:45:07.679 [148/267] Linking static target lib/librte_cmdline.a 00:45:07.936 [149/267] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:45:07.936 [150/267] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:45:08.194 [151/267] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:45:08.194 [152/267] Linking static target lib/librte_timer.a 00:45:08.452 [153/267] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:45:08.452 [154/267] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:45:08.452 [155/267] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:45:08.710 [156/267] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:45:08.710 [157/267] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:45:08.710 [158/267] Linking static target lib/librte_compressdev.a 00:45:08.968 [159/267] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 
00:45:09.226 [160/267] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:45:09.484 [161/267] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:45:09.484 [162/267] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:45:09.484 [163/267] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:45:09.484 [164/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:45:09.484 [165/267] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:45:09.484 [166/267] Linking static target lib/librte_ethdev.a 00:45:09.484 [167/267] Linking static target lib/librte_hash.a 00:45:09.741 [168/267] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:45:09.741 [169/267] Linking static target lib/librte_dmadev.a 00:45:10.000 [170/267] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:45:10.000 [171/267] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:45:10.000 [172/267] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:45:10.259 [173/267] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:45:10.517 [174/267] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:45:10.517 [175/267] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:45:10.517 [176/267] Linking static target lib/librte_cryptodev.a 00:45:10.517 [177/267] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:45:10.779 [178/267] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:45:10.779 [179/267] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:45:11.050 [180/267] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:45:11.050 [181/267] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:45:11.308 [182/267] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:45:11.308 [183/267] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:45:11.308 [184/267] Linking static target lib/librte_power.a 00:45:11.308 [185/267] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:45:11.566 [186/267] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:45:11.566 [187/267] Linking static target lib/librte_security.a 00:45:11.566 [188/267] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:45:11.825 [189/267] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:45:11.825 [190/267] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:45:11.825 [191/267] Linking static target lib/librte_reorder.a 00:45:12.084 [192/267] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:45:12.084 [193/267] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:45:13.019 [194/267] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:45:13.019 [195/267] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:45:13.019 [196/267] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:45:13.278 [197/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:45:13.278 [198/267] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:45:13.572 [199/267] Compiling C object 
lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:45:13.572 [200/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:45:13.572 [201/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:45:13.572 [202/267] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:45:13.572 [203/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:45:13.830 [204/267] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:45:14.089 [205/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:45:14.089 [206/267] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:45:14.089 [207/267] Linking static target drivers/libtmp_rte_bus_vdev.a 00:45:14.089 [208/267] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:45:14.089 [209/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:45:14.089 [210/267] Linking static target drivers/libtmp_rte_bus_pci.a 00:45:14.346 [211/267] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:45:14.346 [212/267] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:45:14.346 [213/267] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:45:14.346 [214/267] Linking static target drivers/librte_bus_vdev.a 00:45:14.346 [215/267] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:45:14.346 [216/267] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:45:14.346 [217/267] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:45:14.346 [218/267] Linking static target drivers/librte_bus_pci.a 00:45:14.603 [219/267] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:45:14.603 [220/267] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:45:14.603 [221/267] Linking static target drivers/libtmp_rte_mempool_ring.a 00:45:14.862 [222/267] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:45:14.862 [223/267] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:45:14.862 [224/267] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:45:14.862 [225/267] Linking static target drivers/librte_mempool_ring.a 00:45:14.862 [226/267] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:45:17.390 [227/267] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:45:18.795 [228/267] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:45:18.795 [229/267] Linking target lib/librte_eal.so.24.1 00:45:18.795 [230/267] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:45:18.795 [231/267] Linking target lib/librte_timer.so.24.1 00:45:18.795 [232/267] Linking target lib/librte_ring.so.24.1 00:45:18.795 [233/267] Linking target lib/librte_dmadev.so.24.1 00:45:18.795 [234/267] Linking target lib/librte_pci.so.24.1 00:45:18.795 [235/267] Linking target drivers/librte_bus_vdev.so.24.1 00:45:18.795 [236/267] Linking target lib/librte_meter.so.24.1 00:45:19.054 [237/267] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:45:19.054 [238/267] Generating symbol file 
lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:45:19.054 [239/267] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:45:19.054 [240/267] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:45:19.054 [241/267] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:45:19.054 [242/267] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:45:19.054 [243/267] Linking target lib/librte_rcu.so.24.1 00:45:19.054 [244/267] Linking target lib/librte_mempool.so.24.1 00:45:19.054 [245/267] Linking target drivers/librte_bus_pci.so.24.1 00:45:19.054 [246/267] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:45:19.054 [247/267] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:45:19.311 [248/267] Linking target lib/librte_mbuf.so.24.1 00:45:19.311 [249/267] Linking target drivers/librte_mempool_ring.so.24.1 00:45:19.311 [250/267] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:45:19.673 [251/267] Linking target lib/librte_net.so.24.1 00:45:19.673 [252/267] Linking target lib/librte_compressdev.so.24.1 00:45:19.673 [253/267] Linking target lib/librte_cryptodev.so.24.1 00:45:19.673 [254/267] Linking target lib/librte_reorder.so.24.1 00:45:19.673 [255/267] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:45:19.673 [256/267] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:45:19.673 [257/267] Linking target lib/librte_security.so.24.1 00:45:19.673 [258/267] Linking target lib/librte_hash.so.24.1 00:45:19.673 [259/267] Linking target lib/librte_cmdline.so.24.1 00:45:19.673 [260/267] Linking target lib/librte_ethdev.so.24.1 00:45:19.947 [261/267] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:45:19.947 [262/267] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:45:19.947 [263/267] Linking target lib/librte_power.so.24.1 00:45:25.233 [264/267] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:45:25.233 [265/267] Linking static target lib/librte_vhost.a 00:45:26.609 [266/267] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:45:26.609 [267/267] Linking target lib/librte_vhost.so.24.1 00:45:26.609 INFO: autodetecting backend as ninja 00:45:26.609 INFO: calculating backend command to run: /usr/local/bin/ninja -C /home/vagrant/spdk_repo/spdk/dpdk/build-tmp -j 10 00:45:27.676 CC lib/log/log.o 00:45:27.676 CC lib/log/log_deprecated.o 00:45:27.676 CC lib/log/log_flags.o 00:45:27.676 CC lib/ut_mock/mock.o 00:45:27.935 LIB libspdk_ut_mock.a 00:45:27.935 SO libspdk_ut_mock.so.6.0 00:45:27.935 SYMLINK libspdk_ut_mock.so 00:45:28.194 LIB libspdk_log.a 00:45:28.194 SO libspdk_log.so.7.0 00:45:28.194 SYMLINK libspdk_log.so 00:45:28.451 CC lib/dma/dma.o 00:45:28.451 CC lib/ioat/ioat.o 00:45:28.451 CXX lib/trace_parser/trace.o 00:45:28.451 CC lib/util/base64.o 00:45:28.451 CC lib/util/bit_array.o 00:45:28.451 CC lib/util/crc16.o 00:45:28.451 CC lib/util/cpuset.o 00:45:28.451 CC lib/util/crc32.o 00:45:28.451 CC lib/util/crc32c.o 00:45:28.708 CC lib/vfio_user/host/vfio_user_pci.o 00:45:28.709 CC lib/vfio_user/host/vfio_user.o 00:45:28.709 CC lib/util/crc32_ieee.o 00:45:28.709 CC lib/util/crc64.o 00:45:28.709 LIB libspdk_dma.a 00:45:28.709 CC lib/util/dif.o 00:45:28.709 SO libspdk_dma.so.4.0 00:45:28.709 CC 
lib/util/fd.o 00:45:28.709 CC lib/util/file.o 00:45:28.709 CC lib/util/hexlify.o 00:45:28.709 SYMLINK libspdk_dma.so 00:45:28.709 CC lib/util/iov.o 00:45:28.967 CC lib/util/math.o 00:45:28.967 LIB libspdk_vfio_user.a 00:45:28.967 LIB libspdk_ioat.a 00:45:28.967 CC lib/util/pipe.o 00:45:28.967 CC lib/util/strerror_tls.o 00:45:28.967 SO libspdk_ioat.so.7.0 00:45:28.967 SO libspdk_vfio_user.so.5.0 00:45:28.967 CC lib/util/string.o 00:45:28.967 SYMLINK libspdk_ioat.so 00:45:28.967 CC lib/util/uuid.o 00:45:28.967 CC lib/util/fd_group.o 00:45:28.967 SYMLINK libspdk_vfio_user.so 00:45:28.967 CC lib/util/xor.o 00:45:28.967 CC lib/util/zipf.o 00:45:29.905 LIB libspdk_trace_parser.a 00:45:29.905 SO libspdk_trace_parser.so.5.0 00:45:29.905 LIB libspdk_util.a 00:45:29.905 SYMLINK libspdk_trace_parser.so 00:45:29.905 SO libspdk_util.so.9.0 00:45:29.905 SYMLINK libspdk_util.so 00:45:30.163 CC lib/env_dpdk/memory.o 00:45:30.164 CC lib/env_dpdk/pci.o 00:45:30.164 CC lib/env_dpdk/env.o 00:45:30.164 CC lib/env_dpdk/threads.o 00:45:30.164 CC lib/env_dpdk/init.o 00:45:30.164 CC lib/env_dpdk/pci_ioat.o 00:45:30.164 CC lib/env_dpdk/pci_virtio.o 00:45:30.164 CC lib/json/json_parse.o 00:45:30.164 CC lib/conf/conf.o 00:45:30.164 CC lib/vmd/vmd.o 00:45:30.164 CC lib/vmd/led.o 00:45:30.164 CC lib/json/json_util.o 00:45:30.164 CC lib/json/json_write.o 00:45:30.422 LIB libspdk_conf.a 00:45:30.422 CC lib/env_dpdk/pci_vmd.o 00:45:30.422 SO libspdk_conf.so.6.0 00:45:30.422 SYMLINK libspdk_conf.so 00:45:30.422 CC lib/env_dpdk/pci_idxd.o 00:45:30.422 CC lib/env_dpdk/pci_event.o 00:45:30.680 CC lib/env_dpdk/sigbus_handler.o 00:45:30.680 CC lib/env_dpdk/pci_dpdk.o 00:45:30.680 CC lib/env_dpdk/pci_dpdk_2207.o 00:45:30.680 CC lib/env_dpdk/pci_dpdk_2211.o 00:45:30.680 LIB libspdk_vmd.a 00:45:30.680 SO libspdk_vmd.so.6.0 00:45:30.938 SYMLINK libspdk_vmd.so 00:45:30.938 LIB libspdk_json.a 00:45:30.938 SO libspdk_json.so.6.0 00:45:30.938 SYMLINK libspdk_json.so 00:45:31.195 CC lib/jsonrpc/jsonrpc_server.o 00:45:31.195 CC lib/jsonrpc/jsonrpc_client.o 00:45:31.195 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:45:31.195 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:45:31.454 LIB libspdk_env_dpdk.a 00:45:31.454 SO libspdk_env_dpdk.so.14.1 00:45:31.454 SYMLINK libspdk_env_dpdk.so 00:45:31.712 LIB libspdk_jsonrpc.a 00:45:31.712 SO libspdk_jsonrpc.so.6.0 00:45:31.712 SYMLINK libspdk_jsonrpc.so 00:45:31.970 CC lib/rpc/rpc.o 00:45:32.228 LIB libspdk_rpc.a 00:45:32.228 SO libspdk_rpc.so.6.0 00:45:32.486 SYMLINK libspdk_rpc.so 00:45:32.816 CC lib/keyring/keyring.o 00:45:32.817 CC lib/keyring/keyring_rpc.o 00:45:32.817 CC lib/trace/trace.o 00:45:32.817 CC lib/trace/trace_flags.o 00:45:32.817 CC lib/trace/trace_rpc.o 00:45:32.817 CC lib/notify/notify.o 00:45:32.817 CC lib/notify/notify_rpc.o 00:45:33.120 LIB libspdk_notify.a 00:45:33.120 SO libspdk_notify.so.6.0 00:45:33.120 LIB libspdk_keyring.a 00:45:33.120 SO libspdk_keyring.so.1.0 00:45:33.120 SYMLINK libspdk_notify.so 00:45:33.120 SYMLINK libspdk_keyring.so 00:45:33.120 LIB libspdk_trace.a 00:45:33.120 SO libspdk_trace.so.10.0 00:45:33.120 SYMLINK libspdk_trace.so 00:45:33.379 CC lib/sock/sock_rpc.o 00:45:33.379 CC lib/sock/sock.o 00:45:33.637 CC lib/thread/thread.o 00:45:33.637 CC lib/thread/iobuf.o 00:45:33.893 LIB libspdk_sock.a 00:45:34.151 SO libspdk_sock.so.9.0 00:45:34.151 SYMLINK libspdk_sock.so 00:45:34.408 CC lib/nvme/nvme_ctrlr.o 00:45:34.408 CC lib/nvme/nvme_ctrlr_cmd.o 00:45:34.408 CC lib/nvme/nvme_fabric.o 00:45:34.408 CC lib/nvme/nvme_ns_cmd.o 00:45:34.408 CC lib/nvme/nvme_ns.o 
00:45:34.408 CC lib/nvme/nvme_pcie.o 00:45:34.408 CC lib/nvme/nvme_pcie_common.o 00:45:34.408 CC lib/nvme/nvme_qpair.o 00:45:34.408 CC lib/nvme/nvme.o 00:45:34.973 LIB libspdk_thread.a 00:45:34.973 SO libspdk_thread.so.10.0 00:45:34.973 SYMLINK libspdk_thread.so 00:45:34.973 CC lib/nvme/nvme_quirks.o 00:45:34.973 CC lib/nvme/nvme_transport.o 00:45:35.230 CC lib/nvme/nvme_discovery.o 00:45:35.488 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:45:35.743 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:45:35.743 CC lib/nvme/nvme_tcp.o 00:45:35.743 CC lib/accel/accel.o 00:45:35.743 CC lib/blob/blobstore.o 00:45:35.743 CC lib/blob/request.o 00:45:35.743 CC lib/init/json_config.o 00:45:35.743 CC lib/blob/zeroes.o 00:45:35.743 CC lib/virtio/virtio.o 00:45:36.001 CC lib/nvme/nvme_opal.o 00:45:36.001 CC lib/nvme/nvme_io_msg.o 00:45:36.001 CC lib/init/subsystem.o 00:45:36.260 CC lib/init/subsystem_rpc.o 00:45:36.260 CC lib/init/rpc.o 00:45:36.260 CC lib/virtio/virtio_vhost_user.o 00:45:36.260 CC lib/virtio/virtio_vfio_user.o 00:45:36.260 CC lib/virtio/virtio_pci.o 00:45:36.518 LIB libspdk_init.a 00:45:36.518 SO libspdk_init.so.5.0 00:45:36.518 SYMLINK libspdk_init.so 00:45:36.518 CC lib/nvme/nvme_poll_group.o 00:45:36.519 CC lib/nvme/nvme_zns.o 00:45:36.776 CC lib/nvme/nvme_stubs.o 00:45:36.776 CC lib/blob/blob_bs_dev.o 00:45:36.776 LIB libspdk_virtio.a 00:45:36.776 SO libspdk_virtio.so.7.0 00:45:36.776 CC lib/accel/accel_rpc.o 00:45:36.776 SYMLINK libspdk_virtio.so 00:45:36.776 CC lib/accel/accel_sw.o 00:45:36.776 CC lib/event/app.o 00:45:36.776 CC lib/event/reactor.o 00:45:37.034 CC lib/nvme/nvme_auth.o 00:45:37.034 CC lib/event/log_rpc.o 00:45:37.034 CC lib/event/app_rpc.o 00:45:37.034 LIB libspdk_accel.a 00:45:37.034 CC lib/event/scheduler_static.o 00:45:37.034 SO libspdk_accel.so.15.0 00:45:37.294 CC lib/nvme/nvme_cuse.o 00:45:37.294 SYMLINK libspdk_accel.so 00:45:37.552 LIB libspdk_event.a 00:45:37.552 CC lib/bdev/bdev_rpc.o 00:45:37.552 CC lib/bdev/bdev.o 00:45:37.552 CC lib/bdev/bdev_zone.o 00:45:37.552 CC lib/bdev/part.o 00:45:37.552 CC lib/bdev/scsi_nvme.o 00:45:37.552 SO libspdk_event.so.13.1 00:45:37.552 SYMLINK libspdk_event.so 00:45:38.117 LIB libspdk_nvme.a 00:45:38.117 SO libspdk_nvme.so.13.0 00:45:38.117 SYMLINK libspdk_nvme.so 00:45:39.053 LIB libspdk_blob.a 00:45:39.053 SO libspdk_blob.so.11.0 00:45:39.053 SYMLINK libspdk_blob.so 00:45:39.311 CC lib/lvol/lvol.o 00:45:39.311 CC lib/blobfs/tree.o 00:45:39.311 CC lib/blobfs/blobfs.o 00:45:40.247 LIB libspdk_blobfs.a 00:45:40.247 SO libspdk_blobfs.so.10.0 00:45:40.247 LIB libspdk_lvol.a 00:45:40.247 SO libspdk_lvol.so.10.0 00:45:40.247 SYMLINK libspdk_blobfs.so 00:45:40.247 LIB libspdk_bdev.a 00:45:40.247 SO libspdk_bdev.so.15.0 00:45:40.247 SYMLINK libspdk_lvol.so 00:45:40.247 SYMLINK libspdk_bdev.so 00:45:40.555 CC lib/scsi/dev.o 00:45:40.555 CC lib/scsi/lun.o 00:45:40.555 CC lib/scsi/scsi.o 00:45:40.555 CC lib/scsi/port.o 00:45:40.555 CC lib/scsi/scsi_bdev.o 00:45:40.555 CC lib/scsi/scsi_pr.o 00:45:40.555 CC lib/scsi/scsi_rpc.o 00:45:40.555 CC lib/ftl/ftl_core.o 00:45:40.555 CC lib/nbd/nbd.o 00:45:40.555 CC lib/nvmf/ctrlr.o 00:45:40.813 CC lib/nbd/nbd_rpc.o 00:45:40.813 CC lib/ftl/ftl_init.o 00:45:40.813 CC lib/ftl/ftl_layout.o 00:45:40.813 CC lib/ftl/ftl_debug.o 00:45:41.070 CC lib/ftl/ftl_io.o 00:45:41.070 CC lib/ftl/ftl_sb.o 00:45:41.070 CC lib/nvmf/ctrlr_discovery.o 00:45:41.070 CC lib/nvmf/ctrlr_bdev.o 00:45:41.070 LIB libspdk_nbd.a 00:45:41.070 CC lib/scsi/task.o 00:45:41.070 CC lib/ftl/ftl_l2p.o 00:45:41.070 SO libspdk_nbd.so.7.0 00:45:41.328 
SYMLINK libspdk_nbd.so 00:45:41.328 CC lib/ftl/ftl_l2p_flat.o 00:45:41.328 CC lib/ftl/ftl_nv_cache.o 00:45:41.328 CC lib/ftl/ftl_band.o 00:45:41.328 CC lib/ftl/ftl_band_ops.o 00:45:41.328 CC lib/ftl/ftl_writer.o 00:45:41.328 CC lib/ftl/ftl_rq.o 00:45:41.328 LIB libspdk_scsi.a 00:45:41.328 SO libspdk_scsi.so.9.0 00:45:41.328 CC lib/nvmf/subsystem.o 00:45:41.328 SYMLINK libspdk_scsi.so 00:45:41.328 CC lib/nvmf/nvmf.o 00:45:41.586 CC lib/ftl/ftl_reloc.o 00:45:41.586 CC lib/nvmf/nvmf_rpc.o 00:45:41.586 CC lib/ftl/ftl_l2p_cache.o 00:45:41.586 CC lib/ftl/ftl_p2l.o 00:45:41.586 CC lib/iscsi/conn.o 00:45:41.586 CC lib/vhost/vhost.o 00:45:41.844 CC lib/iscsi/init_grp.o 00:45:41.844 CC lib/iscsi/iscsi.o 00:45:41.844 CC lib/ftl/mngt/ftl_mngt.o 00:45:42.103 CC lib/iscsi/md5.o 00:45:42.103 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:45:42.103 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:45:42.103 CC lib/iscsi/param.o 00:45:42.103 CC lib/iscsi/portal_grp.o 00:45:42.103 CC lib/nvmf/transport.o 00:45:42.103 CC lib/iscsi/tgt_node.o 00:45:42.361 CC lib/iscsi/iscsi_subsystem.o 00:45:42.361 CC lib/ftl/mngt/ftl_mngt_startup.o 00:45:42.361 CC lib/vhost/vhost_rpc.o 00:45:42.361 CC lib/vhost/vhost_scsi.o 00:45:42.361 CC lib/vhost/vhost_blk.o 00:45:42.361 CC lib/ftl/mngt/ftl_mngt_md.o 00:45:42.620 CC lib/ftl/mngt/ftl_mngt_misc.o 00:45:42.620 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:45:42.620 CC lib/nvmf/tcp.o 00:45:42.620 CC lib/nvmf/stubs.o 00:45:42.620 CC lib/vhost/rte_vhost_user.o 00:45:42.620 CC lib/iscsi/iscsi_rpc.o 00:45:42.878 CC lib/iscsi/task.o 00:45:42.878 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:45:42.878 CC lib/ftl/mngt/ftl_mngt_band.o 00:45:42.878 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:45:42.878 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:45:43.136 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:45:43.136 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:45:43.136 CC lib/ftl/utils/ftl_conf.o 00:45:43.136 CC lib/nvmf/mdns_server.o 00:45:43.136 CC lib/ftl/utils/ftl_md.o 00:45:43.136 CC lib/nvmf/auth.o 00:45:43.136 CC lib/ftl/utils/ftl_mempool.o 00:45:43.136 CC lib/ftl/utils/ftl_bitmap.o 00:45:43.395 CC lib/ftl/utils/ftl_property.o 00:45:43.395 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:45:43.395 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:45:43.395 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:45:43.395 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:45:43.395 LIB libspdk_iscsi.a 00:45:43.652 SO libspdk_iscsi.so.8.0 00:45:43.652 LIB libspdk_vhost.a 00:45:43.652 SO libspdk_vhost.so.8.0 00:45:43.652 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:45:43.652 SYMLINK libspdk_iscsi.so 00:45:43.652 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:45:43.652 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:45:43.652 CC lib/ftl/upgrade/ftl_sb_v3.o 00:45:43.652 SYMLINK libspdk_vhost.so 00:45:43.652 CC lib/ftl/upgrade/ftl_sb_v5.o 00:45:43.652 CC lib/ftl/nvc/ftl_nvc_dev.o 00:45:43.652 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:45:43.652 CC lib/ftl/base/ftl_base_dev.o 00:45:43.652 CC lib/ftl/base/ftl_base_bdev.o 00:45:43.910 LIB libspdk_nvmf.a 00:45:43.910 SO libspdk_nvmf.so.18.1 00:45:43.910 SYMLINK libspdk_nvmf.so 00:45:43.910 LIB libspdk_ftl.a 00:45:44.168 SO libspdk_ftl.so.9.0 00:45:44.168 SYMLINK libspdk_ftl.so 00:45:44.440 CC module/env_dpdk/env_dpdk_rpc.o 00:45:44.704 CC module/blob/bdev/blob_bdev.o 00:45:44.704 CC module/accel/error/accel_error.o 00:45:44.704 CC module/scheduler/dynamic/scheduler_dynamic.o 00:45:44.704 CC module/accel/ioat/accel_ioat.o 00:45:44.704 CC module/sock/posix/posix.o 00:45:44.704 CC module/keyring/file/keyring.o 00:45:44.704 CC module/keyring/linux/keyring.o 00:45:44.704 CC 
module/scheduler/gscheduler/gscheduler.o 00:45:44.704 LIB libspdk_env_dpdk_rpc.a 00:45:44.704 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:45:44.704 SO libspdk_env_dpdk_rpc.so.6.0 00:45:44.704 CC module/keyring/file/keyring_rpc.o 00:45:44.704 SYMLINK libspdk_env_dpdk_rpc.so 00:45:44.704 CC module/accel/ioat/accel_ioat_rpc.o 00:45:44.963 CC module/keyring/linux/keyring_rpc.o 00:45:44.963 CC module/accel/error/accel_error_rpc.o 00:45:44.963 LIB libspdk_scheduler_gscheduler.a 00:45:44.963 LIB libspdk_scheduler_dpdk_governor.a 00:45:44.963 SO libspdk_scheduler_gscheduler.so.4.0 00:45:44.963 SO libspdk_scheduler_dpdk_governor.so.4.0 00:45:44.963 SYMLINK libspdk_scheduler_gscheduler.so 00:45:44.963 LIB libspdk_blob_bdev.a 00:45:44.963 SYMLINK libspdk_scheduler_dpdk_governor.so 00:45:44.963 LIB libspdk_scheduler_dynamic.a 00:45:44.963 LIB libspdk_accel_ioat.a 00:45:44.963 SO libspdk_blob_bdev.so.11.0 00:45:44.963 SO libspdk_scheduler_dynamic.so.4.0 00:45:44.963 LIB libspdk_accel_error.a 00:45:44.963 LIB libspdk_keyring_linux.a 00:45:44.963 SO libspdk_accel_ioat.so.6.0 00:45:44.963 LIB libspdk_keyring_file.a 00:45:44.963 SO libspdk_keyring_linux.so.1.0 00:45:44.963 SO libspdk_accel_error.so.2.0 00:45:44.963 SYMLINK libspdk_blob_bdev.so 00:45:44.963 SO libspdk_keyring_file.so.1.0 00:45:44.963 SYMLINK libspdk_scheduler_dynamic.so 00:45:44.963 SYMLINK libspdk_accel_ioat.so 00:45:44.963 SYMLINK libspdk_keyring_linux.so 00:45:45.222 SYMLINK libspdk_accel_error.so 00:45:45.222 SYMLINK libspdk_keyring_file.so 00:45:45.479 CC module/blobfs/bdev/blobfs_bdev.o 00:45:45.479 CC module/bdev/nvme/bdev_nvme.o 00:45:45.479 CC module/bdev/null/bdev_null.o 00:45:45.479 CC module/bdev/malloc/bdev_malloc.o 00:45:45.479 CC module/bdev/lvol/vbdev_lvol.o 00:45:45.479 CC module/bdev/delay/vbdev_delay.o 00:45:45.479 CC module/bdev/error/vbdev_error.o 00:45:45.479 CC module/bdev/gpt/gpt.o 00:45:45.479 CC module/bdev/passthru/vbdev_passthru.o 00:45:45.479 LIB libspdk_sock_posix.a 00:45:45.479 SO libspdk_sock_posix.so.6.0 00:45:45.479 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:45:45.736 SYMLINK libspdk_sock_posix.so 00:45:45.736 CC module/bdev/gpt/vbdev_gpt.o 00:45:45.736 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:45:45.736 CC module/bdev/null/bdev_null_rpc.o 00:45:45.736 CC module/bdev/error/vbdev_error_rpc.o 00:45:45.736 CC module/bdev/malloc/bdev_malloc_rpc.o 00:45:45.736 CC module/bdev/nvme/bdev_nvme_rpc.o 00:45:45.736 LIB libspdk_blobfs_bdev.a 00:45:45.736 CC module/bdev/delay/vbdev_delay_rpc.o 00:45:45.736 SO libspdk_blobfs_bdev.so.6.0 00:45:45.736 LIB libspdk_bdev_null.a 00:45:45.736 LIB libspdk_bdev_passthru.a 00:45:45.994 LIB libspdk_bdev_error.a 00:45:45.994 SO libspdk_bdev_null.so.6.0 00:45:45.994 SO libspdk_bdev_passthru.so.6.0 00:45:45.994 SYMLINK libspdk_blobfs_bdev.so 00:45:45.994 SO libspdk_bdev_error.so.6.0 00:45:45.994 LIB libspdk_bdev_gpt.a 00:45:45.994 LIB libspdk_bdev_malloc.a 00:45:45.994 SO libspdk_bdev_gpt.so.6.0 00:45:45.994 SYMLINK libspdk_bdev_passthru.so 00:45:45.994 SYMLINK libspdk_bdev_null.so 00:45:45.994 SYMLINK libspdk_bdev_error.so 00:45:45.994 SO libspdk_bdev_malloc.so.6.0 00:45:45.994 LIB libspdk_bdev_delay.a 00:45:45.994 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:45:45.994 SYMLINK libspdk_bdev_gpt.so 00:45:45.994 SO libspdk_bdev_delay.so.6.0 00:45:45.994 SYMLINK libspdk_bdev_malloc.so 00:45:45.994 CC module/bdev/nvme/nvme_rpc.o 00:45:45.994 CC module/bdev/nvme/bdev_mdns_client.o 00:45:45.994 SYMLINK libspdk_bdev_delay.so 00:45:45.994 CC module/bdev/nvme/vbdev_opal.o 00:45:46.252 
CC module/bdev/raid/bdev_raid.o 00:45:46.252 CC module/bdev/zone_block/vbdev_zone_block.o 00:45:46.252 CC module/bdev/aio/bdev_aio.o 00:45:46.252 CC module/bdev/aio/bdev_aio_rpc.o 00:45:46.252 CC module/bdev/split/vbdev_split.o 00:45:46.511 LIB libspdk_bdev_lvol.a 00:45:46.511 CC module/bdev/split/vbdev_split_rpc.o 00:45:46.511 CC module/bdev/ftl/bdev_ftl.o 00:45:46.511 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:45:46.511 SO libspdk_bdev_lvol.so.6.0 00:45:46.511 CC module/bdev/ftl/bdev_ftl_rpc.o 00:45:46.511 CC module/bdev/nvme/vbdev_opal_rpc.o 00:45:46.511 LIB libspdk_bdev_aio.a 00:45:46.769 SO libspdk_bdev_aio.so.6.0 00:45:46.769 SYMLINK libspdk_bdev_lvol.so 00:45:46.769 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:45:46.769 SYMLINK libspdk_bdev_aio.so 00:45:46.769 CC module/bdev/virtio/bdev_virtio_scsi.o 00:45:46.769 CC module/bdev/virtio/bdev_virtio_blk.o 00:45:46.769 LIB libspdk_bdev_split.a 00:45:46.769 SO libspdk_bdev_split.so.6.0 00:45:46.769 LIB libspdk_bdev_zone_block.a 00:45:46.769 SO libspdk_bdev_zone_block.so.6.0 00:45:46.769 SYMLINK libspdk_bdev_split.so 00:45:46.769 CC module/bdev/raid/bdev_raid_rpc.o 00:45:47.026 CC module/bdev/raid/bdev_raid_sb.o 00:45:47.026 CC module/bdev/raid/raid0.o 00:45:47.026 SYMLINK libspdk_bdev_zone_block.so 00:45:47.026 CC module/bdev/raid/raid1.o 00:45:47.026 LIB libspdk_bdev_ftl.a 00:45:47.026 SO libspdk_bdev_ftl.so.6.0 00:45:47.026 CC module/bdev/raid/concat.o 00:45:47.026 SYMLINK libspdk_bdev_ftl.so 00:45:47.026 CC module/bdev/virtio/bdev_virtio_rpc.o 00:45:47.284 LIB libspdk_bdev_raid.a 00:45:47.284 SO libspdk_bdev_raid.so.6.0 00:45:47.541 SYMLINK libspdk_bdev_raid.so 00:45:47.541 LIB libspdk_bdev_virtio.a 00:45:47.541 SO libspdk_bdev_virtio.so.6.0 00:45:47.541 SYMLINK libspdk_bdev_virtio.so 00:45:48.473 LIB libspdk_bdev_nvme.a 00:45:48.473 SO libspdk_bdev_nvme.so.7.0 00:45:48.473 SYMLINK libspdk_bdev_nvme.so 00:45:49.039 CC module/event/subsystems/scheduler/scheduler.o 00:45:49.039 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:45:49.039 CC module/event/subsystems/keyring/keyring.o 00:45:49.039 CC module/event/subsystems/iobuf/iobuf.o 00:45:49.039 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:45:49.039 CC module/event/subsystems/sock/sock.o 00:45:49.039 CC module/event/subsystems/vmd/vmd.o 00:45:49.039 CC module/event/subsystems/vmd/vmd_rpc.o 00:45:49.039 LIB libspdk_event_scheduler.a 00:45:49.039 LIB libspdk_event_keyring.a 00:45:49.039 LIB libspdk_event_iobuf.a 00:45:49.039 SO libspdk_event_scheduler.so.4.0 00:45:49.039 SO libspdk_event_keyring.so.1.0 00:45:49.039 LIB libspdk_event_vhost_blk.a 00:45:49.039 SO libspdk_event_iobuf.so.3.0 00:45:49.298 LIB libspdk_event_sock.a 00:45:49.298 SYMLINK libspdk_event_keyring.so 00:45:49.298 SO libspdk_event_vhost_blk.so.3.0 00:45:49.298 LIB libspdk_event_vmd.a 00:45:49.298 SYMLINK libspdk_event_iobuf.so 00:45:49.298 SYMLINK libspdk_event_scheduler.so 00:45:49.298 SO libspdk_event_sock.so.5.0 00:45:49.298 SO libspdk_event_vmd.so.6.0 00:45:49.298 SYMLINK libspdk_event_vhost_blk.so 00:45:49.298 SYMLINK libspdk_event_vmd.so 00:45:49.298 SYMLINK libspdk_event_sock.so 00:45:49.557 CC module/event/subsystems/accel/accel.o 00:45:49.557 LIB libspdk_event_accel.a 00:45:49.557 SO libspdk_event_accel.so.6.0 00:45:49.816 SYMLINK libspdk_event_accel.so 00:45:50.074 CC module/event/subsystems/bdev/bdev.o 00:45:50.332 LIB libspdk_event_bdev.a 00:45:50.332 SO libspdk_event_bdev.so.6.0 00:45:50.332 SYMLINK libspdk_event_bdev.so 00:45:50.591 CC module/event/subsystems/nbd/nbd.o 00:45:50.591 CC 
module/event/subsystems/nvmf/nvmf_tgt.o 00:45:50.591 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:45:50.591 CC module/event/subsystems/scsi/scsi.o 00:45:50.849 LIB libspdk_event_nbd.a 00:45:50.849 SO libspdk_event_nbd.so.6.0 00:45:50.849 LIB libspdk_event_scsi.a 00:45:50.849 SYMLINK libspdk_event_nbd.so 00:45:50.849 SO libspdk_event_scsi.so.6.0 00:45:50.849 SYMLINK libspdk_event_scsi.so 00:45:50.849 LIB libspdk_event_nvmf.a 00:45:50.849 SO libspdk_event_nvmf.so.6.0 00:45:51.106 SYMLINK libspdk_event_nvmf.so 00:45:51.106 CC module/event/subsystems/iscsi/iscsi.o 00:45:51.106 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:45:51.365 LIB libspdk_event_iscsi.a 00:45:51.365 LIB libspdk_event_vhost_scsi.a 00:45:51.365 SO libspdk_event_iscsi.so.6.0 00:45:51.365 SO libspdk_event_vhost_scsi.so.3.0 00:45:51.365 SYMLINK libspdk_event_iscsi.so 00:45:51.365 SYMLINK libspdk_event_vhost_scsi.so 00:45:51.623 SO libspdk.so.6.0 00:45:51.623 SYMLINK libspdk.so 00:45:51.623 make[1]: Nothing to be done for 'all'. 00:45:51.880 CXX app/trace/trace.o 00:45:51.880 CC examples/sock/hello_world/hello_sock.o 00:45:51.880 CC examples/accel/perf/accel_perf.o 00:45:51.880 CC examples/vmd/lsvmd/lsvmd.o 00:45:51.880 CC examples/util/zipf/zipf.o 00:45:51.880 CC examples/ioat/perf/perf.o 00:45:51.880 CC examples/nvme/hello_world/hello_world.o 00:45:52.138 CC examples/blob/hello_world/hello_blob.o 00:45:52.138 CC examples/bdev/hello_world/hello_bdev.o 00:45:52.138 CC examples/nvmf/nvmf/nvmf.o 00:45:52.138 LINK lsvmd 00:45:52.138 LINK zipf 00:45:52.138 LINK ioat_perf 00:45:52.138 LINK hello_world 00:45:52.397 LINK hello_bdev 00:45:52.397 LINK hello_blob 00:45:52.397 LINK hello_sock 00:45:52.397 LINK spdk_trace 00:45:52.397 CC examples/vmd/led/led.o 00:45:52.397 CC examples/ioat/verify/verify.o 00:45:52.397 CC examples/nvme/reconnect/reconnect.o 00:45:52.397 LINK nvmf 00:45:52.656 CC examples/bdev/bdevperf/bdevperf.o 00:45:52.656 LINK accel_perf 00:45:52.656 CC examples/nvme/nvme_manage/nvme_manage.o 00:45:52.656 LINK led 00:45:52.656 CC examples/blob/cli/blobcli.o 00:45:52.656 CC app/trace_record/trace_record.o 00:45:52.656 CC examples/thread/thread/thread_ex.o 00:45:52.656 LINK verify 00:45:52.914 CC app/nvmf_tgt/nvmf_main.o 00:45:52.914 CC app/iscsi_tgt/iscsi_tgt.o 00:45:52.914 CC app/spdk_lspci/spdk_lspci.o 00:45:52.914 LINK reconnect 00:45:52.914 CC app/spdk_tgt/spdk_tgt.o 00:45:52.914 LINK thread 00:45:53.173 LINK nvmf_tgt 00:45:53.173 LINK spdk_trace_record 00:45:53.173 LINK spdk_lspci 00:45:53.173 LINK iscsi_tgt 00:45:53.173 LINK spdk_tgt 00:45:53.173 CC examples/nvme/arbitration/arbitration.o 00:45:53.173 LINK blobcli 00:45:53.173 LINK nvme_manage 00:45:53.173 CC examples/nvme/hotplug/hotplug.o 00:45:53.432 CC examples/nvme/cmb_copy/cmb_copy.o 00:45:53.432 CC app/spdk_nvme_perf/perf.o 00:45:53.432 CC examples/nvme/abort/abort.o 00:45:53.432 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:45:53.432 CC app/spdk_nvme_identify/identify.o 00:45:53.432 LINK bdevperf 00:45:53.432 CC app/spdk_nvme_discover/discovery_aer.o 00:45:53.432 CC app/spdk_top/spdk_top.o 00:45:53.432 LINK hotplug 00:45:53.432 LINK cmb_copy 00:45:53.691 LINK arbitration 00:45:53.691 LINK pmr_persistence 00:45:53.691 LINK spdk_nvme_discover 00:45:53.691 LINK abort 00:45:53.691 CC app/vhost/vhost.o 00:45:53.691 CC app/spdk_dd/spdk_dd.o 00:45:53.949 CC examples/interrupt_tgt/interrupt_tgt.o 00:45:53.949 LINK vhost 00:45:54.207 LINK interrupt_tgt 00:45:54.207 LINK spdk_dd 00:45:54.466 LINK spdk_nvme_perf 00:45:54.724 LINK spdk_top 00:45:54.724 
LINK spdk_nvme_identify 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/accel_module.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/accel.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/base64.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/assert.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/barrier.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev_module.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev_zone.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bit_array.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blob.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bit_pool.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blob_bdev.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blobfs.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blobfs_bdev.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/conf.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/config.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/cpuset.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc32.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc16.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc64.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/dif.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/dma.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/endian.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/env_dpdk.h 00:45:55.295 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/event.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/env.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/fd.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/fd_group.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/file.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ftl.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/gpt_spec.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/hexlify.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/histogram_data.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/idxd.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/idxd_spec.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/init.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ioat.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ioat_spec.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/iscsi_spec.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/json.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/jsonrpc.h 00:45:55.295 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/keyring.h 00:45:55.295 cp /home/vagrant/spdk_repo/spdk/scripts/rpc.py /home/vagrant/spdk_repo/spdk/build/bin/spdk_rpc 00:45:55.295 cp /home/vagrant/spdk_repo/spdk/scripts/spdkcli.py /home/vagrant/spdk_repo/spdk/build/bin/spdk_cli 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/keyring_module.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/likely.h 00:45:55.296 chmod +x /home/vagrant/spdk_repo/spdk/build/bin/spdk_rpc 00:45:55.296 chmod +x /home/vagrant/spdk_repo/spdk/build/bin/spdk_cli 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/log.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/lvol.h 00:45:55.296 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_rpc 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_cli 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/memory.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/mmio.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nbd.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/notify.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_ocssd.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_ocssd_spec.h 00:45:55.296 patchelf: not an ELF executable 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_intel.h 00:45:55.296 patchelf: not an ELF executable 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_spec.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_zns.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_cmd.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_fc_spec.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_transport.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_spec.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/opal_spec.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/opal.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/pci_ids.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/pipe.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/queue.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/queue_extras.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/reduce.h 00:45:55.296 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/rpc.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scheduler.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scsi.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scsi_spec.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/sock.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/stdinc.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/string.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/thread.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/trace.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/trace_parser.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/tree.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/uuid.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ublk.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/util.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/version.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfio_user_spec.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfu_target.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfio_user_pci.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vhost.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vmd.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/xor.h 00:45:55.296 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/zipf.h 00:45:55.557 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 00:45:55.816 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 00:45:55.816 Processing /home/vagrant/spdk_repo/spdk/python 00:45:55.816 DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. 
We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default. 00:45:55.816 pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555. 00:45:56.381 Using legacy 'setup.py install' for spdk, since package 'wheel' is not installed. 00:45:56.639 Installing collected packages: spdk 00:45:56.639 Running setup.py install for spdk: started 00:45:56.898 Running setup.py install for spdk: finished with status 'done' 00:45:56.898 Successfully installed spdk-24.9rc0 00:45:57.835 rm -rf /home/vagrant/spdk_repo/spdk/python/spdk.egg-info 00:45:58.862 The Meson build system 00:45:58.862 Version: 1.4.0 00:45:58.862 Source dir: /home/vagrant/spdk_repo/spdk/dpdk 00:45:58.862 Build dir: /home/vagrant/spdk_repo/spdk/dpdk/build-tmp 00:45:58.862 Build type: native build 00:45:58.862 Program cat found: YES (/bin/cat) 00:45:58.862 Project name: DPDK 00:45:58.862 Project version: 24.03.0 00:45:58.862 C compiler for the host machine: cc (gcc 11.4.1 "cc (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)") 00:45:58.862 C linker for the host machine: cc ld.bfd 2.35.2-42 00:45:58.862 Host machine cpu family: x86_64 00:45:58.862 Host machine cpu: x86_64 00:45:58.862 Message: ## Building in Developer Mode ## 00:45:58.862 Program pkg-config found: YES (/bin/pkg-config) 00:45:58.862 Program check-symbols.sh found: YES (/home/vagrant/spdk_repo/spdk/dpdk/buildtools/check-symbols.sh) 00:45:58.862 Program options-ibverbs-static.sh found: YES (/home/vagrant/spdk_repo/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:45:58.862 Program python3 found: YES (/usr/bin/python3) 00:45:58.862 Program cat found: YES (/bin/cat) 00:45:58.862 Compiler for C supports arguments -march=native: YES (cached) 00:45:58.862 Checking for size of "void *" : 8 (cached) 00:45:58.862 Checking for size of "void *" : 8 (cached) 00:45:58.862 Compiler for C supports link arguments -Wl,--undefined-version: NO (cached) 00:45:58.862 Library m found: YES 00:45:58.862 Library numa found: YES 00:45:58.862 Has header "numaif.h" : YES (cached) 00:45:58.862 Library fdt found: NO 00:45:58.862 Library execinfo found: NO 00:45:58.862 Has header "execinfo.h" : YES (cached) 00:45:58.862 Found pkg-config: YES (/bin/pkg-config) 1.7.3 00:45:58.862 Run-time dependency libarchive found: NO (tried pkgconfig) 00:45:58.862 Run-time dependency libbsd found: NO (tried pkgconfig) 00:45:58.862 Run-time dependency jansson found: NO (tried pkgconfig) 00:45:58.862 Dependency openssl found: YES 3.0.7 (cached) 00:45:58.862 Run-time dependency libpcap found: NO (tried pkgconfig) 00:45:58.862 Library pcap found: NO 00:45:58.862 Compiler for C supports arguments -Wcast-qual: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wdeprecated: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wformat: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wformat-nonliteral: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wformat-security: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wmissing-declarations: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wmissing-prototypes: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wnested-externs: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wold-style-definition: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wpointer-arith: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wsign-compare: YES (cached) 
00:45:58.862 Compiler for C supports arguments -Wstrict-prototypes: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wundef: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wwrite-strings: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wno-address-of-packed-member: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wno-packed-not-aligned: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wno-missing-field-initializers: YES (cached) 00:45:58.862 Compiler for C supports arguments -Wno-zero-length-bounds: YES (cached) 00:45:58.862 Program objdump found: YES (/bin/objdump) 00:45:58.862 Compiler for C supports arguments -mavx512f: YES (cached) 00:45:58.862 Checking if "AVX512 checking" compiles: YES (cached) 00:45:58.862 Fetching value of define "__SSE4_2__" : 1 (cached) 00:45:58.862 Fetching value of define "__AES__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX2__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512BW__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512CD__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512F__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512VL__" : 1 (cached) 00:45:58.862 Fetching value of define "__PCLMUL__" : 1 (cached) 00:45:58.862 Fetching value of define "__RDRND__" : 1 (cached) 00:45:58.862 Fetching value of define "__RDSEED__" : 1 (cached) 00:45:58.862 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:45:58.862 Fetching value of define "__znver1__" : (undefined) (cached) 00:45:58.862 Fetching value of define "__znver2__" : (undefined) (cached) 00:45:58.862 Fetching value of define "__znver3__" : (undefined) (cached) 00:45:58.862 Fetching value of define "__znver4__" : (undefined) (cached) 00:45:58.862 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:45:58.862 Message: lib/log: Defining dependency "log" 00:45:58.862 Message: lib/kvargs: Defining dependency "kvargs" 00:45:58.862 Message: lib/telemetry: Defining dependency "telemetry" 00:45:58.862 Checking for function "getentropy" : NO (cached) 00:45:58.862 Message: lib/eal: Defining dependency "eal" 00:45:58.862 Message: lib/ring: Defining dependency "ring" 00:45:58.862 Message: lib/rcu: Defining dependency "rcu" 00:45:58.862 Message: lib/mempool: Defining dependency "mempool" 00:45:58.862 Message: lib/mbuf: Defining dependency "mbuf" 00:45:58.862 Fetching value of define "__PCLMUL__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512F__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512BW__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:45:58.862 Fetching value of define "__AVX512VL__" : 1 (cached) 00:45:58.862 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:45:58.862 Compiler for C supports arguments -mpclmul: YES (cached) 00:45:58.862 Compiler for C supports arguments -maes: YES (cached) 00:45:58.862 Compiler for C supports arguments -mavx512f: YES (cached) 00:45:58.862 Compiler for C supports arguments -mavx512bw: YES (cached) 00:45:58.862 Compiler for C supports arguments -mavx512dq: YES (cached) 00:45:58.862 Compiler for C supports arguments -mavx512vl: YES (cached) 00:45:58.862 Compiler for C supports arguments -mvpclmulqdq: YES (cached) 00:45:58.862 Compiler for C supports arguments -mavx2: YES (cached) 00:45:58.862 Compiler for C supports 
arguments -mavx: YES (cached) 00:45:58.862 Message: lib/net: Defining dependency "net" 00:45:58.862 Message: lib/meter: Defining dependency "meter" 00:45:58.862 Message: lib/ethdev: Defining dependency "ethdev" 00:45:58.862 Message: lib/pci: Defining dependency "pci" 00:45:58.862 Message: lib/cmdline: Defining dependency "cmdline" 00:45:58.862 Message: lib/hash: Defining dependency "hash" 00:45:58.862 Message: lib/timer: Defining dependency "timer" 00:45:58.862 Message: lib/compressdev: Defining dependency "compressdev" 00:45:58.862 Message: lib/cryptodev: Defining dependency "cryptodev" 00:45:58.862 Message: lib/dmadev: Defining dependency "dmadev" 00:45:58.862 Compiler for C supports arguments -Wno-cast-qual: YES (cached) 00:45:58.862 Message: lib/power: Defining dependency "power" 00:45:58.862 Message: lib/reorder: Defining dependency "reorder" 00:45:58.862 Message: lib/security: Defining dependency "security" 00:45:58.862 Has header "linux/userfaultfd.h" : YES (cached) 00:45:58.862 Has header "linux/vduse.h" : NO (cached) 00:45:58.862 Message: lib/vhost: Defining dependency "vhost" 00:45:58.862 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:45:58.862 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:45:58.862 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:45:58.862 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:45:58.862 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:45:58.862 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:45:58.863 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:45:58.863 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:45:58.863 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:45:58.863 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:45:58.863 Program doxygen found: YES (/bin/doxygen) 00:45:58.863 Configuring doxy-api-html.conf using configuration 00:45:58.863 Configuring doxy-api-man.conf using configuration 00:45:58.863 Program mandb found: YES (/bin/mandb) 00:45:58.863 Program sphinx-build found: NO 00:45:58.863 Configuring rte_build_config.h using configuration 00:45:58.863 Message: 00:45:58.863 ================= 00:45:58.863 Applications Enabled 00:45:58.863 ================= 00:45:58.863 00:45:58.863 apps: 00:45:58.863 00:45:58.863 00:45:58.863 Message: 00:45:58.863 ================= 00:45:58.863 Libraries Enabled 00:45:58.863 ================= 00:45:58.863 00:45:58.863 libs: 00:45:58.863 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:45:58.863 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:45:58.863 cryptodev, dmadev, power, reorder, security, vhost, 00:45:58.863 00:45:58.863 Message: 00:45:58.863 =============== 00:45:58.863 Drivers Enabled 00:45:58.863 =============== 00:45:58.863 00:45:58.863 common: 00:45:58.863 00:45:58.863 bus: 00:45:58.863 pci, vdev, 00:45:58.863 mempool: 00:45:58.863 ring, 00:45:58.863 dma: 00:45:58.863 00:45:58.863 net: 00:45:58.863 00:45:58.863 crypto: 00:45:58.863 00:45:58.863 compress: 00:45:58.863 00:45:58.863 vdpa: 00:45:58.863 00:45:58.863 00:45:58.863 Message: 00:45:58.863 ================= 00:45:58.863 Content Skipped 00:45:58.863 ================= 00:45:58.863 00:45:58.863 apps: 00:45:58.863 dumpcap: explicitly disabled via build config 00:45:58.863 graph: explicitly disabled via build config 00:45:58.863 pdump: explicitly disabled via build config 00:45:58.863 
proc-info: explicitly disabled via build config 00:45:58.863 test-acl: explicitly disabled via build config 00:45:58.863 test-bbdev: explicitly disabled via build config 00:45:58.863 test-cmdline: explicitly disabled via build config 00:45:58.863 test-compress-perf: explicitly disabled via build config 00:45:58.863 test-crypto-perf: explicitly disabled via build config 00:45:58.863 test-dma-perf: explicitly disabled via build config 00:45:58.863 test-eventdev: explicitly disabled via build config 00:45:58.863 test-fib: explicitly disabled via build config 00:45:58.863 test-flow-perf: explicitly disabled via build config 00:45:58.863 test-gpudev: explicitly disabled via build config 00:45:58.863 test-mldev: explicitly disabled via build config 00:45:58.863 test-pipeline: explicitly disabled via build config 00:45:58.863 test-pmd: explicitly disabled via build config 00:45:58.863 test-regex: explicitly disabled via build config 00:45:58.863 test-sad: explicitly disabled via build config 00:45:58.863 test-security-perf: explicitly disabled via build config 00:45:58.863 00:45:58.863 libs: 00:45:58.863 argparse: explicitly disabled via build config 00:45:58.863 metrics: explicitly disabled via build config 00:45:58.863 acl: explicitly disabled via build config 00:45:58.863 bbdev: explicitly disabled via build config 00:45:58.863 bitratestats: explicitly disabled via build config 00:45:58.863 bpf: explicitly disabled via build config 00:45:58.863 cfgfile: explicitly disabled via build config 00:45:58.863 distributor: explicitly disabled via build config 00:45:58.863 efd: explicitly disabled via build config 00:45:58.863 eventdev: explicitly disabled via build config 00:45:58.863 dispatcher: explicitly disabled via build config 00:45:58.863 gpudev: explicitly disabled via build config 00:45:58.863 gro: explicitly disabled via build config 00:45:58.863 gso: explicitly disabled via build config 00:45:58.863 ip_frag: explicitly disabled via build config 00:45:58.863 jobstats: explicitly disabled via build config 00:45:58.863 latencystats: explicitly disabled via build config 00:45:58.863 lpm: explicitly disabled via build config 00:45:58.863 member: explicitly disabled via build config 00:45:58.863 pcapng: explicitly disabled via build config 00:45:58.863 rawdev: explicitly disabled via build config 00:45:58.863 regexdev: explicitly disabled via build config 00:45:58.863 mldev: explicitly disabled via build config 00:45:58.863 rib: explicitly disabled via build config 00:45:58.863 sched: explicitly disabled via build config 00:45:58.863 stack: explicitly disabled via build config 00:45:58.863 ipsec: explicitly disabled via build config 00:45:58.863 pdcp: explicitly disabled via build config 00:45:58.863 fib: explicitly disabled via build config 00:45:58.863 port: explicitly disabled via build config 00:45:58.863 pdump: explicitly disabled via build config 00:45:58.863 table: explicitly disabled via build config 00:45:58.863 pipeline: explicitly disabled via build config 00:45:58.863 graph: explicitly disabled via build config 00:45:58.863 node: explicitly disabled via build config 00:45:58.863 00:45:58.863 drivers: 00:45:58.863 common/cpt: not in enabled drivers build config 00:45:58.863 common/dpaax: not in enabled drivers build config 00:45:58.863 common/iavf: not in enabled drivers build config 00:45:58.863 common/idpf: not in enabled drivers build config 00:45:58.863 common/ionic: not in enabled drivers build config 00:45:58.863 common/mvep: not in enabled drivers build config 00:45:58.863 
common/octeontx: not in enabled drivers build config 00:45:58.863 bus/auxiliary: not in enabled drivers build config 00:45:58.863 bus/cdx: not in enabled drivers build config 00:45:58.863 bus/dpaa: not in enabled drivers build config 00:45:58.863 bus/fslmc: not in enabled drivers build config 00:45:58.863 bus/ifpga: not in enabled drivers build config 00:45:58.863 bus/platform: not in enabled drivers build config 00:45:58.863 bus/uacce: not in enabled drivers build config 00:45:58.863 bus/vmbus: not in enabled drivers build config 00:45:58.863 common/cnxk: not in enabled drivers build config 00:45:58.863 common/mlx5: not in enabled drivers build config 00:45:58.863 common/nfp: not in enabled drivers build config 00:45:58.863 common/nitrox: not in enabled drivers build config 00:45:58.863 common/qat: not in enabled drivers build config 00:45:58.863 common/sfc_efx: not in enabled drivers build config 00:45:58.863 mempool/bucket: not in enabled drivers build config 00:45:58.863 mempool/cnxk: not in enabled drivers build config 00:45:58.863 mempool/dpaa: not in enabled drivers build config 00:45:58.863 mempool/dpaa2: not in enabled drivers build config 00:45:58.863 mempool/octeontx: not in enabled drivers build config 00:45:58.863 mempool/stack: not in enabled drivers build config 00:45:58.863 dma/cnxk: not in enabled drivers build config 00:45:58.863 dma/dpaa: not in enabled drivers build config 00:45:58.863 dma/dpaa2: not in enabled drivers build config 00:45:58.863 dma/hisilicon: not in enabled drivers build config 00:45:58.863 dma/idxd: not in enabled drivers build config 00:45:58.863 dma/ioat: not in enabled drivers build config 00:45:58.863 dma/skeleton: not in enabled drivers build config 00:45:58.863 net/af_packet: not in enabled drivers build config 00:45:58.863 net/af_xdp: not in enabled drivers build config 00:45:58.863 net/ark: not in enabled drivers build config 00:45:58.863 net/atlantic: not in enabled drivers build config 00:45:58.863 net/avp: not in enabled drivers build config 00:45:58.863 net/axgbe: not in enabled drivers build config 00:45:58.863 net/bnx2x: not in enabled drivers build config 00:45:58.863 net/bnxt: not in enabled drivers build config 00:45:58.863 net/bonding: not in enabled drivers build config 00:45:58.863 net/cnxk: not in enabled drivers build config 00:45:58.863 net/cpfl: not in enabled drivers build config 00:45:58.863 net/cxgbe: not in enabled drivers build config 00:45:58.863 net/dpaa: not in enabled drivers build config 00:45:58.863 net/dpaa2: not in enabled drivers build config 00:45:58.863 net/e1000: not in enabled drivers build config 00:45:58.863 net/ena: not in enabled drivers build config 00:45:58.863 net/enetc: not in enabled drivers build config 00:45:58.863 net/enetfec: not in enabled drivers build config 00:45:58.863 net/enic: not in enabled drivers build config 00:45:58.863 net/failsafe: not in enabled drivers build config 00:45:58.863 net/fm10k: not in enabled drivers build config 00:45:58.863 net/gve: not in enabled drivers build config 00:45:58.863 net/hinic: not in enabled drivers build config 00:45:58.863 net/hns3: not in enabled drivers build config 00:45:58.863 net/i40e: not in enabled drivers build config 00:45:58.863 net/iavf: not in enabled drivers build config 00:45:58.863 net/ice: not in enabled drivers build config 00:45:58.863 net/idpf: not in enabled drivers build config 00:45:58.863 net/igc: not in enabled drivers build config 00:45:58.863 net/ionic: not in enabled drivers build config 00:45:58.863 net/ipn3ke: not in enabled 
drivers build config 00:45:58.863 net/ixgbe: not in enabled drivers build config 00:45:58.863 net/mana: not in enabled drivers build config 00:45:58.863 net/memif: not in enabled drivers build config 00:45:58.863 net/mlx4: not in enabled drivers build config 00:45:58.863 net/mlx5: not in enabled drivers build config 00:45:58.863 net/mvneta: not in enabled drivers build config 00:45:58.863 net/mvpp2: not in enabled drivers build config 00:45:58.863 net/netvsc: not in enabled drivers build config 00:45:58.863 net/nfb: not in enabled drivers build config 00:45:58.863 net/nfp: not in enabled drivers build config 00:45:58.863 net/ngbe: not in enabled drivers build config 00:45:58.863 net/null: not in enabled drivers build config 00:45:58.863 net/octeontx: not in enabled drivers build config 00:45:58.863 net/octeon_ep: not in enabled drivers build config 00:45:58.863 net/pcap: not in enabled drivers build config 00:45:58.863 net/pfe: not in enabled drivers build config 00:45:58.863 net/qede: not in enabled drivers build config 00:45:58.863 net/ring: not in enabled drivers build config 00:45:58.863 net/sfc: not in enabled drivers build config 00:45:58.863 net/softnic: not in enabled drivers build config 00:45:58.863 net/tap: not in enabled drivers build config 00:45:58.863 net/thunderx: not in enabled drivers build config 00:45:58.863 net/txgbe: not in enabled drivers build config 00:45:58.863 net/vdev_netvsc: not in enabled drivers build config 00:45:58.863 net/vhost: not in enabled drivers build config 00:45:58.863 net/virtio: not in enabled drivers build config 00:45:58.863 net/vmxnet3: not in enabled drivers build config 00:45:58.863 raw/*: missing internal dependency, "rawdev" 00:45:58.863 crypto/armv8: not in enabled drivers build config 00:45:58.864 crypto/bcmfs: not in enabled drivers build config 00:45:58.864 crypto/caam_jr: not in enabled drivers build config 00:45:58.864 crypto/ccp: not in enabled drivers build config 00:45:58.864 crypto/cnxk: not in enabled drivers build config 00:45:58.864 crypto/dpaa_sec: not in enabled drivers build config 00:45:58.864 crypto/dpaa2_sec: not in enabled drivers build config 00:45:58.864 crypto/ipsec_mb: not in enabled drivers build config 00:45:58.864 crypto/mlx5: not in enabled drivers build config 00:45:58.864 crypto/mvsam: not in enabled drivers build config 00:45:58.864 crypto/nitrox: not in enabled drivers build config 00:45:58.864 crypto/null: not in enabled drivers build config 00:45:58.864 crypto/octeontx: not in enabled drivers build config 00:45:58.864 crypto/openssl: not in enabled drivers build config 00:45:58.864 crypto/scheduler: not in enabled drivers build config 00:45:58.864 crypto/uadk: not in enabled drivers build config 00:45:58.864 crypto/virtio: not in enabled drivers build config 00:45:58.864 compress/isal: not in enabled drivers build config 00:45:58.864 compress/mlx5: not in enabled drivers build config 00:45:58.864 compress/nitrox: not in enabled drivers build config 00:45:58.864 compress/octeontx: not in enabled drivers build config 00:45:58.864 compress/zlib: not in enabled drivers build config 00:45:58.864 regex/*: missing internal dependency, "regexdev" 00:45:58.864 ml/*: missing internal dependency, "mldev" 00:45:58.864 vdpa/ifc: not in enabled drivers build config 00:45:58.864 vdpa/mlx5: not in enabled drivers build config 00:45:58.864 vdpa/nfp: not in enabled drivers build config 00:45:58.864 vdpa/sfc: not in enabled drivers build config 00:45:58.864 event/*: missing internal dependency, "eventdev" 00:45:58.864 
baseband/*: missing internal dependency, "bbdev" 00:45:58.864 gpu/*: missing internal dependency, "gpudev" 00:45:58.864 00:45:58.864 00:45:59.123 Cleaning... 0 files. 00:45:59.123 Build targets in project: 85 00:45:59.123 00:45:59.123 DPDK 24.03.0 00:45:59.123 00:45:59.123 User defined options 00:45:59.123 default_library : shared 00:45:59.123 libdir : lib 00:45:59.123 prefix : /usr/local 00:45:59.123 c_args : -Wno-stringop-overflow -fcommon -fPIC -Wno-error 00:45:59.123 c_link_args : 00:45:59.123 cpu_instruction_set: native 00:45:59.123 disable_apps : dumpcap,graph,pdump,proc-info,test-acl,test-bbdev,test-cmdline,test-compress-perf,test-crypto-perf,test-dma-perf,test-eventdev,test-fib,test-flow-perf,test-gpudev,test-mldev,test-pipeline,test-pmd,test-regex,test-sad,test-security-perf,test 00:45:59.123 disable_libs : acl,argparse,bbdev,bitratestats,bpf,cfgfile,dispatcher,distributor,efd,eventdev,fib,gpudev,graph,gro,gso,ip_frag,ipsec,jobstats,latencystats,lpm,member,metrics,mldev,node,pcapng,pdcp,pdump,pipeline,port,rawdev,regexdev,rib,sched,stack,table 00:45:59.123 enable_docs : false 00:45:59.123 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:45:59.123 enable_kmods : false 00:45:59.123 tests : false 00:45:59.123 00:45:59.123 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:45:59.697 Installing subdir /home/vagrant/spdk_repo/spdk/dpdk/examples to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bbdev_app/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bbdev_app 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bbdev_app/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bbdev_app 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bond/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bond 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bond/commands.list to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bond 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bond/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bond 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bpf/README to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bpf 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bpf/dummy.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bpf 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bpf/t1.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bpf 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bpf/t2.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bpf 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/bpf/t3.c to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/bpf 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/cmdline/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/cmdline/commands.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/cmdline/commands.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/cmdline/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/cmdline/parse_obj_list.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/cmdline/parse_obj_list.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/common/pkt_group.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/common 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/common/altivec/port_group.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/common/altivec 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/common/neon/port_group.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/common/neon 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/common/sse/port_group.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/common/sse 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/distributor/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/distributor 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/distributor/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/distributor 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/dma/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/dma 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/dma/dmafwd.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/dma 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/ethtool-app/Makefile to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool/ethtool-app 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/ethtool-app/ethapp.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool/ethtool-app 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/ethtool-app/ethapp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool/ethtool-app 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/ethtool-app/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool/ethtool-app 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/lib/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool/lib 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/lib/rte_ethtool.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool/lib 00:45:59.697 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ethtool/lib/rte_ethtool.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ethtool/lib 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/eventdev_pipeline/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/eventdev_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/eventdev_pipeline/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/eventdev_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/eventdev_pipeline/pipeline_common.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/eventdev_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/eventdev_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/eventdev_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_dev_self_test.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_dev_self_test.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation.c to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_aes.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_ccm.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_cmac.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_gcm.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_hmac.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_rsa.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_sha.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_tdes.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/fips_validation_xts.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/fips_validation/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/fips_validation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/flow_filtering/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/flow_filtering 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/flow_filtering/flow_blocks.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/flow_filtering 00:45:59.698 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/flow_filtering/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/flow_filtering 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/helloworld/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/helloworld 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/helloworld/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/helloworld 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_fragmentation/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_fragmentation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_fragmentation/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_fragmentation 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/action.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/action.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/cli.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/cli.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/common.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/conn.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/conn.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/cryptodev.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/cryptodev.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/link.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/link.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/mempool.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/mempool.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/parser.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/parser.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/pipeline.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/pipeline.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/swq.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/swq.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/tap.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/tap.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/thread.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/thread.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/tmgr.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/tmgr.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:45:59.698 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/firewall.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/flow.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/route.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/rss.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.698 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_pipeline/examples/tap.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_reassembly/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_reassembly 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ip_reassembly/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ip_reassembly 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ep0.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ep1.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/esp.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/esp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/event_helper.c to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/event_helper.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/flow.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/flow.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipip.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec_neon.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec_process.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec_worker.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/ipsec_worker.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/parser.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/parser.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/rt.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/sa.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/sad.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/sad.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/sp4.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/sp6.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/linux_test.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/load_env.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/pkttest.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/pkttest.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/run_test.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipv4_multicast/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipv4_multicast 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ipv4_multicast/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ipv4_multicast 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-cat/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-cat 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-cat/cat.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-cat 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-cat/cat.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-cat 00:45:59.699 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-cat 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-crypto/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-crypto 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-crypto/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-crypto 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_common.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_common.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_event.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 
Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_event.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_poll.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/l2fwd_poll.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-event/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-event 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-jobstats/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-jobstats 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-jobstats/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-jobstats 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-keepalive/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-keepalive 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-keepalive/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-keepalive 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-keepalive/shm.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-keepalive 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-keepalive/shm.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-keepalive 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-macsec/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-macsec 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd-macsec/main.c to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd-macsec 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l2fwd/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l2fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd-graph/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd-graph 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd-graph/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd-graph 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd-power/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd-power/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd-power/main.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd-power/perf_core.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd-power/perf_core.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/em_default_v4.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/em_default_v6.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/em_route_parse.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_acl.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_acl.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_altivec.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_common.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_em.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_em.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_event.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_event.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_event_generic.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_fib.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_lpm.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_lpm.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_neon.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_route.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/l3fwd_sse.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/lpm_default_v4.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/lpm_default_v6.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/lpm_route_parse.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/l3fwd/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/l3fwd 00:45:59.700 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/link_status_interrupt/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/link_status_interrupt 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/link_status_interrupt/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/link_status_interrupt 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:45:59.701 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/client_server_mp/shared/common.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/client_server_mp/shared 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/hotplug_mp/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/hotplug_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/hotplug_mp/commands.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/hotplug_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/hotplug_mp/commands.list to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/hotplug_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/hotplug_mp/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/hotplug_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/simple_mp/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/simple_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/simple_mp/commands.list to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/simple_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/simple_mp/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/simple_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/simple_mp/mp_commands.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/simple_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/simple_mp/mp_commands.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/simple_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/symmetric_mp/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/symmetric_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/multi_process/symmetric_mp/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/multi_process/symmetric_mp 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ntb/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ntb 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ntb/commands.list to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ntb 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ntb/ntb_fwd.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ntb 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/packet_ordering/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/packet_ordering 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/packet_ordering/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/packet_ordering 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/cli.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/cli.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/conn.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/conn.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/obj.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/obj.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/thread.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/thread.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/ethdev.io to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/fib.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/fib.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/fib_routing_table.txt to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/hash_func.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/hash_func.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/ipsec.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.701 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/ipsec.io to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/ipsec.spec to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/ipsec_sa.txt to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/ipv6_addr_swap.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/ipv6_addr_swap.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/l2fwd.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/l2fwd.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/learner.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/learner.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/meter.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/meter.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/mirroring.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/mirroring.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/packet.txt to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/pcap.io to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/recirculation.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/recirculation.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/registers.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/registers.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/rss.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/rss.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/selector.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/selector.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/selector.txt to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/varbit.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/varbit.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/vxlan.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/vxlan.spec to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/vxlan_table.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/pipeline/examples/vxlan_table.txt to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/pipeline/examples 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ptpclient/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ptpclient 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/ptpclient/ptpclient.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/ptpclient 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_meter/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_meter/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_meter/main.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_meter/rte_policer.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_meter/rte_policer.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/app_thread.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/args.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/cfg_file.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/cfg_file.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 
00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/cmdline.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/init.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/main.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/profile.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/profile_ov.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/profile_pie.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/profile_red.cfg to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/qos_sched/stats.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/rxtx_callbacks/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/rxtx_callbacks 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/rxtx_callbacks/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/rxtx_callbacks 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd 00:45:59.702 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_node/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_node 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_node/node.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_node 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_server/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_server 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_server/args.c to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_server 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_server/args.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_server 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_server/init.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_server 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_server/init.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_server 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/efd_server/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/efd_server 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/server_node_efd/shared/common.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/server_node_efd/shared 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/service_cores/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/service_cores 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/service_cores/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/service_cores 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/skeleton/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/skeleton 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/skeleton/basicfwd.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/skeleton 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/timer/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/timer 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/timer/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/timer 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vdpa/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vdpa 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vdpa/commands.list to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vdpa 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vdpa/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vdpa 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vdpa/vdpa_blk_compact.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vdpa 00:45:59.703 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/vhost/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost/main.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost/virtio_net.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_blk/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_blk 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_blk/blk.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_blk 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_blk/blk_spec.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_blk 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_blk/vhost_blk.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_blk 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_blk/vhost_blk.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_blk 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_blk/vhost_blk_compat.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_blk 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_crypto/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_crypto 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vhost_crypto/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vhost_crypto 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/channel_manager.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/channel_manager.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/channel_monitor.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/channel_monitor.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/oob_monitor.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/parse.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/parse.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/power_manager.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/power_manager.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/vm_power_cli.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/vm_power_cli.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/guest_cli/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/guest_cli/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/guest_cli/parse.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/guest_cli/parse.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vmdq/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vmdq 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vmdq/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vmdq 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vmdq_dcb/Makefile to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vmdq_dcb 00:45:59.703 Installing /home/vagrant/spdk_repo/spdk/dpdk/examples/vmdq_dcb/main.c to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/dpdk/examples/vmdq_dcb 00:45:59.703 Installing lib/librte_log.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_log.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_kvargs.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_kvargs.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_telemetry.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_telemetry.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_eal.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_eal.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_ring.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_ring.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.703 Installing lib/librte_rcu.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_rcu.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_mempool.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_mempool.so.24.1 to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_mbuf.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_mbuf.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_net.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_net.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_meter.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_meter.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_ethdev.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_ethdev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_pci.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_pci.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_cmdline.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_cmdline.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_hash.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_hash.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_timer.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_timer.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_compressdev.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_compressdev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_cryptodev.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_cryptodev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_dmadev.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_dmadev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_power.a to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_power.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_reorder.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_reorder.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_security.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_security.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_vhost.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing lib/librte_vhost.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing drivers/librte_bus_pci.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing drivers/librte_bus_pci.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1 00:45:59.704 Installing drivers/librte_bus_vdev.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing drivers/librte_bus_vdev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1 00:45:59.704 Installing drivers/librte_mempool_ring.a to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib 00:45:59.704 Installing drivers/librte_mempool_ring.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/config/rte_config.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/log/rte_log.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/kvargs/rte_kvargs.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/telemetry/rte_telemetry.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_atomic.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_byteorder.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_cpuflags.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_cycles.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_io.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_memcpy.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_pause.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_prefetch.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_rwlock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_spinlock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/generic/rte_vect.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/generic 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_atomic.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_byteorder.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_cpuflags.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_cycles.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_io.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_memcpy.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_pause.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_prefetch.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_rtm.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_rwlock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_spinlock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_vect.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_atomic_32.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_atomic_64.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_alarm.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_bitmap.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_bitops.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_branch_prediction.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_bus.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_class.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.704 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_common.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_compat.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_debug.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_dev.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_devargs.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_eal.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_eal_memconfig.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_eal_trace.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_errno.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_epoll.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_fbarray.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_hexdump.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_hypervisor.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_interrupts.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_keepalive.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_launch.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_lcore.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_lock_annotations.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_malloc.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_mcslock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_memory.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_memzone.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_pci_dev_features.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_per_lcore.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_pflock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_random.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_reciprocal.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_seqcount.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_seqlock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_service.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_service_component.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_stdatomic.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_string_fns.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_tailq.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_thread.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_ticketlock.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_time.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_trace.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_trace_point.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_trace_point_register.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_uuid.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_version.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/include/rte_vfio.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/eal/linux/include/rte_os.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_core.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_elem.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_elem_pvt.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_c11_pvt.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_generic_pvt.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_hts.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_peek.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_peek_zc.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_rts.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/rcu/rte_rcu_qsbr.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/mempool/rte_mempool.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/mempool/rte_mempool_trace_fp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/mbuf/rte_mbuf.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/mbuf/rte_mbuf_core.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/mbuf/rte_mbuf_ptype.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/mbuf/rte_mbuf_dyn.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_ip.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_tcp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_udp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_tls.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_dtls.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_esp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_sctp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.705 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_icmp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_arp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_ether.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_macsec.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_vxlan.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_gre.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_gtp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_net.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_net_crc.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_mpls.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_higig.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_ecpri.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_pdcp_hdr.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_geneve.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_l2tpv2.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_ppp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/net/rte_ib.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/meter/rte_meter.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_cman.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_ethdev.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_dev_info.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_flow.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_flow_driver.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_mtr.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_mtr_driver.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_tm.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_tm_driver.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_ethdev_core.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/ethdev/rte_eth_ctrl.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/pci/rte_pci.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_parse.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_parse_num.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_parse_string.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_rdline.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing 
/home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_vt100.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_socket.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_cirbuf.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cmdline/cmdline_parse_portlist.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_fbk_hash.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_hash_crc.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_hash.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_jhash.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_thash.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_thash_gfni.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_crc_arm64.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_crc_generic.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_crc_sw.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_crc_x86.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/hash/rte_thash_x86_gfni.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/timer/rte_timer.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/compressdev/rte_compressdev.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/compressdev/rte_comp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cryptodev/rte_cryptodev.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cryptodev/rte_crypto.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cryptodev/rte_crypto_sym.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.706 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cryptodev/rte_crypto_asym.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/cryptodev/rte_cryptodev_core.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/dmadev/rte_dmadev.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/dmadev/rte_dmadev_core.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/dmadev/rte_dmadev_trace_fp.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/power/rte_power.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/power/rte_power_guest_channel.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/power/rte_power_pmd_mgmt.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/power/rte_power_uncore.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/reorder/rte_reorder.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/security/rte_security.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/security/rte_security_driver.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/vhost/rte_vdpa.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/vhost/rte_vhost.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/vhost/rte_vhost_async.h to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/lib/vhost/rte_vhost_crypto.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/drivers/bus/pci/rte_bus_pci.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/buildtools/dpdk-cmdline-gen.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/usertools/dpdk-devbind.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/usertools/dpdk-pmdinfo.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/usertools/dpdk-telemetry.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/usertools/dpdk-hugepages.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/usertools/dpdk-rss-flows.py to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/build-tmp/rte_build_config.h to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig 00:45:59.707 Installing /home/vagrant/spdk_repo/spdk/dpdk/build-tmp/meson-private/libdpdk.pc to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig 00:45:59.707 Installing symlink pointing to librte_log.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_log.so.24 00:45:59.707 Installing symlink pointing to librte_log.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_log.so 00:45:59.707 Installing symlink pointing to librte_kvargs.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_kvargs.so.24 00:45:59.707 Installing symlink pointing to librte_kvargs.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_kvargs.so 00:45:59.707 Installing symlink pointing to librte_telemetry.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_telemetry.so.24 00:45:59.707 Installing symlink pointing to librte_telemetry.so.24 to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_telemetry.so 00:45:59.707 Installing symlink pointing to librte_eal.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_eal.so.24 00:45:59.707 Installing symlink pointing to librte_eal.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_eal.so 00:45:59.707 Installing symlink pointing to librte_ring.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_ring.so.24 00:45:59.707 Installing symlink pointing to librte_ring.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_ring.so 00:45:59.707 Installing symlink pointing to librte_rcu.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_rcu.so.24 00:45:59.707 Installing symlink pointing to librte_rcu.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_rcu.so 00:45:59.707 Installing symlink pointing to librte_mempool.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_mempool.so.24 00:45:59.707 Installing symlink pointing to librte_mempool.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_mempool.so 00:45:59.707 Installing symlink pointing to librte_mbuf.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_mbuf.so.24 00:45:59.707 Installing symlink pointing to librte_mbuf.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_mbuf.so 00:45:59.707 Installing symlink pointing to librte_net.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_net.so.24 00:45:59.707 Installing symlink pointing to librte_net.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_net.so 00:45:59.966 './librte_bus_pci.so' -> 'dpdk/pmds-24.1/librte_bus_pci.so' 00:45:59.966 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.1/librte_bus_pci.so.24' 00:45:59.966 './librte_bus_pci.so.24.1' -> 'dpdk/pmds-24.1/librte_bus_pci.so.24.1' 00:45:59.966 './librte_bus_vdev.so' -> 'dpdk/pmds-24.1/librte_bus_vdev.so' 00:45:59.966 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.1/librte_bus_vdev.so.24' 00:45:59.966 './librte_bus_vdev.so.24.1' -> 'dpdk/pmds-24.1/librte_bus_vdev.so.24.1' 00:45:59.966 './librte_mempool_ring.so' -> 'dpdk/pmds-24.1/librte_mempool_ring.so' 00:45:59.966 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.1/librte_mempool_ring.so.24' 00:45:59.966 './librte_mempool_ring.so.24.1' -> 'dpdk/pmds-24.1/librte_mempool_ring.so.24.1' 00:45:59.966 Installing symlink pointing to librte_meter.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_meter.so.24 00:45:59.966 Installing symlink pointing to librte_meter.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_meter.so 00:45:59.966 Installing symlink pointing to librte_ethdev.so.24.1 to 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_ethdev.so.24 00:45:59.966 Installing symlink pointing to librte_ethdev.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_ethdev.so 00:45:59.966 Installing symlink pointing to librte_pci.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_pci.so.24 00:45:59.966 Installing symlink pointing to librte_pci.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_pci.so 00:45:59.966 Installing symlink pointing to librte_cmdline.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_cmdline.so.24 00:45:59.966 Installing symlink pointing to librte_cmdline.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_cmdline.so 00:45:59.966 Installing symlink pointing to librte_hash.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_hash.so.24 00:45:59.966 Installing symlink pointing to librte_hash.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_hash.so 00:45:59.966 Installing symlink pointing to librte_timer.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_timer.so.24 00:45:59.966 Installing symlink pointing to librte_timer.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_timer.so 00:45:59.966 Installing symlink pointing to librte_compressdev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_compressdev.so.24 00:45:59.966 Installing symlink pointing to librte_compressdev.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_compressdev.so 00:45:59.966 Installing symlink pointing to librte_cryptodev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_cryptodev.so.24 00:45:59.966 Installing symlink pointing to librte_cryptodev.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_cryptodev.so 00:45:59.966 Installing symlink pointing to librte_dmadev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_dmadev.so.24 00:45:59.966 Installing symlink pointing to librte_dmadev.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_dmadev.so 00:45:59.966 Installing symlink pointing to librte_power.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_power.so.24 00:45:59.966 Installing symlink pointing to librte_power.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_power.so 00:45:59.966 Installing symlink pointing to librte_reorder.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_reorder.so.24 00:45:59.966 Installing 
symlink pointing to librte_reorder.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_reorder.so 00:45:59.966 Installing symlink pointing to librte_security.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_security.so.24 00:45:59.966 Installing symlink pointing to librte_security.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_security.so 00:45:59.966 Installing symlink pointing to librte_vhost.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_vhost.so.24 00:45:59.966 Installing symlink pointing to librte_vhost.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/librte_vhost.so 00:45:59.966 Installing symlink pointing to librte_bus_pci.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_pci.so.24 00:45:59.966 Installing symlink pointing to librte_bus_pci.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_pci.so 00:45:59.966 Installing symlink pointing to librte_bus_vdev.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_vdev.so.24 00:45:59.966 Installing symlink pointing to librte_bus_vdev.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_vdev.so 00:45:59.966 Installing symlink pointing to librte_mempool_ring.so.24.1 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_mempool_ring.so.24 00:45:59.966 Installing symlink pointing to librte_mempool_ring.so.24 to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_mempool_ring.so 00:45:59.966 Running custom install script '/bin/sh /home/vagrant/spdk_repo/spdk/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.1' 00:46:02.500 The Meson build system 00:46:02.500 Version: 1.4.0 00:46:02.500 Source dir: /home/vagrant/spdk_repo/spdk/dpdk 00:46:02.500 Build dir: /home/vagrant/spdk_repo/spdk/dpdk/build-tmp 00:46:02.500 Build type: native build 00:46:02.500 Program cat found: YES (/bin/cat) 00:46:02.500 Project name: DPDK 00:46:02.500 Project version: 24.03.0 00:46:02.500 C compiler for the host machine: cc (gcc 11.4.1 "cc (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)") 00:46:02.500 C linker for the host machine: cc ld.bfd 2.35.2-42 00:46:02.500 Host machine cpu family: x86_64 00:46:02.500 Host machine cpu: x86_64 00:46:02.500 Message: ## Building in Developer Mode ## 00:46:02.500 Program pkg-config found: YES (/bin/pkg-config) 00:46:02.500 Program check-symbols.sh found: YES (/home/vagrant/spdk_repo/spdk/dpdk/buildtools/check-symbols.sh) 00:46:02.500 Program options-ibverbs-static.sh found: YES (/home/vagrant/spdk_repo/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:46:02.500 Program python3 found: YES (/usr/bin/python3) 00:46:02.500 Program cat found: YES (/bin/cat) 00:46:02.500 Compiler for C supports arguments -march=native: YES (cached) 00:46:02.500 Checking for size of "void *" : 8 (cached) 00:46:02.500 Checking 
for size of "void *" : 8 (cached) 00:46:02.500 Compiler for C supports link arguments -Wl,--undefined-version: NO (cached) 00:46:02.500 Library m found: YES 00:46:02.500 Library numa found: YES 00:46:02.500 Has header "numaif.h" : YES (cached) 00:46:02.500 Library fdt found: NO 00:46:02.500 Library execinfo found: NO 00:46:02.500 Has header "execinfo.h" : YES (cached) 00:46:02.500 Found pkg-config: YES (/bin/pkg-config) 1.7.3 00:46:02.500 Run-time dependency libarchive found: NO (tried pkgconfig) 00:46:02.500 Run-time dependency libbsd found: NO (tried pkgconfig) 00:46:02.500 Run-time dependency jansson found: NO (tried pkgconfig) 00:46:02.500 Dependency openssl found: YES 3.0.7 (cached) 00:46:02.500 Run-time dependency libpcap found: NO (tried pkgconfig) 00:46:02.500 Library pcap found: NO 00:46:02.500 Compiler for C supports arguments -Wcast-qual: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wdeprecated: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wformat: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wformat-nonliteral: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wformat-security: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wmissing-declarations: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wmissing-prototypes: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wnested-externs: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wold-style-definition: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wpointer-arith: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wsign-compare: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wstrict-prototypes: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wundef: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wwrite-strings: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wno-address-of-packed-member: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wno-packed-not-aligned: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wno-missing-field-initializers: YES (cached) 00:46:02.500 Compiler for C supports arguments -Wno-zero-length-bounds: YES (cached) 00:46:02.500 Program objdump found: YES (/bin/objdump) 00:46:02.500 Compiler for C supports arguments -mavx512f: YES (cached) 00:46:02.500 Checking if "AVX512 checking" compiles: YES (cached) 00:46:02.500 Fetching value of define "__SSE4_2__" : 1 (cached) 00:46:02.500 Fetching value of define "__AES__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX2__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512BW__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512CD__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512F__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512VL__" : 1 (cached) 00:46:02.500 Fetching value of define "__PCLMUL__" : 1 (cached) 00:46:02.500 Fetching value of define "__RDRND__" : 1 (cached) 00:46:02.500 Fetching value of define "__RDSEED__" : 1 (cached) 00:46:02.500 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:46:02.500 Fetching value of define "__znver1__" : (undefined) (cached) 00:46:02.500 Fetching value of define "__znver2__" : (undefined) (cached) 00:46:02.500 Fetching value of define "__znver3__" : (undefined) (cached) 00:46:02.500 Fetching value of define "__znver4__" : 
(undefined) (cached) 00:46:02.500 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:46:02.500 Message: lib/log: Defining dependency "log" 00:46:02.500 Message: lib/kvargs: Defining dependency "kvargs" 00:46:02.500 Message: lib/telemetry: Defining dependency "telemetry" 00:46:02.500 Checking for function "getentropy" : NO (cached) 00:46:02.500 Message: lib/eal: Defining dependency "eal" 00:46:02.500 Message: lib/ring: Defining dependency "ring" 00:46:02.500 Message: lib/rcu: Defining dependency "rcu" 00:46:02.500 Message: lib/mempool: Defining dependency "mempool" 00:46:02.500 Message: lib/mbuf: Defining dependency "mbuf" 00:46:02.500 Fetching value of define "__PCLMUL__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512F__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512BW__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:46:02.500 Fetching value of define "__AVX512VL__" : 1 (cached) 00:46:02.500 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:46:02.500 Compiler for C supports arguments -mpclmul: YES (cached) 00:46:02.500 Compiler for C supports arguments -maes: YES (cached) 00:46:02.500 Compiler for C supports arguments -mavx512f: YES (cached) 00:46:02.500 Compiler for C supports arguments -mavx512bw: YES (cached) 00:46:02.500 Compiler for C supports arguments -mavx512dq: YES (cached) 00:46:02.500 Compiler for C supports arguments -mavx512vl: YES (cached) 00:46:02.500 Compiler for C supports arguments -mvpclmulqdq: YES (cached) 00:46:02.500 Compiler for C supports arguments -mavx2: YES (cached) 00:46:02.500 Compiler for C supports arguments -mavx: YES (cached) 00:46:02.500 Message: lib/net: Defining dependency "net" 00:46:02.500 Message: lib/meter: Defining dependency "meter" 00:46:02.500 Message: lib/ethdev: Defining dependency "ethdev" 00:46:02.500 Message: lib/pci: Defining dependency "pci" 00:46:02.500 Message: lib/cmdline: Defining dependency "cmdline" 00:46:02.500 Message: lib/hash: Defining dependency "hash" 00:46:02.500 Message: lib/timer: Defining dependency "timer" 00:46:02.500 Message: lib/compressdev: Defining dependency "compressdev" 00:46:02.500 Message: lib/cryptodev: Defining dependency "cryptodev" 00:46:02.500 Message: lib/dmadev: Defining dependency "dmadev" 00:46:02.500 Compiler for C supports arguments -Wno-cast-qual: YES (cached) 00:46:02.500 Message: lib/power: Defining dependency "power" 00:46:02.500 Message: lib/reorder: Defining dependency "reorder" 00:46:02.500 Message: lib/security: Defining dependency "security" 00:46:02.500 Has header "linux/userfaultfd.h" : YES (cached) 00:46:02.500 Has header "linux/vduse.h" : NO (cached) 00:46:02.500 Message: lib/vhost: Defining dependency "vhost" 00:46:02.500 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:46:02.500 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:46:02.500 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:46:02.500 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:46:02.500 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:46:02.500 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:46:02.501 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:46:02.501 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:46:02.501 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:46:02.501 Message: Disabling gpu/* drivers: missing 
internal dependency "gpudev" 00:46:02.501 Program doxygen found: YES (/bin/doxygen) 00:46:02.501 Configuring doxy-api-html.conf using configuration 00:46:02.501 Configuring doxy-api-man.conf using configuration 00:46:02.501 Program mandb found: YES (/bin/mandb) 00:46:02.501 Program sphinx-build found: NO 00:46:02.501 Configuring rte_build_config.h using configuration 00:46:02.501 Message: 00:46:02.501 ================= 00:46:02.501 Applications Enabled 00:46:02.501 ================= 00:46:02.501 00:46:02.501 apps: 00:46:02.501 00:46:02.501 00:46:02.501 Message: 00:46:02.501 ================= 00:46:02.501 Libraries Enabled 00:46:02.501 ================= 00:46:02.501 00:46:02.501 libs: 00:46:02.501 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:46:02.501 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:46:02.501 cryptodev, dmadev, power, reorder, security, vhost, 00:46:02.501 00:46:02.501 Message: 00:46:02.501 =============== 00:46:02.501 Drivers Enabled 00:46:02.501 =============== 00:46:02.501 00:46:02.501 common: 00:46:02.501 00:46:02.501 bus: 00:46:02.501 pci, vdev, 00:46:02.501 mempool: 00:46:02.501 ring, 00:46:02.501 dma: 00:46:02.501 00:46:02.501 net: 00:46:02.501 00:46:02.501 crypto: 00:46:02.501 00:46:02.501 compress: 00:46:02.501 00:46:02.501 vdpa: 00:46:02.501 00:46:02.501 00:46:02.501 Message: 00:46:02.501 ================= 00:46:02.501 Content Skipped 00:46:02.501 ================= 00:46:02.501 00:46:02.501 apps: 00:46:02.501 dumpcap: explicitly disabled via build config 00:46:02.501 graph: explicitly disabled via build config 00:46:02.501 pdump: explicitly disabled via build config 00:46:02.501 proc-info: explicitly disabled via build config 00:46:02.501 test-acl: explicitly disabled via build config 00:46:02.501 test-bbdev: explicitly disabled via build config 00:46:02.501 test-cmdline: explicitly disabled via build config 00:46:02.501 test-compress-perf: explicitly disabled via build config 00:46:02.501 test-crypto-perf: explicitly disabled via build config 00:46:02.501 test-dma-perf: explicitly disabled via build config 00:46:02.501 test-eventdev: explicitly disabled via build config 00:46:02.501 test-fib: explicitly disabled via build config 00:46:02.501 test-flow-perf: explicitly disabled via build config 00:46:02.501 test-gpudev: explicitly disabled via build config 00:46:02.501 test-mldev: explicitly disabled via build config 00:46:02.501 test-pipeline: explicitly disabled via build config 00:46:02.501 test-pmd: explicitly disabled via build config 00:46:02.501 test-regex: explicitly disabled via build config 00:46:02.501 test-sad: explicitly disabled via build config 00:46:02.501 test-security-perf: explicitly disabled via build config 00:46:02.501 00:46:02.501 libs: 00:46:02.501 argparse: explicitly disabled via build config 00:46:02.501 metrics: explicitly disabled via build config 00:46:02.501 acl: explicitly disabled via build config 00:46:02.501 bbdev: explicitly disabled via build config 00:46:02.501 bitratestats: explicitly disabled via build config 00:46:02.501 bpf: explicitly disabled via build config 00:46:02.501 cfgfile: explicitly disabled via build config 00:46:02.501 distributor: explicitly disabled via build config 00:46:02.501 efd: explicitly disabled via build config 00:46:02.501 eventdev: explicitly disabled via build config 00:46:02.501 dispatcher: explicitly disabled via build config 00:46:02.501 gpudev: explicitly disabled via build config 00:46:02.501 gro: explicitly disabled via build config 00:46:02.501 gso: explicitly 
disabled via build config 00:46:02.501 ip_frag: explicitly disabled via build config 00:46:02.501 jobstats: explicitly disabled via build config 00:46:02.501 latencystats: explicitly disabled via build config 00:46:02.501 lpm: explicitly disabled via build config 00:46:02.501 member: explicitly disabled via build config 00:46:02.501 pcapng: explicitly disabled via build config 00:46:02.501 rawdev: explicitly disabled via build config 00:46:02.501 regexdev: explicitly disabled via build config 00:46:02.501 mldev: explicitly disabled via build config 00:46:02.501 rib: explicitly disabled via build config 00:46:02.501 sched: explicitly disabled via build config 00:46:02.501 stack: explicitly disabled via build config 00:46:02.501 ipsec: explicitly disabled via build config 00:46:02.501 pdcp: explicitly disabled via build config 00:46:02.501 fib: explicitly disabled via build config 00:46:02.501 port: explicitly disabled via build config 00:46:02.501 pdump: explicitly disabled via build config 00:46:02.501 table: explicitly disabled via build config 00:46:02.501 pipeline: explicitly disabled via build config 00:46:02.501 graph: explicitly disabled via build config 00:46:02.501 node: explicitly disabled via build config 00:46:02.501 00:46:02.501 drivers: 00:46:02.501 common/cpt: not in enabled drivers build config 00:46:02.501 common/dpaax: not in enabled drivers build config 00:46:02.501 common/iavf: not in enabled drivers build config 00:46:02.501 common/idpf: not in enabled drivers build config 00:46:02.501 common/ionic: not in enabled drivers build config 00:46:02.501 common/mvep: not in enabled drivers build config 00:46:02.501 common/octeontx: not in enabled drivers build config 00:46:02.501 bus/auxiliary: not in enabled drivers build config 00:46:02.501 bus/cdx: not in enabled drivers build config 00:46:02.501 bus/dpaa: not in enabled drivers build config 00:46:02.501 bus/fslmc: not in enabled drivers build config 00:46:02.501 bus/ifpga: not in enabled drivers build config 00:46:02.501 bus/platform: not in enabled drivers build config 00:46:02.501 bus/uacce: not in enabled drivers build config 00:46:02.501 bus/vmbus: not in enabled drivers build config 00:46:02.501 common/cnxk: not in enabled drivers build config 00:46:02.501 common/mlx5: not in enabled drivers build config 00:46:02.501 common/nfp: not in enabled drivers build config 00:46:02.501 common/nitrox: not in enabled drivers build config 00:46:02.501 common/qat: not in enabled drivers build config 00:46:02.501 common/sfc_efx: not in enabled drivers build config 00:46:02.501 mempool/bucket: not in enabled drivers build config 00:46:02.501 mempool/cnxk: not in enabled drivers build config 00:46:02.501 mempool/dpaa: not in enabled drivers build config 00:46:02.501 mempool/dpaa2: not in enabled drivers build config 00:46:02.501 mempool/octeontx: not in enabled drivers build config 00:46:02.501 mempool/stack: not in enabled drivers build config 00:46:02.501 dma/cnxk: not in enabled drivers build config 00:46:02.501 dma/dpaa: not in enabled drivers build config 00:46:02.501 dma/dpaa2: not in enabled drivers build config 00:46:02.501 dma/hisilicon: not in enabled drivers build config 00:46:02.501 dma/idxd: not in enabled drivers build config 00:46:02.501 dma/ioat: not in enabled drivers build config 00:46:02.501 dma/skeleton: not in enabled drivers build config 00:46:02.501 net/af_packet: not in enabled drivers build config 00:46:02.501 net/af_xdp: not in enabled drivers build config 00:46:02.501 net/ark: not in enabled drivers build 
config 00:46:02.501 net/atlantic: not in enabled drivers build config 00:46:02.501 net/avp: not in enabled drivers build config 00:46:02.501 net/axgbe: not in enabled drivers build config 00:46:02.501 net/bnx2x: not in enabled drivers build config 00:46:02.501 net/bnxt: not in enabled drivers build config 00:46:02.501 net/bonding: not in enabled drivers build config 00:46:02.501 net/cnxk: not in enabled drivers build config 00:46:02.501 net/cpfl: not in enabled drivers build config 00:46:02.501 net/cxgbe: not in enabled drivers build config 00:46:02.501 net/dpaa: not in enabled drivers build config 00:46:02.501 net/dpaa2: not in enabled drivers build config 00:46:02.501 net/e1000: not in enabled drivers build config 00:46:02.501 net/ena: not in enabled drivers build config 00:46:02.501 net/enetc: not in enabled drivers build config 00:46:02.501 net/enetfec: not in enabled drivers build config 00:46:02.501 net/enic: not in enabled drivers build config 00:46:02.501 net/failsafe: not in enabled drivers build config 00:46:02.501 net/fm10k: not in enabled drivers build config 00:46:02.501 net/gve: not in enabled drivers build config 00:46:02.501 net/hinic: not in enabled drivers build config 00:46:02.501 net/hns3: not in enabled drivers build config 00:46:02.501 net/i40e: not in enabled drivers build config 00:46:02.501 net/iavf: not in enabled drivers build config 00:46:02.501 net/ice: not in enabled drivers build config 00:46:02.501 net/idpf: not in enabled drivers build config 00:46:02.501 net/igc: not in enabled drivers build config 00:46:02.501 net/ionic: not in enabled drivers build config 00:46:02.501 net/ipn3ke: not in enabled drivers build config 00:46:02.501 net/ixgbe: not in enabled drivers build config 00:46:02.501 net/mana: not in enabled drivers build config 00:46:02.501 net/memif: not in enabled drivers build config 00:46:02.501 net/mlx4: not in enabled drivers build config 00:46:02.501 net/mlx5: not in enabled drivers build config 00:46:02.501 net/mvneta: not in enabled drivers build config 00:46:02.501 net/mvpp2: not in enabled drivers build config 00:46:02.501 net/netvsc: not in enabled drivers build config 00:46:02.501 net/nfb: not in enabled drivers build config 00:46:02.501 net/nfp: not in enabled drivers build config 00:46:02.501 net/ngbe: not in enabled drivers build config 00:46:02.501 net/null: not in enabled drivers build config 00:46:02.501 net/octeontx: not in enabled drivers build config 00:46:02.501 net/octeon_ep: not in enabled drivers build config 00:46:02.501 net/pcap: not in enabled drivers build config 00:46:02.501 net/pfe: not in enabled drivers build config 00:46:02.501 net/qede: not in enabled drivers build config 00:46:02.501 net/ring: not in enabled drivers build config 00:46:02.501 net/sfc: not in enabled drivers build config 00:46:02.501 net/softnic: not in enabled drivers build config 00:46:02.501 net/tap: not in enabled drivers build config 00:46:02.501 net/thunderx: not in enabled drivers build config 00:46:02.501 net/txgbe: not in enabled drivers build config 00:46:02.502 net/vdev_netvsc: not in enabled drivers build config 00:46:02.502 net/vhost: not in enabled drivers build config 00:46:02.502 net/virtio: not in enabled drivers build config 00:46:02.502 net/vmxnet3: not in enabled drivers build config 00:46:02.502 raw/*: missing internal dependency, "rawdev" 00:46:02.502 crypto/armv8: not in enabled drivers build config 00:46:02.502 crypto/bcmfs: not in enabled drivers build config 00:46:02.502 crypto/caam_jr: not in enabled drivers build config 
00:46:02.502 crypto/ccp: not in enabled drivers build config 00:46:02.502 crypto/cnxk: not in enabled drivers build config 00:46:02.502 crypto/dpaa_sec: not in enabled drivers build config 00:46:02.502 crypto/dpaa2_sec: not in enabled drivers build config 00:46:02.502 crypto/ipsec_mb: not in enabled drivers build config 00:46:02.502 crypto/mlx5: not in enabled drivers build config 00:46:02.502 crypto/mvsam: not in enabled drivers build config 00:46:02.502 crypto/nitrox: not in enabled drivers build config 00:46:02.502 crypto/null: not in enabled drivers build config 00:46:02.502 crypto/octeontx: not in enabled drivers build config 00:46:02.502 crypto/openssl: not in enabled drivers build config 00:46:02.502 crypto/scheduler: not in enabled drivers build config 00:46:02.502 crypto/uadk: not in enabled drivers build config 00:46:02.502 crypto/virtio: not in enabled drivers build config 00:46:02.502 compress/isal: not in enabled drivers build config 00:46:02.502 compress/mlx5: not in enabled drivers build config 00:46:02.502 compress/nitrox: not in enabled drivers build config 00:46:02.502 compress/octeontx: not in enabled drivers build config 00:46:02.502 compress/zlib: not in enabled drivers build config 00:46:02.502 regex/*: missing internal dependency, "regexdev" 00:46:02.502 ml/*: missing internal dependency, "mldev" 00:46:02.502 vdpa/ifc: not in enabled drivers build config 00:46:02.502 vdpa/mlx5: not in enabled drivers build config 00:46:02.502 vdpa/nfp: not in enabled drivers build config 00:46:02.502 vdpa/sfc: not in enabled drivers build config 00:46:02.502 event/*: missing internal dependency, "eventdev" 00:46:02.502 baseband/*: missing internal dependency, "bbdev" 00:46:02.502 gpu/*: missing internal dependency, "gpudev" 00:46:02.502 00:46:02.502 00:46:03.069 Cleaning... 0 files. 
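
The reconfigure output above ends with Meson's summary of what this trimmed-down DPDK tree actually builds: 22 libraries (log through vhost), the pci and vdev buses plus the ring mempool driver, and no apps, with everything else explicitly disabled or skipped for missing dependencies. The meson invocation itself is not shown in this part of the log, so the following reconstruction is only a sketch: every -D value is copied from the "User defined options" block echoed just below, and the long disable_apps/disable_libs lists (printed in full there) are elided here rather than retyped.

    # Hypothetical reconstruction of the configure step behind this log.
    # Option values mirror the "User defined options" block echoed below.
    meson setup /home/vagrant/spdk_repo/spdk/dpdk/build-tmp /home/vagrant/spdk_repo/spdk/dpdk \
        --prefix=/home/vagrant/spdk_repo/spdk/dpdk/build \
        -Ddefault_library=shared -Dlibdir=lib \
        -Dc_args='-Wno-stringop-overflow -fcommon -fPIC -Wno-error' \
        -Dcpu_instruction_set=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring \
        -Denable_docs=false -Denable_kmods=false -Dtests=false
        # plus -Ddisable_apps=... and -Ddisable_libs=... exactly as listed below
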
00:46:03.069 Build targets in project: 85 00:46:03.069 00:46:03.069 DPDK 24.03.0 00:46:03.069 00:46:03.069 User defined options 00:46:03.069 default_library : shared 00:46:03.069 libdir : lib 00:46:03.069 prefix : /home/vagrant/spdk_repo/spdk/dpdk/build 00:46:03.069 c_args : -Wno-stringop-overflow -fcommon -fPIC -Wno-error 00:46:03.069 c_link_args : 00:46:03.069 cpu_instruction_set: native 00:46:03.069 disable_apps : dumpcap,graph,pdump,proc-info,test-acl,test-bbdev,test-cmdline,test-compress-perf,test-crypto-perf,test-dma-perf,test-eventdev,test-fib,test-flow-perf,test-gpudev,test-mldev,test-pipeline,test-pmd,test-regex,test-sad,test-security-perf,test 00:46:03.069 disable_libs : acl,argparse,bbdev,bitratestats,bpf,cfgfile,dispatcher,distributor,efd,eventdev,fib,gpudev,graph,gro,gso,ip_frag,ipsec,jobstats,latencystats,lpm,member,metrics,mldev,node,pcapng,pdcp,pdump,pipeline,port,rawdev,regexdev,rib,sched,stack,table 00:46:03.069 enable_docs : false 00:46:03.069 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:46:03.069 enable_kmods : false 00:46:03.069 tests : false 00:46:03.069 00:46:03.069 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:46:03.636 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_log.a 00:46:03.636 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_log.pc 00:46:03.636 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ut_mock.a 00:46:03.636 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_log.so 00:46:03.636 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ut_mock.pc 00:46:03.636 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ut_mock.so 00:46:03.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_util.a 00:46:03.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ioat.a 00:46:03.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_util.pc 00:46:03.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_dma.a 00:46:03.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ioat.pc 00:46:03.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_dma.pc 00:46:03.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace_parser.a 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_util.so 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ioat.so 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_trace_parser.pc 00:46:04.153 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_dma.so 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace_parser.so 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vfio_user.a 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vfio_user.pc 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vfio_user.so 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vmd.a 00:46:04.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vmd.pc 00:46:04.411 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_json.a 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vmd.so 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_json.pc 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_conf.a 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_json.so 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_conf.pc 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_dpdklibs.pc 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_conf.so 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk.a 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk.pc 00:46:04.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk.so 00:46:04.670 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_jsonrpc.a 00:46:04.670 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_jsonrpc.pc 00:46:04.670 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_jsonrpc.so 00:46:04.927 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_rpc.a 00:46:04.927 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_rpc.pc 00:46:04.927 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_rpc.so 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring.a 
00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_notify.a 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring.pc 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace.a 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_notify.pc 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_trace.pc 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring.so 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_notify.so 00:46:05.186 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace.so 00:46:05.443 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock.a 00:46:05.443 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_thread.a 00:46:05.443 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock.pc 00:46:05.443 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_thread.pc 00:46:05.443 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock.so 00:46:05.443 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_thread.so 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel.a 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_init.a 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel.pc 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob.a 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_virtio.a 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_init.pc 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvme.a 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blob.pc 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_virtio.pc 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel.so 00:46:05.702 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nvme.pc 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_init.so 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob.so 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_virtio.so 00:46:05.702 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvme.so 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs.a 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev.a 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs.pc 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event.a 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev.pc 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_lvol.a 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs.so 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event.pc 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_lvol.pc 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev.so 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event.so 00:46:05.961 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_lvol.so 00:46:06.219 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scsi.a 00:46:06.219 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvmf.a 00:46:06.219 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nbd.a 00:46:06.219 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scsi.pc 00:46:06.219 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nvmf.pc 00:46:06.219 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nbd.pc 00:46:06.478 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ftl.a 00:46:06.478 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scsi.so 00:46:06.478 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvmf.so 00:46:06.478 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nbd.so 00:46:06.478 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ftl.pc 00:46:06.478 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ftl.so 00:46:06.736 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_iscsi.a 00:46:06.736 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_iscsi.pc 00:46:06.736 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_iscsi.so 00:46:06.736 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vhost.a 00:46:06.736 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vhost.pc 00:46:06.736 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vhost.so 00:46:06.995 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk_rpc.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_error.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_file.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_gscheduler.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_linux.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_ioat.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_error.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_file.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob_bdev.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_gscheduler.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_linux.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.a 00:46:07.254 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_ioat.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dpdk_governor.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock_posix.a 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blob_bdev.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_error.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_file.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_dynamic.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_gscheduler.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_linux.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_posix.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_ioat.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_dpdk_governor.pc 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob_bdev.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dpdk_governor.so 00:46:07.254 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock_posix.so 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_gpt.a 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_error.a 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_malloc.a 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_delay.a 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_gpt.pc 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_malloc.pc 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_error.pc 00:46:07.513 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_delay.pc 00:46:07.513 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_lvol.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_gpt.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_error.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_raid.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_null.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_lvol.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_malloc.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_delay.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_null.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_raid.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_nvme.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_passthru.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_nvme.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_null.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_raid.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs_bdev.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_passthru.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_lvol.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_nvme.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_passthru.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_aio.a 00:46:07.773 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_split.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_aio.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_split.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_ftl.a 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_split.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_aio.so 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_zone_block.pc 00:46:07.773 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_ftl.pc 00:46:08.032 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_virtio.a 00:46:08.032 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.so 00:46:08.032 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_ftl.so 00:46:08.032 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_virtio.pc 00:46:08.032 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_virtio.so 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iobuf.a 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scheduler.a 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vmd.a 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.a 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iobuf.pc 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scheduler.pc 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_blk.pc 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vmd.pc 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_keyring.a 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_sock.a 
00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iobuf.so 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_keyring.pc 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.so 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scheduler.so 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_sock.pc 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vmd.so 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_keyring.so 00:46:08.599 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_sock.so 00:46:08.856 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_accel.a 00:46:08.856 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_accel.pc 00:46:08.856 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_accel.so 00:46:09.114 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_bdev.a 00:46:09.114 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_bdev.pc 00:46:09.114 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_bdev.so 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scsi.a 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nvmf.a 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scsi.pc 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nbd.a 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nvmf.pc 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nbd.pc 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scsi.so 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nvmf.so 00:46:09.372 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nbd.so 00:46:09.637 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iscsi.a 00:46:09.637 
INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.a 00:46:09.637 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iscsi.pc 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_scsi.pc 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iscsi.so 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.so 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_modules.pc 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_modules.pc 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_modules.pc 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_syslibs.pc 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_modules.pc 00:46:09.901 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_modules.pc 00:46:10.160 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk.so 00:46:10.160 make[1]: Nothing to be done for 'install'. 
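The install phase above stages every SPDK shared library together with a pkg-config module (the spdk_*.pc files) describing its compile and link flags, plus the aggregate modules spdk_bdev_modules.pc, spdk_accel_modules.pc and spdk_syslibs.pc and the single combined libspdk.so. A minimal sketch of how a client build might consume these files once the package is installed, assuming pkg-config is available and the prefix is /usr/local as in the paths above (module names are taken from the install log; the exact flags returned depend on the build configuration):

```bash
#!/usr/bin/env bash
# The .pc files land in /usr/local/lib/pkgconfig per the install paths above.
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:${PKG_CONFIG_PATH:-}

# Compile flags for an application using the bdev event framework.
pkg-config --cflags spdk_event_bdev

# Link line: the requested module, the bdev module set, and SPDK's system deps.
pkg-config --libs --static spdk_event_bdev spdk_bdev_modules spdk_syslibs
```

Whether --static is needed depends on how the package was built; the shared build logged here also installs the combined libspdk.so seen at the end of the list above.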
00:46:10.419 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_trace_record 00:46:10.419 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_lspci 00:46:10.419 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_perf 00:46:10.419 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_trace 00:46:10.419 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/nvmf_tgt 00:46:10.419 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_tgt 00:46:10.419 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/iscsi_tgt 00:46:10.677 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_identify 00:46:10.677 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_discover 00:46:10.677 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_top 00:46:10.677 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_dd 00:46:10.677 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/vhost 00:46:10.953 Installed to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local 00:46:10.953 Processing files: spdk-v24.09-1.x86_64 00:46:11.213 Provides: spdk = v24.09-1 spdk(x86-64) = v24.09-1 00:46:11.213 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:46:11.213 Requires: /usr/bin/env libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.3.3)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.7)(64bit) libcrypto.so.3()(64bit) libfuse3.so.3()(64bit) libgcc_s.so.1()(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libm.so.6()(64bit) libmenu.so.6()(64bit) libncurses.so.6()(64bit) libpanel.so.6()(64bit) librte_bus_pci.so.24()(64bit) librte_cryptodev.so.24()(64bit) librte_dmadev.so.24()(64bit) librte_eal.so.24()(64bit) librte_ethdev.so.24()(64bit) librte_hash.so.24()(64bit) librte_kvargs.so.24()(64bit) librte_log.so.24()(64bit) librte_mbuf.so.24()(64bit) librte_mempool.so.24()(64bit) librte_mempool_ring.so.24()(64bit) librte_net.so.24()(64bit) librte_pci.so.24()(64bit) librte_power.so.24()(64bit) librte_rcu.so.24()(64bit) librte_ring.so.24()(64bit) librte_telemetry.so.24()(64bit) librte_vhost.so.24()(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) 
libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libstdc++.so.6()(64bit) libtinfo.so.6()(64bit) libuuid.so.1()(64bit) rtld(GNU_HASH) 00:46:11.213 Processing files: spdk-devel-v24.09-1.x86_64 00:46:14.497 Provides: libisal_crypto.so.2()(64bit) librte_bus_pci.so.24()(64bit) librte_bus_pci.so.24(DPDK_24)(64bit) librte_bus_pci.so.24(EXPERIMENTAL)(64bit) librte_bus_pci.so.24(INTERNAL)(64bit) librte_bus_vdev.so.24()(64bit) librte_bus_vdev.so.24(DPDK_24)(64bit) librte_bus_vdev.so.24(INTERNAL)(64bit) librte_cmdline.so.24()(64bit) librte_cmdline.so.24(DPDK_24)(64bit) librte_compressdev.so.24()(64bit) librte_compressdev.so.24(DPDK_24)(64bit) librte_cryptodev.so.24()(64bit) librte_cryptodev.so.24(DPDK_24)(64bit) librte_cryptodev.so.24(EXPERIMENTAL)(64bit) librte_cryptodev.so.24(INTERNAL)(64bit) librte_dmadev.so.24()(64bit) librte_dmadev.so.24(DPDK_24)(64bit) librte_dmadev.so.24(EXPERIMENTAL)(64bit) librte_dmadev.so.24(INTERNAL)(64bit) librte_eal.so.24()(64bit) librte_eal.so.24(DPDK_24)(64bit) librte_eal.so.24(EXPERIMENTAL)(64bit) librte_eal.so.24(INTERNAL)(64bit) librte_ethdev.so.24()(64bit) librte_ethdev.so.24(DPDK_24)(64bit) librte_ethdev.so.24(EXPERIMENTAL)(64bit) librte_ethdev.so.24(INTERNAL)(64bit) librte_hash.so.24()(64bit) librte_hash.so.24(DPDK_24)(64bit) librte_hash.so.24(INTERNAL)(64bit) librte_kvargs.so.24()(64bit) librte_kvargs.so.24(DPDK_24)(64bit) librte_log.so.24()(64bit) librte_log.so.24(DPDK_24)(64bit) librte_log.so.24(INTERNAL)(64bit) librte_mbuf.so.24()(64bit) librte_mbuf.so.24(DPDK_24)(64bit) librte_mempool.so.24()(64bit) librte_mempool.so.24(DPDK_24)(64bit) librte_mempool.so.24(EXPERIMENTAL)(64bit) 
librte_mempool.so.24(INTERNAL)(64bit) librte_mempool_ring.so.24()(64bit) librte_mempool_ring.so.24(DPDK_24)(64bit) librte_meter.so.24()(64bit) librte_meter.so.24(DPDK_24)(64bit) librte_net.so.24()(64bit) librte_net.so.24(DPDK_24)(64bit) librte_pci.so.24()(64bit) librte_pci.so.24(DPDK_24)(64bit) librte_power.so.24()(64bit) librte_power.so.24(DPDK_24)(64bit) librte_power.so.24(EXPERIMENTAL)(64bit) librte_rcu.so.24()(64bit) librte_rcu.so.24(DPDK_24)(64bit) librte_reorder.so.24()(64bit) librte_reorder.so.24(DPDK_24)(64bit) librte_reorder.so.24(EXPERIMENTAL)(64bit) librte_ring.so.24()(64bit) librte_ring.so.24(DPDK_24)(64bit) librte_security.so.24()(64bit) librte_security.so.24(DPDK_24)(64bit) librte_security.so.24(EXPERIMENTAL)(64bit) librte_security.so.24(INTERNAL)(64bit) librte_telemetry.so.24()(64bit) librte_telemetry.so.24(DPDK_24)(64bit) librte_telemetry.so.24(EXPERIMENTAL)(64bit) librte_telemetry.so.24(INTERNAL)(64bit) librte_timer.so.24()(64bit) librte_timer.so.24(DPDK_24)(64bit) librte_vhost.so.24()(64bit) librte_vhost.so.24(DPDK_24)(64bit) librte_vhost.so.24(EXPERIMENTAL)(64bit) librte_vhost.so.24(INTERNAL)(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) spdk-devel = v24.09-1 
spdk-devel(x86-64) = v24.09-1 00:46:14.497 Requires(interp): /bin/sh 00:46:14.497 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:46:14.497 Requires(post): /bin/sh 00:46:14.497 Requires: ld-linux-x86-64.so.2()(64bit) ld-linux-x86-64.so.2(GLIBC_2.3)(64bit) libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.10)(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.16)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.22)(64bit) libc.so.6(GLIBC_2.27)(64bit) libc.so.6(GLIBC_2.28)(64bit) libc.so.6(GLIBC_2.3)(64bit) libc.so.6(GLIBC_2.3.2)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.32)(64bit) libc.so.6(GLIBC_2.33)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.6)(64bit) libc.so.6(GLIBC_2.7)(64bit) libc.so.6(GLIBC_2.8)(64bit) libc.so.6(GLIBC_2.9)(64bit) libcrypto.so.3()(64bit) libcrypto.so.3(OPENSSL_3.0.0)(64bit) libfuse3.so.3()(64bit) libfuse3.so.3(FUSE_3.0)(64bit) libfuse3.so.3(FUSE_3.7)(64bit) libgcc_s.so.1()(64bit) libgcc_s.so.1(GCC_3.0)(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libkeyutils.so.1(KEYUTILS_0.3)(64bit) libm.so.6()(64bit) libm.so.6(GLIBC_2.2.5)(64bit) libm.so.6(GLIBC_2.29)(64bit) libnuma.so.1()(64bit) libnuma.so.1(libnuma_1.1)(64bit) libnuma.so.1(libnuma_1.2)(64bit) librte_bus_pci.so.24()(64bit) librte_bus_vdev.so.24()(64bit) librte_cmdline.so.24()(64bit) librte_compressdev.so.24()(64bit) librte_cryptodev.so.24()(64bit) librte_cryptodev.so.24(DPDK_24)(64bit) librte_dmadev.so.24()(64bit) librte_dmadev.so.24(DPDK_24)(64bit) librte_dmadev.so.24(INTERNAL)(64bit) librte_eal.so.24()(64bit) librte_eal.so.24(DPDK_24)(64bit) librte_eal.so.24(EXPERIMENTAL)(64bit) librte_eal.so.24(INTERNAL)(64bit) librte_ethdev.so.24()(64bit) librte_ethdev.so.24(DPDK_24)(64bit) librte_ethdev.so.24(EXPERIMENTAL)(64bit) librte_hash.so.24()(64bit) librte_hash.so.24(DPDK_24)(64bit) librte_kvargs.so.24()(64bit) librte_kvargs.so.24(DPDK_24)(64bit) librte_log.so.24()(64bit) librte_log.so.24(DPDK_24)(64bit) librte_log.so.24(INTERNAL)(64bit) librte_mbuf.so.24()(64bit) librte_mbuf.so.24(DPDK_24)(64bit) librte_mempool.so.24()(64bit) librte_mempool.so.24(DPDK_24)(64bit) librte_mempool_ring.so.24()(64bit) librte_meter.so.24()(64bit) librte_net.so.24()(64bit) librte_net.so.24(DPDK_24)(64bit) librte_pci.so.24()(64bit) librte_pci.so.24(DPDK_24)(64bit) librte_power.so.24()(64bit) librte_rcu.so.24()(64bit) librte_rcu.so.24(DPDK_24)(64bit) librte_reorder.so.24()(64bit) librte_ring.so.24()(64bit) librte_ring.so.24(DPDK_24)(64bit) librte_security.so.24()(64bit) librte_telemetry.so.24()(64bit) librte_telemetry.so.24(DPDK_24)(64bit) librte_telemetry.so.24(EXPERIMENTAL)(64bit) librte_telemetry.so.24(INTERNAL)(64bit) librte_timer.so.24()(64bit) librte_vhost.so.24()(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) 
libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libssl.so.3(OPENSSL_3.0.0)(64bit) libstdc++.so.6()(64bit) libstdc++.so.6(CXXABI_1.3)(64bit) libstdc++.so.6(GLIBCXX_3.4)(64bit) libuuid.so.1()(64bit) libuuid.so.1(UUID_1.0)(64bit) libuuid.so.1(UUID_2.31)(64bit) rtld(GNU_HASH) 00:46:14.497 Processing files: spdk-scripts-v24.09-1.x86_64 00:46:14.497 warning: absolute symlink: /etc/bash_completion.d/spdk -> /usr/libexec/spdk/scripts/bash-completion/spdk 00:46:14.497 warning: absolute symlink: /usr/libexec/spdk/include -> /usr/local/include 00:46:15.431 Provides: spdk-scripts = v24.09-1 spdk-scripts(x86-64) = v24.09-1 00:46:15.431 Requires(interp): /bin/sh 00:46:15.431 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:46:15.431 Requires(post): /bin/sh 00:46:15.431 Requires: /bin/bash /usr/bin/env 00:46:15.431 Checking for unpackaged file(s): /usr/lib/rpm/check-files /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 00:46:15.431 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/srcrpm/spdk-v24.09-1.src.rpm 00:46:15.997 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-scripts-v24.09-1.x86_64.rpm 00:46:17.372 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-v24.09-1.x86_64.rpm 00:46:27.341 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-devel-v24.09-1.x86_64.rpm 00:46:27.341 Executing(%clean): /bin/sh -e /var/tmp/rpm-tmp.bJlPzw 00:46:27.341 + umask 022 00:46:27.341 + cd /home/vagrant/spdk_repo/spdk 00:46:27.341 + /usr/bin/rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 00:46:27.341 + RPM_EC=0 00:46:27.341 ++ jobs -p 
00:46:27.341 + exit 0 00:46:27.341 Executing(--clean): /bin/sh -e /var/tmp/rpm-tmp.2VGItq 00:46:27.341 + umask 022 00:46:27.341 + cd /home/vagrant/spdk_repo/spdk 00:46:27.341 + RPM_EC=0 00:46:27.341 ++ jobs -p 00:46:27.341 + exit 0 00:46:27.341 12:48:50 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@120 -- $ [[ -n '' ]] 00:46:27.341 12:48:50 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@123 -- $ install_uninstall_rpms 00:46:27.341 12:48:50 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@98 -- $ local rpms 00:46:27.341 12:48:50 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@100 -- $ rpms=("${1:-$builddir/rpm/}/$arch/"*.rpm) 00:46:27.341 12:48:50 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@103 -- $ make -C /home/vagrant/spdk_repo/spdk clean -j10 00:46:27.341 make: Entering directory '/home/vagrant/spdk_repo/spdk' 00:46:27.341 make[1]: Nothing to be done for 'clean'. 00:46:33.902 make[1]: Nothing to be done for 'clean'. 00:46:34.160 make: Leaving directory '/home/vagrant/spdk_repo/spdk' 00:46:34.161 12:48:57 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@105 -- $ sudo rpm -i /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-devel-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-scripts-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-v24.09-1.x86_64.rpm 00:46:34.161 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:46:36.061 12:48:59 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@108 -- $ LIST_LIBS=yes 00:46:36.061 12:48:59 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@108 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm-deps.sh spdk_tgt 00:46:36.320 /usr/local/bin/spdk_tgt 00:46:38.853 /usr/lib64/libaio.so.1:libaio-0.3.111-13.el9.x86_64 00:46:38.853 /usr/lib64/libc.so.6:glibc-2.34-83.el9.12.x86_64 00:46:38.853 /usr/lib64/libcrypto.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:46:38.853 /usr/lib64/libfuse3.so.3:fuse3-libs-3.10.2-6.el9.x86_64 00:46:38.853 /usr/lib64/libgcc_s.so.1:libgcc-11.4.1-2.1.el9.x86_64 00:46:38.853 /usr/lib64/libkeyutils.so.1:keyutils-libs-1.6.3-1.el9.x86_64 00:46:38.853 /usr/lib64/libm.so.6:glibc-2.34-83.el9.12.x86_64 00:46:38.853 /usr/lib64/libnuma.so.1:numactl-libs-2.0.16-1.el9.x86_64 00:46:38.853 /usr/lib64/libssl.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:46:38.853 /usr/lib64/libuuid.so.1:libuuid-2.37.4-15.el9.x86_64 00:46:38.853 /usr/lib64/libz.so.1:zlib-1.2.11-40.el9.x86_64 00:46:38.853 /usr/local/lib/libisal_crypto.so.2:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_bus_pci.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_cryptodev.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_dmadev.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_eal.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_ethdev.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_hash.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_kvargs.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_log.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_mbuf.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_mempool.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_mempool_ring.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_meter.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 
/usr/local/lib/librte_net.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_pci.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_power.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_rcu.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_ring.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_telemetry.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_timer.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/librte_vhost.so.24:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_accel.so.15.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_accel_error.so.2.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_accel_ioat.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev.so.15.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_aio.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_delay.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_error.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_ftl.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_gpt.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_lvol.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_malloc.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_null.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_nvme.so.7.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_passthru.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_raid.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_split.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_virtio.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_bdev_zone_block.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_blob.so.11.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_blob_bdev.so.11.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_blobfs.so.10.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_blobfs_bdev.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_conf.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_dma.so.4.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_env_dpdk.so.14.1:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_env_dpdk_rpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event.so.13.1:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_accel.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_bdev.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_iobuf.so.3.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_iscsi.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_keyring.so.1.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_nbd.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_nvmf.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_scheduler.so.4.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_scsi.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_sock.so.5.0:spdk-devel-v24.09-1.x86_64 
00:46:38.853 /usr/local/lib/libspdk_event_vhost_blk.so.3.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_vhost_scsi.so.3.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_event_vmd.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_ftl.so.9.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_init.so.5.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_ioat.so.7.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_iscsi.so.8.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_json.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_jsonrpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_keyring.so.1.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_keyring_file.so.1.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_keyring_linux.so.1.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_log.so.7.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_lvol.so.10.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_nbd.so.7.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_notify.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_nvme.so.13.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_nvmf.so.18.1:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_rpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_scheduler_dpdk_governor.so.4.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_scheduler_dynamic.so.4.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_scheduler_gscheduler.so.4.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_scsi.so.9.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_sock.so.9.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_sock_posix.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_thread.so.10.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_trace.so.10.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_util.so.9.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_vfio_user.so.5.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_vhost.so.8.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_virtio.so.7.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 /usr/local/lib/libspdk_vmd.so.6.0:spdk-devel-v24.09-1.x86_64 00:46:38.853 12:49:01 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@109 -- $ rm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-devel-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-scripts-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-v24.09-1.x86_64.rpm 00:46:38.853 12:49:01 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]##*/}") 00:46:38.853 12:49:01 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]%.rpm}") 00:46:38.853 12:49:01 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@111 -- $ sudo rpm -e spdk-devel-v24.09-1.x86_64 spdk-scripts-v24.09-1.x86_64 spdk-v24.09-1.x86_64 00:46:38.854 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:46:38.854 12:49:02 packaging.rpm_packaging.build_shared_rpm -- rpm/rpm.sh@124 -- $ [[ -n '' ]] 00:46:38.854 00:46:38.854 real 2m50.559s 00:46:38.854 user 
9m3.258s 00:46:38.854 sys 4m26.707s 00:46:38.854 12:49:02 packaging.rpm_packaging.build_shared_rpm -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:46:38.854 12:49:02 packaging.rpm_packaging.build_shared_rpm -- common/autotest_common.sh@10 -- $ set +x 00:46:38.854 ************************************ 00:46:38.854 END TEST build_shared_rpm 00:46:38.854 ************************************ 00:46:38.854 12:49:02 packaging.rpm_packaging -- rpm/rpm.sh@196 -- $ run_test build_rpm_from_gen_spec build_rpm_from_gen_spec 00:46:38.854 12:49:02 packaging.rpm_packaging -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:46:38.854 12:49:02 packaging.rpm_packaging -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:46:38.854 12:49:02 packaging.rpm_packaging -- common/autotest_common.sh@10 -- $ set +x 00:46:38.854 ************************************ 00:46:38.854 START TEST build_rpm_from_gen_spec 00:46:38.854 ************************************ 00:46:38.854 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- common/autotest_common.sh@1124 -- $ build_rpm_from_gen_spec 00:46:38.854 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@156 -- $ local version=test_gen_spec 00:46:38.854 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@157 -- $ local sourcedir rpmdir rpmbuilddir 00:46:38.854 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@159 -- $ GEN_SPEC=yes 00:46:38.854 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@159 -- $ USE_DEFAULT_DIRS=yes 00:46:38.854 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@159 -- $ SPDK_VERSION=test_gen_spec 00:46:38.854 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@159 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm.sh --with-shared 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@165 -- $ rpm --eval '%{_sourcedir}' 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@165 -- $ sourcedir=/home/vagrant/rpmbuild/SOURCES 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@166 -- $ rpm --eval '%{_rpmdir}' 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@166 -- $ rpmdir=/home/vagrant/rpmbuild/RPMS 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@167 -- $ rpm --eval '%{_builddir}' 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@167 -- $ rpmbuilddir=/home/vagrant/rpmbuild/BUILD 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@169 -- $ mkdir -p /home/vagrant/rpmbuild/SOURCES /home/vagrant/rpmbuild/RPMS /home/vagrant/rpmbuild/BUILD 00:46:39.111 12:49:02 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@173 -- $ cp -r /home/vagrant/spdk_repo/spdk /tmp/spdk-test_gen_spec 00:46:40.057 12:49:03 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@174 -- $ tar -czf /home/vagrant/rpmbuild/SOURCES/spdk-test_gen_spec.tar.gz -C /tmp spdk-test_gen_spec 00:46:58.243 12:49:18 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@177 -- $ python3 -c 'import sys; print('\''%s'\'' % '\'':'\''.join(sys.path)[1:])' 00:46:58.243 12:49:18 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@177 -- $ 
PYTHONPATH=/home/vagrant/spdk_repo/spdk:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/usr/lib64/python39.zip:/usr/lib64/python3.9:/usr/lib64/python3.9/lib-dynload:/usr/local/lib64/python3.9/site-packages:/usr/local/lib/python3.9/site-packages:/usr/lib64/python3.9/site-packages:/usr/lib/python3.9/site-packages 00:46:58.243 12:49:18 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@177 -- $ rpmbuild -ba /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/gen-spdk.spec 00:46:58.243 setting SOURCE_DATE_EPOCH=1613433600 00:46:58.243 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.jq4Dxe 00:46:58.243 + umask 022 00:46:58.243 + cd /home/vagrant/rpmbuild/BUILD 00:46:58.243 + make clean -j10 00:46:58.243 + : 00:46:58.243 + cd /home/vagrant/rpmbuild/BUILD 00:46:58.243 + rm -rf spdk-test_gen_spec 00:46:58.243 + /usr/bin/gzip -dc /home/vagrant/rpmbuild/SOURCES/spdk-test_gen_spec.tar.gz 00:46:58.243 + /usr/bin/tar -xvvof - 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/ 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_discover/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 499 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_discover/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 8298 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_discover/discovery_aer.c 00:46:58.243 -rw-r--r-- vagrant/vagrant 782 2024-06-07 12:49 spdk-test_gen_spec/app/Makefile 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_identify/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_identify/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 494 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_identify/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 100296 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_identify/identify.c 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_perf/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 15 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_perf/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 554 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_perf/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 190 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_perf/README.md 00:46:58.243 -rw-r--r-- vagrant/vagrant 93460 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_nvme_perf/perf.c 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_tgt/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 9 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_tgt/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 812 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_tgt/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 1839 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_tgt/spdk_tgt.c 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_top/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 9 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_top/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_top/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 2233 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_top/README 00:46:58.243 -rw-r--r-- vagrant/vagrant 106054 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_top/spdk_top.c 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/fio/ 00:46:58.243 
-rw-r--r-- vagrant/vagrant 347 2024-06-07 12:49 spdk-test_gen_spec/app/fio/Makefile 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 438 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 8241 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/README.md 00:46:58.243 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/bdev.json 00:46:58.243 -rw-r--r-- vagrant/vagrant 522 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/bdev_zoned.json 00:46:58.243 -rw-r--r-- vagrant/vagrant 211 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/example_config.fio 00:46:58.243 -rw-r--r-- vagrant/vagrant 37223 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/fio_plugin.c 00:46:58.243 -rw-r--r-- vagrant/vagrant 308 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/full_bench.fio 00:46:58.243 -rw-r--r-- vagrant/vagrant 243 2024-06-07 12:49 spdk-test_gen_spec/app/fio/bdev/zbd_example.fio 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/fio/nvme/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 485 2024-06-07 12:49 spdk-test_gen_spec/app/fio/nvme/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 7407 2024-06-07 12:49 spdk-test_gen_spec/app/fio/nvme/README.md 00:46:58.243 -rw-r--r-- vagrant/vagrant 143 2024-06-07 12:49 spdk-test_gen_spec/app/fio/nvme/example_config.fio 00:46:58.243 -rw-r--r-- vagrant/vagrant 54924 2024-06-07 12:49 spdk-test_gen_spec/app/fio/nvme/fio_plugin.c 00:46:58.243 -rw-r--r-- vagrant/vagrant 564 2024-06-07 12:49 spdk-test_gen_spec/app/fio/nvme/full_bench.fio 00:46:58.243 -rw-r--r-- vagrant/vagrant 183 2024-06-07 12:49 spdk-test_gen_spec/app/fio/nvme/mock_sgl_config.fio 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/trace/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/app/trace/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 446 2024-06-07 12:49 spdk-test_gen_spec/app/trace/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 13687 2024-06-07 12:49 spdk-test_gen_spec/app/trace/trace.cpp 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/iscsi_tgt/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 10 2024-06-07 12:49 spdk-test_gen_spec/app/iscsi_tgt/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 715 2024-06-07 12:49 spdk-test_gen_spec/app/iscsi_tgt/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 1361 2024-06-07 12:49 spdk-test_gen_spec/app/iscsi_tgt/iscsi_tgt.c 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/trace_record/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/app/trace_record/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 377 2024-06-07 12:49 spdk-test_gen_spec/app/trace_record/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 18330 2024-06-07 12:49 spdk-test_gen_spec/app/trace_record/trace_record.c 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/nvmf_tgt/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 9 2024-06-07 12:49 spdk-test_gen_spec/app/nvmf_tgt/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 578 2024-06-07 12:49 spdk-test_gen_spec/app/nvmf_tgt/Makefile 00:46:58.243 -rw-r--r-- vagrant/vagrant 875 2024-06-07 12:49 spdk-test_gen_spec/app/nvmf_tgt/nvmf_main.c 00:46:58.243 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/vhost/ 00:46:58.243 -rw-r--r-- vagrant/vagrant 6 2024-06-07 12:49 
spdk-test_gen_spec/app/vhost/.gitignore 00:46:58.243 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/app/vhost/Makefile 00:46:58.244 -rw-r--r-- vagrant/vagrant 1458 2024-06-07 12:49 spdk-test_gen_spec/app/vhost/vhost.c 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_dd/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_dd/.gitignore 00:46:58.244 -rw-r--r-- vagrant/vagrant 434 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_dd/Makefile 00:46:58.244 -rw-r--r-- vagrant/vagrant 35371 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_dd/spdk_dd.c 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_lspci/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 433 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_lspci/Makefile 00:46:58.244 -rw-r--r-- vagrant/vagrant 1715 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_lspci/spdk_lspci.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/app/spdk_lspci/.gitignore 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 338267 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/Makefile.in 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 3489 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/Makefile.am 00:46:58.244 -rw-r--r-- vagrant/vagrant 5642 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 3105 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_avx512.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 2017 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_base_aliases.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 15676 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_block_avx.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 18351 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_block_avx2.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 22552 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_block_avx512.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 6930 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_block_base.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 15469 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_block_sse.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 4815 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_finalize_base.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 12275 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_internal.h 00:46:58.244 -rw-r--r-- vagrant/vagrant 3232 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_multibinary.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 5597 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_perf.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 15601 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_ref.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 5925 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_test.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 4311 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_update_base.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 6812 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/mh_sha256_update_test.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 6729 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/mh_sha256/sha256_for_mh_sha256.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.dirstamp 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/ 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/.deps/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/.deps/mh_sha256_aarch64_dispatcher.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/.deps/mh_sha256_block_ce.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/.deps/mh_sha256_ce.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/.deps/mh_sha256_multibinary.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 2155 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/mh_sha256_aarch64_dispatcher.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 28483 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/mh_sha256_block_ce.S 00:46:58.244 -rw-r--r-- vagrant/vagrant 2752 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/mh_sha256_ce.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 1797 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/aarch64/mh_sha256_multibinary.S 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_base_aliases.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 1823 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_perf.Po 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_test.Po 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_update_test.Po 00:46:58.244 -rw-r--r-- vagrant/vagrant 1709 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_finalize_base.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 1702 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_update_base.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 1700 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_block_base.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 1700 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/sha256_for_mh_sha256.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 1840 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/mh_sha256_avx512.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha256/.deps/.dirstamp 00:46:58.244 -rw-r--r-- vagrant/vagrant 37 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/.git 00:46:58.244 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/.gitignore 00:46:58.244 -rw-r--r-- vagrant/vagrant 1386 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/CONTRIBUTING.md 00:46:58.244 -rw-r--r-- vagrant/vagrant 1288 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/Doxyfile 00:46:58.244 -rw-r--r-- vagrant/vagrant 1559 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/LICENSE 00:46:58.244 -rw-r--r-- vagrant/vagrant 4256 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/Makefile.am 00:46:58.244 -rw-r--r-- vagrant/vagrant 15377 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/Makefile.nmake 00:46:58.244 -rw-r--r-- vagrant/vagrant 2356 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/Makefile.unx 00:46:58.244 -rw-r--r-- vagrant/vagrant 2316 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/README.md 00:46:58.244 -rw-r--r-- vagrant/vagrant 6526 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/Release_notes.txt 00:46:58.244 -rw-r--r-- vagrant/vagrant 430 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/SECURITY.md 00:46:58.244 -rwxr-xr-x vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/autogen.sh 00:46:58.244 -rw-r--r-- vagrant/vagrant 11007 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/configure.ac 00:46:58.244 -rw-r--r-- vagrant/vagrant 3265 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/isa-l_crypto.def 00:46:58.244 -rw-r--r-- vagrant/vagrant 241 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/libisal_crypto.pc.in 00:46:58.244 -rw-r--r-- vagrant/vagrant 11042 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/make.inc 00:46:58.244 -rw-r--r-- vagrant/vagrant 373164 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aclocal.m4 00:46:58.244 -rwxr-xr-x vagrant/vagrant 339191 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/libtool 00:46:58.244 -rwxr-xr-x vagrant/vagrant 474520 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/configure 00:46:58.244 -rw-r--r-- vagrant/vagrant 39313 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/config.log 00:46:58.244 -rwxr-xr-x vagrant/vagrant 56039 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/config.status 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 2815 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/Makefile.am 00:46:58.244 -rw-r--r-- vagrant/vagrant 5937 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/chunking_with_mb_hash.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 4985 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 2171 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2_base_aliases.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 3794 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2_multibinary.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 3685 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2_perf.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 7398 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2_table.h 00:46:58.244 -rw-r--r-- vagrant/vagrant 8129 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2_test.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 5047 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2_until_00.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 5079 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hash2_until_04.asm 00:46:58.244 -rw-r--r-- vagrant/vagrant 2288 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/rolling_hashx_base.c 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/chunking_with_mb_hash.Po 00:46:58.244 
-rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/rolling_hash2_base_aliases.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/rolling_hash2_perf.Po 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/rolling_hash2_test.Po 00:46:58.244 -rw-r--r-- vagrant/vagrant 1192 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/rolling_hashx_base.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/.dirstamp 00:46:58.244 -rw-r--r-- vagrant/vagrant 3585 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.deps/rolling_hash2.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/.dirstamp 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/ 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/.deps/ 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/.deps/rolling_hash2_aarch64_dispatcher.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/.deps/rolling_hash2_aarch64_multibinary.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/.deps/rolling_hash2_run_until_unroll.Plo 00:46:58.244 -rw-r--r-- vagrant/vagrant 1895 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/rolling_hash2_aarch64_dispatcher.c 00:46:58.244 -rw-r--r-- vagrant/vagrant 1771 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/rolling_hash2_aarch64_multibinary.S 00:46:58.244 -rw-r--r-- vagrant/vagrant 3271 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/rolling_hash/aarch64/rolling_hash2_run_until_unroll.S 00:46:58.244 -rw-r--r-- vagrant/vagrant 315303 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/Makefile 00:46:58.244 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/ 00:46:58.245 -rw-r--r-- vagrant/vagrant 4692 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/Makefile.am 00:46:58.245 -rw-r--r-- vagrant/vagrant 9049 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_avx.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 9057 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_avx2.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 9171 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_avx512.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 9686 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_avx512_ni.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 10851 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_base.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 2493 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_base_aliases.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 8681 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_sse.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 8810 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ctx_sse_ni.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 3175 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_job.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 4691 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_flush_test.c 
00:46:58.245 -rw-r--r-- vagrant/vagrant 3261 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_datastruct.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 7451 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_flush_avx.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 8359 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_flush_avx2.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 8572 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_flush_avx512.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 8778 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_flush_avx512_ni.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 7414 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_flush_sse.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 7595 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_flush_sse_ni.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 1963 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_init_avx2.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 1973 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_init_avx512.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 1959 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_init_sse.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 7205 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_submit_avx.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 7318 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_submit_avx2.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 8028 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_submit_avx512.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 7153 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_submit_sse.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 8429 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_mgr_submit_sse_ni.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 4764 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_rand_ssl_test.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 5749 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_rand_test.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 8746 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_rand_update_test.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 9413 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_test.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 4362 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_vs_ossl_perf.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 4500 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_vs_ossl_shortage_perf.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 20414 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_x16_avx512.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 10165 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_x4_avx.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 10028 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_x4_sse.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 13091 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_mb_x8_avx2.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 4399 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_multi_buffer_example.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 5535 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_multibinary.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 7879 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ni_x1.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 11273 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ni_x2.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 12731 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_opt_x1.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 6814 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/sha1_ref.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.dirstamp 00:46:58.245 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/ 00:46:58.245 -rw-r--r-- vagrant/vagrant 14289 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_sse.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 14289 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_avx.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 1420 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_mgr_init_avx2.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 14291 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_avx2.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 1418 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_mgr_init_sse.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_base_aliases.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 14291 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_base.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_flush_test.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_rand_ssl_test.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_rand_test.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_rand_update_test.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_test.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_vs_ossl_perf.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_vs_ossl_shortage_perf.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_multi_buffer_example.Po 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ref.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/.dirstamp 00:46:58.245 -rw-r--r-- vagrant/vagrant 1424 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_mb_mgr_init_avx512.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 14295 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_avx512.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 14295 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_sse_ni.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 14301 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/.deps/sha1_ctx_avx512_ni.Plo 00:46:58.245 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/ 00:46:58.245 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/ 
00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_aarch64_x1.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_ctx_asimd.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_ctx_ce.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_mb_aarch64_dispatcher.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_mb_asimd_x4.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_mb_mgr_asimd.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_mb_mgr_ce.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_mb_multibinary.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_mb_x1_ce.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/.deps/sha1_mb_x2_ce.Plo 00:46:58.245 -rw-r--r-- vagrant/vagrant 6459 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_aarch64_x1.S 00:46:58.245 -rw-r--r-- vagrant/vagrant 6578 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_asimd_common.S 00:46:58.245 -rw-r--r-- vagrant/vagrant 8844 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_ctx_asimd.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 8801 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_ctx_ce.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 3303 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_mb_aarch64_dispatcher.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 5247 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_mb_asimd_x4.S 00:46:58.245 -rw-r--r-- vagrant/vagrant 6825 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_mb_mgr_asimd.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 5815 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_mb_mgr_ce.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 1833 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_mb_multibinary.S 00:46:58.245 -rw-r--r-- vagrant/vagrant 5828 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_mb_x1_ce.S 00:46:58.245 -rw-r--r-- vagrant/vagrant 7942 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha1_mb/aarch64/sha1_mb_x2_ce.S 00:46:58.245 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/ 00:46:58.245 -rw-r--r-- vagrant/vagrant 4876 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/Makefile.am 00:46:58.245 -rw-r--r-- vagrant/vagrant 9209 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_avx.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 9232 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_avx2.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 9337 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_avx512.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 9849 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_avx512_ni.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 11339 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_base.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 2535 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_base_aliases.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 8837 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_sse.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 8974 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ctx_sse_ni.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 2978 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_job.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 4729 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_flush_test.c 00:46:58.245 -rw-r--r-- vagrant/vagrant 3278 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_datastruct.asm 00:46:58.245 -rw-r--r-- vagrant/vagrant 7694 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_flush_avx.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 8551 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_flush_avx2.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 8874 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_flush_avx512.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 9161 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_flush_avx512_ni.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 7663 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_flush_sse.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 7888 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_flush_sse_ni.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 1971 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_init_avx2.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 1981 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_init_avx512.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 1967 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_init_sse.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 7758 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_submit_avx.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 7286 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_submit_avx2.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 8076 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_submit_avx512.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 7721 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_submit_sse.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 9054 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_mgr_submit_sse_ni.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 4810 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_rand_ssl_test.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 5809 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_rand_test.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 8903 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_rand_update_test.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 9700 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_test.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 4399 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_vs_ossl_perf.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 4532 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_vs_ossl_shortage_perf.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 35677 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_x16_avx512.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 11519 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_x4_avx.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 11251 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_x4_sse.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 19082 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_mb_x8_avx2.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 5628 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_multibinary.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 11299 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ni_x1.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 17683 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ni_x2.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 19978 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_opt_x1.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 7340 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/sha256_ref.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.dirstamp 00:46:58.246 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/ 00:46:58.246 -rw-r--r-- vagrant/vagrant 1436 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_mgr_init_avx512.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 14307 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_avx512.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 1430 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_mgr_init_sse.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 14301 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_sse.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 1432 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_mgr_init_avx2.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 14303 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_avx2.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 14307 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_sse_ni.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_base_aliases.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 14313 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_avx512_ni.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_flush_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_rand_ssl_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_rand_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_rand_update_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_vs_ossl_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_mb_vs_ossl_shortage_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ref.Plo 
00:46:58.246 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/.dirstamp 00:46:58.246 -rw-r--r-- vagrant/vagrant 14301 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_avx.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 14303 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/.deps/sha256_ctx_base.Plo 00:46:58.246 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/ 00:46:58.246 -rw-r--r-- vagrant/vagrant 8965 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_ctx_ce.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 2396 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_mb_aarch64_dispatcher.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 7365 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_mb_mgr_ce.c 00:46:58.246 -rw-r--r-- vagrant/vagrant 1839 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_mb_multibinary.S 00:46:58.246 -rw-r--r-- vagrant/vagrant 6145 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_mb_x1_ce.S 00:46:58.246 -rw-r--r-- vagrant/vagrant 7777 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_mb_x2_ce.S 00:46:58.246 -rw-r--r-- vagrant/vagrant 9495 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_mb_x3_ce.S 00:46:58.246 -rw-r--r-- vagrant/vagrant 10488 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/sha256_mb_x4_ce.S 00:46:58.246 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/ 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_ctx_ce.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_mb_aarch64_dispatcher.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_mb_mgr_ce.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_mb_multibinary.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_mb_x1_ce.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_mb_x2_ce.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_mb_x3_ce.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha256_mb/aarch64/.deps/sha256_mb_x4_ce.Plo 00:46:58.246 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/ 00:46:58.246 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/ 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/cbc_ossl_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/cbc_std_vectors_random_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/cbc_std_vectors_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/gcm_nt_rand_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/aes/.deps/gcm_nt_std_vectors_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/gcm_ossl_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/gcm_simple_example.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/gcm_std_vectors_random_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/gcm_std_vectors_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_dec_ossl_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_dec_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_enc_ossl_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_enc_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_expanded_key_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_rand.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_rand_ossl_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_128_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_dec_ossl_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_dec_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_enc_ossl_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_enc_perf.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_expanded_key_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_rand.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_rand_ossl_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/xts_256_test.Po 00:46:58.246 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/.dirstamp 00:46:58.246 -rw-r--r-- vagrant/vagrant 1232 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/cbc_pre.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 1232 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.deps/gcm_pre.Plo 00:46:58.246 -rw-r--r-- vagrant/vagrant 8890 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/Makefile.am 00:46:58.246 -rw-r--r-- vagrant/vagrant 51249 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_dec_avx.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 48398 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_dec_expanded_key_avx.asm 00:46:58.246 -rw-r--r-- vagrant/vagrant 47891 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_dec_expanded_key_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 44749 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_dec_expanded_key_vaes.asm 00:46:58.247 -rw-r--r-- 
vagrant/vagrant 50728 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_dec_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 47508 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_dec_vaes.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 44539 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_enc_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 42936 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_enc_expanded_key_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 42488 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_enc_expanded_key_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 39396 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_enc_expanded_key_vaes.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 44086 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_enc_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 40906 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_128_enc_vaes.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 56066 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_dec_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 51991 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_dec_expanded_key_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 51398 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_dec_expanded_key_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 48348 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_dec_expanded_key_vaes.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 55446 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_dec_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 52401 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_dec_vaes.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 50036 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_enc_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 46528 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_enc_expanded_key_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 45991 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_enc_expanded_key_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 43037 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_enc_expanded_key_vaes.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 49487 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_enc_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 46465 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/XTS_AES_256_enc_vaes.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 16406 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aes_common.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 14108 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_common.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 5311 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_dec_128_x4_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 5175 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_dec_128_x8_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 5362 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_dec_192_x4_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 5139 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_dec_192_x8_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 5316 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_dec_256_x4_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 5139 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_dec_256_x8_avx.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 
20695 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_dec_vaes_avx512.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 4327 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_enc_128_x4_sb.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 4607 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_enc_128_x8_sb.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 4627 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_enc_192_x4_sb.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 4576 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_enc_192_x8_sb.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 4489 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_enc_256_x4_sb.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 4600 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_enc_256_x8_sb.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 3704 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_multibinary.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 9214 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_ossl_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 2382 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_pre.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 20933 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_std_vectors.h 00:46:58.247 -rw-r--r-- vagrant/vagrant 12117 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_std_vectors_random_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 5341 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/cbc_std_vectors_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 5114 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/clear_regs.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1782 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_avx_gen2.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1826 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_avx_gen2_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1782 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_avx_gen4.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1826 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_avx_gen4_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1777 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1821 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_sse_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1818 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_vaes_avx512.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1830 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm128_vaes_avx512_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1782 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_avx_gen2.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1826 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_avx_gen2_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1782 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_avx_gen4.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1826 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_avx_gen4_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1777 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1821 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_sse_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1818 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_vaes_avx512.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 1830 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm256_vaes_avx512_nt.asm 00:46:58.247 -rw-r--r-- 
vagrant/vagrant 73634 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_avx_gen2.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 119538 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_avx_gen4.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 12623 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_defines.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 13403 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_keys_vaes_avx512.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 9267 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_multibinary.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 5904 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_multibinary_nt.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 61438 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_nt_rand_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 10692 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_nt_std_vectors_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 8507 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_ossl_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 2614 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_pre.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 3082 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_simple_example.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 77271 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_sse.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 57411 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_std_vectors_random_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 21732 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_std_vectors_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 181715 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_vaes_avx512.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 23718 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/gcm_vectors.h 00:46:58.247 -rw-r--r-- vagrant/vagrant 11351 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/keyexp_128.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 8811 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/keyexp_192.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 9744 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/keyexp_256.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 2798 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/keyexp_multibinary.asm 00:46:58.247 -rw-r--r-- vagrant/vagrant 14698 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/ossl_helper.h 00:46:58.247 -rw-r--r-- vagrant/vagrant 5071 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_dec_ossl_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 3984 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_dec_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 4971 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_enc_ossl_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 3886 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_enc_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 4042 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_expanded_key_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 8030 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_rand.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 7872 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_rand_ossl_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 3387 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 75617 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_128_vect.h 00:46:58.247 -rw-r--r-- vagrant/vagrant 5118 
2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_dec_ossl_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 4030 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_dec_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 5000 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_enc_ossl_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 3932 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_enc_perf.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 3937 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_expanded_key_test.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 8112 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_rand.c 00:46:58.247 -rw-r--r-- vagrant/vagrant 7923 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_rand_ossl_test.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 3311 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_test.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 50308 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_256_vect.h 00:46:58.248 -rw-r--r-- vagrant/vagrant 3635 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_aes_128_multibinary.asm 00:46:58.248 -rw-r--r-- vagrant/vagrant 3635 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/xts_aes_256_multibinary.asm 00:46:58.248 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/.dirstamp 00:46:58.248 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/ 00:46:58.248 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/ 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_aes_finalize_128.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_aes_finalize_256.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_aes_init.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_consts.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_enc_dec_128.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_enc_dec_256.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_precomp_128.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_precomp_256.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_update_128.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/aes_gcm_update_256.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/cbc_aarch64_dispatcher.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/cbc_dec_aes.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/cbc_enc_aes.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/cbc_multibinary_aarch64.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/gcm_aarch64_dispatcher.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/gcm_multibinary_aarch64.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/keyexp_128_aarch64_aes.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/keyexp_192_aarch64_aes.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/keyexp_256_aarch64_aes.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/keyexp_aarch64_dispatcher.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/keyexp_multibinary_aarch64.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_aarch64_dispatcher.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_aes_128_dec.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_aes_128_enc.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_aes_256_dec.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_aes_256_enc.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_keyexp_aes_128_dec.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_keyexp_aes_128_enc.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_keyexp_aes_256_dec.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_keyexp_aes_256_enc.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/.deps/xts_multibinary_aarch64.Plo 00:46:58.248 -rw-r--r-- vagrant/vagrant 4394 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/cbc_aarch64_dispatcher.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 2481 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/cbc_common.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 14748 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/cbc_dec_aes.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 5207 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/cbc_enc_aes.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 1940 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/cbc_multibinary_aarch64.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 6352 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_aarch64_dispatcher.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 18160 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_common.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 6813 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_common_128.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 7482 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_common_256.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 26050 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_enc_dec.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 2698 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_multibinary_aarch64.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 4054 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_precomp.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 11573 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/gcm_update.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 4780 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/keyexp_128_aarch64_aes.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 5003 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/keyexp_192_aarch64_aes.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 5550 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/keyexp_256_aarch64_aes.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 2805 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/keyexp_aarch64_dispatcher.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 1869 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/keyexp_multibinary_aarch64.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 3457 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aarch64_dispatcher.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 6408 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aes_128_common.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 3918 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aes_128_dec.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 3390 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aes_128_enc.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 7388 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aes_256_common.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 4041 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aes_256_dec.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 3355 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aes_256_enc.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 6054 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_aes_common.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 2522 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_keyexp_aes_128_dec.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 2486 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_keyexp_aes_128_enc.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 2522 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_keyexp_aes_256_dec.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 2486 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_keyexp_aes_256_enc.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 2061 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/xts_multibinary_aarch64.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 9198 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_aes_finalize_128.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 9423 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_aes_finalize_256.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 6131 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_aes_init.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 10807 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_consts.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 1747 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_enc_dec_128.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 1747 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_enc_dec_256.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 1746 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_precomp_128.S 00:46:58.248 -rw-r--r-- 
vagrant/vagrant 1746 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_precomp_256.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 1748 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_update_128.S 00:46:58.248 -rw-r--r-- vagrant/vagrant 1748 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/aes/aarch64/aes_gcm_update_256.S 00:46:58.248 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/ 00:46:58.248 -rw-r--r-- vagrant/vagrant 4241 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/Makefile.am 00:46:58.248 -rw-r--r-- vagrant/vagrant 9210 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ctx_avx.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 9233 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ctx_avx2.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 9338 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ctx_avx512.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 13100 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ctx_base.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 2535 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ctx_base_aliases.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 8888 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ctx_sb_sse4.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 8831 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ctx_sse.c 00:46:58.248 -rw-r--r-- vagrant/vagrant 2494 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_job.asm 00:46:58.248 -rw-r--r-- vagrant/vagrant 3188 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_datastruct.asm 00:46:58.248 -rw-r--r-- vagrant/vagrant 6757 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_flush_avx.asm 00:46:58.248 -rw-r--r-- vagrant/vagrant 7300 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_flush_avx2.asm 00:46:58.248 -rw-r--r-- vagrant/vagrant 8432 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_flush_avx512.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 6728 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_flush_sse.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 2035 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_init_avx2.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 2042 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_init_avx512.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 1989 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_init_sse.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 7709 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_submit_avx.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 8001 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_submit_avx2.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 8902 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_submit_avx512.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 7662 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_mgr_submit_sse.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 4818 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_rand_ssl_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 5821 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_rand_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 8907 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_rand_update_test.c 00:46:58.249 
-rw-r--r-- vagrant/vagrant 10662 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 4402 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_vs_ossl_perf.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 13018 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_x2_avx.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 12508 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_x2_sse.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 17670 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_x4_avx2.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 31547 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_mb_x8_avx512.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 7747 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_multibinary.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 9388 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_ref.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 2006 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_sb_mgr_flush_sse4.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 1929 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_sb_mgr_init_sse4.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 2748 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_sb_mgr_submit_sse4.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 13725 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/sha512_sse4.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.dirstamp 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/ 00:46:58.249 -rw-r--r-- vagrant/vagrant 14301 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_ctx_sse.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 1430 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_mgr_init_sse.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_ctx_base_aliases.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 1432 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_mgr_init_avx2.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 1432 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_sb_mgr_init_sse4.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 14309 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_ctx_sb_sse4.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_rand_ssl_test.Po 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_rand_test.Po 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_rand_update_test.Po 00:46:58.249 -rw-r--r-- vagrant/vagrant 14301 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_ctx_avx.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 14303 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_ctx_avx2.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 2777 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_sb_mgr_submit_sse4.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 14303 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_ctx_base.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 2775 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_sb_mgr_flush_sse4.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 1436 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_mgr_init_avx512.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 14307 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_ctx_avx512.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_test.Po 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/sha512_mb_vs_ossl_perf.Po 00:46:58.249 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/.deps/.dirstamp 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/ 00:46:58.249 -rw-r--r-- vagrant/vagrant 8965 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/sha512_ctx_ce.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 2396 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/sha512_mb_aarch64_dispatcher.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 5998 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/sha512_mb_mgr_ce.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 1839 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/sha512_mb_multibinary.S 00:46:58.249 -rw-r--r-- vagrant/vagrant 10399 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/sha512_mb_x1_ce.S 00:46:58.249 -rw-r--r-- vagrant/vagrant 13126 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/sha512_mb_x2_ce.S 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/.deps/ 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/.deps/sha512_ctx_ce.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/.deps/sha512_mb_aarch64_dispatcher.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/.deps/sha512_mb_mgr_ce.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/.deps/sha512_mb_multibinary.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/.deps/sha512_mb_x1_ce.Plo 00:46:58.249 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sha512_mb/aarch64/.deps/sha512_mb_x2_ce.Plo 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/autom4te.cache/ 00:46:58.249 -rw-r--r-- vagrant/vagrant 14589 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/autom4te.cache/requests 00:46:58.249 -rw-r--r-- vagrant/vagrant 477498 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/autom4te.cache/output.1 00:46:58.249 -rw-r--r-- vagrant/vagrant 477688 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/autom4te.cache/output.0 00:46:58.249 -rw-r--r-- vagrant/vagrant 124155 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/autom4te.cache/traces.0 00:46:58.249 -rw-r--r-- vagrant/vagrant 51697 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/autom4te.cache/traces.1 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/ 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/ 00:46:58.249 -rw-r--r-- vagrant/vagrant 550 
2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/Makefile 00:46:58.249 -rw-r--r-- vagrant/vagrant 962 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/README.txt 00:46:58.249 -rw-r--r-- vagrant/vagrant 8632 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/aes_thread.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 4836 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/isal_multithread_perf.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 1212 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/isal_multithread_perf.h 00:46:58.249 -rw-r--r-- vagrant/vagrant 5317 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/md5_thread.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 529 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/sha1_thread.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 553 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/sha256_thread.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 559 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/examples/saturation_test/sha512_thread.c 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/ 00:46:58.249 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.dirstamp 00:46:58.249 -rw-r--r-- vagrant/vagrant 4694 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/Makefile.am 00:46:58.249 -rw-r--r-- vagrant/vagrant 9562 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_ctx_avx2.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 9471 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_ctx_avx512.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 9694 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_ctx_base.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 2465 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_ctx_base_aliases.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 2954 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_job.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 4621 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_flush_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 3251 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_mgr_datastruct.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 8024 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_mgr_flush_avx2.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 7974 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_mgr_flush_avx512.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 7257 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_mgr_submit_avx2.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 8360 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_mgr_submit_avx512.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 4846 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_rand_ssl_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 5961 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_rand_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 8864 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_rand_update_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 9617 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 4405 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_vs_ossl_perf.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 4542 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_vs_ossl_shortage_perf.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 32028 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_x16_avx512.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 19542 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_mb_x8_avx2.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 3474 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_multibinary.asm 00:46:58.249 -rw-r--r-- vagrant/vagrant 6065 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_ref_test.c 00:46:58.249 -rw-r--r-- vagrant/vagrant 2089 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/sm3_test_helper.c 00:46:58.249 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/ 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/ 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_aarch64_dispatcher.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_asimd_x1.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_asimd_x4.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_ctx_asimd_aarch64.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_ctx_sm_aarch64.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_ctx_sve.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_mgr_asimd_aarch64.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_mgr_sm_aarch64.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_mgr_sve.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_multibinary_aarch64.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_sm_x1.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_sm_x2.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_sm_x3.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_sm_x4.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/.deps/sm3_mb_sve.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 2963 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_aarch64_dispatcher.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 9457 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_asimd_x1.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 17235 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_asimd_x4.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 8754 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_ctx_asimd_aarch64.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 8500 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_ctx_sm_aarch64.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 8728 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_ctx_sve.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 5538 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_mgr_asimd_aarch64.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 7190 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_mgr_sm_aarch64.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 5775 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_mgr_sve.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 1830 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_multibinary_aarch64.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 7945 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_sm_x1.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 10667 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_sm_x2.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 11117 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_sm_x3.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 13215 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_sm_x4.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 4531 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_mb_sve.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 12504 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/aarch64/sm3_sve_common.S 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/ 00:46:58.250 -rw-r--r-- vagrant/vagrant 14289 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_ctx_avx512.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 14282 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_ctx_base.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 14282 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_ctx_avx2.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_ctx_base_aliases.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_mb_flush_test.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_mb_rand_ssl_test.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_mb_rand_test.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_mb_rand_update_test.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_mb_test.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_mb_vs_ossl_perf.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_mb_vs_ossl_shortage_perf.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/sm3_ref_test.Po 00:46:58.250 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/sm3_mb/.deps/.dirstamp 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/ 00:46:58.250 -rw-r--r-- vagrant/vagrant 9469 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/aarch64_multibinary.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 6363 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/aes_cbc.h 
00:46:58.250 -rw-r--r-- vagrant/vagrant 26222 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/aes_gcm.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 2845 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/aes_keyexp.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 7952 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/aes_xts.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 2957 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/datastruct.asm 00:46:58.250 -rw-r--r-- vagrant/vagrant 3312 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/endian_helper.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 2336 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/intrinreg.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 15038 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/md5_mb.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 17160 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/memcpy.asm 00:46:58.250 -rw-r--r-- vagrant/vagrant 11172 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/memcpy_inline.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 10567 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/mh_sha1.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 12686 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/mh_sha1_murmur3_x64_128.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 10796 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/mh_sha256.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 3738 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/multi_buffer.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 12776 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/multibinary.asm 00:46:58.250 -rw-r--r-- vagrant/vagrant 10959 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/reg_sizes.asm 00:46:58.250 -rw-r--r-- vagrant/vagrant 4131 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/rolling_hashx.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 17947 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/sha1_mb.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 18328 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/sha256_mb.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 17456 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/sha512_mb.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 5537 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/sm3_mb.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 3392 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/test.h 00:46:58.250 -rw-r--r-- vagrant/vagrant 3686 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/include/types.h 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/ 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/ 00:46:58.250 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/Makefile 00:46:58.250 -rw-r--r-- vagrant/vagrant 2497 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/Makefile.nmake 00:46:58.250 -rw-r--r-- vagrant/vagrant 4933 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/md5_mb_over_4GB_test.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 4957 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/sha1_mb_over_4GB_test.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 5005 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/sha256_mb_over_4GB_test.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 5009 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/sha512_mb_over_4GB_test.c 00:46:58.250 -rw-r--r-- 
vagrant/vagrant 5079 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tests/extended/sm3_mb_over_4GB_test.c 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/ 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/ 00:46:58.250 -rw-r--r-- vagrant/vagrant 8156 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_ctx_aarch64_asimd.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 8039 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_ctx_aarch64_sve.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 8052 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_ctx_aarch64_sve2.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 3234 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_aarch64_dispatcher.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 7583 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_asimd_x1.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 12483 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_asimd_x4.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 5540 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_mgr_aarch64_asimd.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 6140 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_mgr_aarch64_sve.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 6145 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_mgr_aarch64_sve2.c 00:46:58.250 -rw-r--r-- vagrant/vagrant 1830 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_multibinary.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 5048 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_mb_sve.S 00:46:58.250 -rw-r--r-- vagrant/vagrant 10904 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/md5_sve_common.S 00:46:58.250 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/ 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_ctx_aarch64_asimd.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_ctx_aarch64_sve.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_ctx_aarch64_sve2.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_aarch64_dispatcher.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_asimd_x1.Plo 00:46:58.250 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_asimd_x4.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_mgr_aarch64_asimd.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_mgr_aarch64_sve.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_mgr_aarch64_sve2.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_multibinary.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/aarch64/.deps/md5_mb_sve.Plo 00:46:58.251 drwxr-xr-x vagrant/vagrant 0 2024-06-07 
12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/ 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_ctx_base_aliases.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 14118 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_ctx_avx2.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_rand_ssl_test.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_rand_test.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_rand_update_test.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_test.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_vs_ossl_perf.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/.dirstamp 00:46:58.251 -rw-r--r-- vagrant/vagrant 1300 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_mgr_init_sse.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 14116 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_ctx_sse.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 1302 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_mgr_init_avx2.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 14116 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_ctx_avx.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 1306 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_mb_mgr_init_avx512.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 14170 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_ctx_base.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 14125 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.deps/md5_ctx_avx512.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/.dirstamp 00:46:58.251 -rw-r--r-- vagrant/vagrant 3978 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/Makefile.am 00:46:58.251 -rw-r--r-- vagrant/vagrant 8989 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_ctx_avx.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 9009 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_ctx_avx2.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 9088 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_ctx_avx512.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 10890 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_ctx_base.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 2409 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_ctx_base_aliases.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 8615 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_ctx_sse.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 2699 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_job.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 3246 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_datastruct.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 7912 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_flush_avx.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 7942 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_flush_avx2.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 9059 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_flush_avx512.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 7848 2024-06-07 
12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_flush_sse.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 1968 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_init_avx2.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 2108 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_init_avx512.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 1959 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_init_sse.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 7373 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_submit_avx.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 7856 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_submit_avx2.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 8162 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_submit_avx512.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 7307 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_mgr_submit_sse.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 5038 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_rand_ssl_test.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 5723 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_rand_test.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 8684 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_rand_update_test.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 9477 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_test.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 4642 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_vs_ossl_perf.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 28723 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_x16x2_avx512.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 28212 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_x4x2_avx.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 32109 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_x4x2_sse.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 40487 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_mb_x8x2_avx2.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 3673 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_multibinary.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 6662 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/md5_mb/md5_ref.c 00:46:58.251 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/ 00:46:58.251 -rwxr-xr-x vagrant/vagrant 2412 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/check_format.sh 00:46:58.251 -rw-r--r-- vagrant/vagrant 4740 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/gen_nmake.mk 00:46:58.251 -rwxr-xr-x vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/iindent 00:46:58.251 -rwxr-xr-x vagrant/vagrant 776 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/nasm-filter.sh 00:46:58.251 -rwxr-xr-x vagrant/vagrant 49 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/remove_trailing_whitespace.sh 00:46:58.251 -rwxr-xr-x vagrant/vagrant 1264 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/test_autorun.sh 00:46:58.251 -rwxr-xr-x vagrant/vagrant 1680 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/test_checks.sh 00:46:58.251 -rwxr-xr-x vagrant/vagrant 3394 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/test_extended.sh 00:46:58.251 -rwxr-xr-x vagrant/vagrant 148 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/test_tools.sh 00:46:58.251 -rwxr-xr-x vagrant/vagrant 666 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/tools/yasm-filter.sh 00:46:58.251 
drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/ 00:46:58.251 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/ 00:46:58.251 -rw-r--r-- vagrant/vagrant 2293 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/mh_sha1_aarch64_dispatcher.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 2720 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/mh_sha1_asimd.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 3708 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/mh_sha1_block_asimd.S 00:46:58.251 -rw-r--r-- vagrant/vagrant 13949 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/mh_sha1_block_ce.S 00:46:58.251 -rw-r--r-- vagrant/vagrant 2701 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/mh_sha1_ce.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 1793 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/mh_sha1_multibinary.S 00:46:58.251 -rw-r--r-- vagrant/vagrant 6578 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/sha1_asimd_common.S 00:46:58.251 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/.deps/ 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/.deps/mh_sha1_aarch64_dispatcher.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/.deps/mh_sha1_asimd.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/.deps/mh_sha1_block_asimd.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/.deps/mh_sha1_block_ce.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/.deps/mh_sha1_ce.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/aarch64/.deps/mh_sha1_multibinary.Plo 00:46:58.251 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/ 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_base_aliases.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 1804 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_avx512.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_perf.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_test.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_update_test.Po 00:46:58.251 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/.dirstamp 00:46:58.251 -rw-r--r-- vagrant/vagrant 1689 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_finalize_base.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 1682 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_update_base.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 1676 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/sha1_for_mh_sha1.Plo 00:46:58.251 -rw-r--r-- vagrant/vagrant 1680 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.deps/mh_sha1_block_base.Plo 00:46:58.251 -rw-r--r-- 
vagrant/vagrant 3147 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/Makefile.am 00:46:58.251 -rw-r--r-- vagrant/vagrant 5303 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 3037 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_avx512.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 1999 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_base_aliases.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 12871 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_block_avx.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 12831 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_block_avx2.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 10504 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_block_avx512.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 13450 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_block_base.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 12542 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_block_sse.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 4884 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_finalize_base.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 11600 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_internal.h 00:46:58.251 -rw-r--r-- vagrant/vagrant 3152 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_multibinary.asm 00:46:58.251 -rw-r--r-- vagrant/vagrant 5537 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_perf.c 00:46:58.251 -rw-r--r-- vagrant/vagrant 14852 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_ref.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 5861 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_test.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 4248 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_update_base.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 6750 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/mh_sha1_update_test.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 6492 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/sha1_for_mh_sha1.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1/.dirstamp 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/ 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/ 00:46:58.252 -rw-r--r-- vagrant/vagrant 2387 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/mh_sha1_murmur3_aarch64_dispatcher.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 3526 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/mh_sha1_murmur3_aarch64_internal.h 00:46:58.252 -rw-r--r-- vagrant/vagrant 2644 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/mh_sha1_murmur3_asimd.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 6147 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/mh_sha1_murmur3_block_asimd.S 00:46:58.252 -rw-r--r-- vagrant/vagrant 16367 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/mh_sha1_murmur3_block_ce.S 00:46:58.252 -rw-r--r-- vagrant/vagrant 2619 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/mh_sha1_murmur3_ce.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 1824 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/mh_sha1_murmur3_multibinary.S 00:46:58.252 -rw-r--r-- vagrant/vagrant 6652 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/sha1_asimd_common.S 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/.deps/ 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/.deps/mh_sha1_murmur3_aarch64_dispatcher.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/.deps/mh_sha1_murmur3_asimd.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/.deps/mh_sha1_murmur3_block_asimd.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/.deps/mh_sha1_murmur3_block_ce.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/.deps/mh_sha1_murmur3_ce.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/aarch64/.deps/mh_sha1_murmur3_multibinary.Plo 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/ 00:46:58.252 -rw-r--r-- vagrant/vagrant 2233 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128_avx512.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128_base_aliases.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128_perf.Po 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128_test.Po 00:46:58.252 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128_update_test.Po 00:46:58.252 -rw-r--r-- vagrant/vagrant 3818 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128_finalize_base.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 3792 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/murmur3_x64_128_internal.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 2219 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 1983 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/mh_sha1_murmur3_x64_128_update_base.Plo 00:46:58.252 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.deps/.dirstamp 00:46:58.252 -rw-r--r-- vagrant/vagrant 4560 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/Makefile.am 00:46:58.252 -rw-r--r-- vagrant/vagrant 6082 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 2986 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_avx512.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 2192 2024-06-07 12:49 
spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_base_aliases.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 18431 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_block_avx.asm 00:46:58.252 -rw-r--r-- vagrant/vagrant 17038 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_block_avx2.asm 00:46:58.252 -rw-r--r-- vagrant/vagrant 13348 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_block_avx512.asm 00:46:58.252 -rw-r--r-- vagrant/vagrant 17946 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_block_sse.asm 00:46:58.252 -rw-r--r-- vagrant/vagrant 4370 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_finalize_base.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 7805 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_internal.h 00:46:58.252 -rw-r--r-- vagrant/vagrant 3787 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_multibinary.asm 00:46:58.252 -rw-r--r-- vagrant/vagrant 6516 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_perf.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 7369 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_test.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 4235 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_update_base.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 8281 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/mh_sha1_murmur3_x64_128_update_test.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 3551 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/murmur3_x64_128.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 4355 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/murmur3_x64_128_internal.c 00:46:58.252 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/mh_sha1_murmur3_x64_128/.dirstamp 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/ 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/ltmain.sh -> /usr/share/libtool/build-aux/ltmain.sh 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/compile -> /usr/share/automake-1.16/compile 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/config.guess -> /usr/share/automake-1.16/config.guess 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/depcomp -> /usr/share/automake-1.16/depcomp 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/test-driver -> /usr/share/automake-1.16/test-driver 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/config.sub -> /usr/share/automake-1.16/config.sub 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/install-sh -> /usr/share/automake-1.16/install-sh 00:46:58.252 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l-crypto/build-aux/missing -> /usr/share/automake-1.16/missing 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 
2024-06-07 12:49 spdk-test_gen_spec/proto/ 00:46:58.252 -rw-r--r-- vagrant/vagrant 690 2024-06-07 12:49 spdk-test_gen_spec/proto/Makefile 00:46:58.252 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/proto/nvme.proto 00:46:58.252 -rw-r--r-- vagrant/vagrant 1120 2024-06-07 12:49 spdk-test_gen_spec/proto/nvmf.proto 00:46:58.252 -rw-r--r-- vagrant/vagrant 579 2024-06-07 12:49 spdk-test_gen_spec/proto/nvmf_tcp.proto 00:46:58.252 -rw-r--r-- vagrant/vagrant 6480 2024-06-07 12:49 spdk-test_gen_spec/proto/sma.proto 00:46:58.252 -rw-r--r-- vagrant/vagrant 252 2024-06-07 12:49 spdk-test_gen_spec/proto/virtio_blk.proto 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/doc/ 00:46:58.252 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/doc/img/ 00:46:58.252 -rw-r--r-- vagrant/vagrant 33413 2024-06-07 12:49 spdk-test_gen_spec/doc/img/iscsi.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 21621 2024-06-07 12:49 spdk-test_gen_spec/doc/img/iscsi_example.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 19155 2024-06-07 12:49 spdk-test_gen_spec/doc/img/lvol_esnap_clone.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 13204 2024-06-07 12:49 spdk-test_gen_spec/doc/img/lvol_inflate_clone_snapshot.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 11794 2024-06-07 12:49 spdk-test_gen_spec/doc/img/lvol_thin_provisioning.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 10922 2024-06-07 12:49 spdk-test_gen_spec/doc/img/lvol_thin_provisioning_write.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 12377 2024-06-07 12:49 spdk-test_gen_spec/doc/img/nvme_cuse.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 12523 2024-06-07 12:49 spdk-test_gen_spec/doc/img/lvol_clone_snapshot_read.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 13122 2024-06-07 12:49 spdk-test_gen_spec/doc/img/lvol_clone_snapshot_write.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 17749 2024-06-07 12:49 spdk-test_gen_spec/doc/img/qemu_vhost_data_flow.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 5716 2024-06-07 12:49 spdk-test_gen_spec/doc/img/ublk_service.svg 00:46:58.252 -rw-r--r-- vagrant/vagrant 95 2024-06-07 12:49 spdk-test_gen_spec/doc/.gitignore 00:46:58.252 -rw-r--r-- vagrant/vagrant 105096 2024-06-07 12:49 spdk-test_gen_spec/doc/Doxyfile 00:46:58.252 -rw-r--r-- vagrant/vagrant 672 2024-06-07 12:49 spdk-test_gen_spec/doc/Makefile 00:46:58.252 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/doc/README.md 00:46:58.252 -rw-r--r-- vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/doc/index.md 00:46:58.252 -rw-r--r-- vagrant/vagrant 187 2024-06-07 12:49 spdk-test_gen_spec/doc/intro.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 537 2024-06-07 12:49 spdk-test_gen_spec/doc/ioat.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 11025 2024-06-07 12:49 spdk-test_gen_spec/doc/iscsi.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 296808 2024-06-07 12:49 spdk-test_gen_spec/doc/jsonrpc.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 2236 2024-06-07 12:49 spdk-test_gen_spec/doc/jsonrpc_proxy.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 13249 2024-06-07 12:49 spdk-test_gen_spec/doc/libraries.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 10437 2024-06-07 12:49 spdk-test_gen_spec/doc/lvol.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 6867 2024-06-07 12:49 spdk-test_gen_spec/doc/memory.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 109 2024-06-07 12:49 spdk-test_gen_spec/doc/misc.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 1794 2024-06-07 12:49 spdk-test_gen_spec/doc/notify.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 18977 2024-06-07 12:49 
spdk-test_gen_spec/doc/nvme.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 9824 2024-06-07 12:49 spdk-test_gen_spec/doc/nvme_multipath.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 7568 2024-06-07 12:49 spdk-test_gen_spec/doc/nvme_spec.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 16860 2024-06-07 12:49 spdk-test_gen_spec/doc/nvmf.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 3552 2024-06-07 12:49 spdk-test_gen_spec/doc/nvmf_multipath_howto.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 9340 2024-06-07 12:49 spdk-test_gen_spec/doc/nvmf_tgt_pg.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 13342 2024-06-07 12:49 spdk-test_gen_spec/doc/nvmf_tracing.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 4466 2024-06-07 12:49 spdk-test_gen_spec/doc/overview.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 2782 2024-06-07 12:49 spdk-test_gen_spec/doc/peer_2_peer.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 11179 2024-06-07 12:49 spdk-test_gen_spec/doc/performance_reports.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 2204 2024-06-07 12:49 spdk-test_gen_spec/doc/pkgconfig.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 952 2024-06-07 12:49 spdk-test_gen_spec/doc/porting.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/doc/prog_guides.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 2540 2024-06-07 12:49 spdk-test_gen_spec/doc/rpm.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 5271 2024-06-07 12:49 spdk-test_gen_spec/doc/scheduler.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 3751 2024-06-07 12:49 spdk-test_gen_spec/doc/shfmt.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 7358 2024-06-07 12:49 spdk-test_gen_spec/doc/sma.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 5063 2024-06-07 12:49 spdk-test_gen_spec/doc/spdk_top.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 1635 2024-06-07 12:49 spdk-test_gen_spec/doc/spdkcli.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 5978 2024-06-07 12:49 spdk-test_gen_spec/doc/ssd_internals.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 27372 2024-06-07 12:49 spdk-test_gen_spec/doc/stylesheet.css 00:46:58.253 -rw-r--r-- vagrant/vagrant 5474 2024-06-07 12:49 spdk-test_gen_spec/doc/system_configuration.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 4728 2024-06-07 12:49 spdk-test_gen_spec/doc/template_pg.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 77 2024-06-07 12:49 spdk-test_gen_spec/doc/tools.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 123587 2024-06-07 12:49 spdk-test_gen_spec/doc/two.min.js 00:46:58.253 -rw-r--r-- vagrant/vagrant 8895 2024-06-07 12:49 spdk-test_gen_spec/doc/ublk.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 7926 2024-06-07 12:49 spdk-test_gen_spec/doc/usdt.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 329 2024-06-07 12:49 spdk-test_gen_spec/doc/user_guides.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 6010 2024-06-07 12:49 spdk-test_gen_spec/doc/userspace.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 6926 2024-06-07 12:49 spdk-test_gen_spec/doc/vagrant.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 13316 2024-06-07 12:49 spdk-test_gen_spec/doc/vhost.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 9046 2024-06-07 12:49 spdk-test_gen_spec/doc/vhost_processing.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 1533 2024-06-07 12:49 spdk-test_gen_spec/doc/virtio.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 3419 2024-06-07 12:49 spdk-test_gen_spec/doc/vmd.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 1967 2024-06-07 12:49 spdk-test_gen_spec/doc/about.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 9880 2024-06-07 12:49 spdk-test_gen_spec/doc/accel_fw.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 8103 2024-06-07 12:49 
spdk-test_gen_spec/doc/applications.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 5765 2024-06-07 12:49 spdk-test_gen_spec/doc/backporting.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 29464 2024-06-07 12:49 spdk-test_gen_spec/doc/bdev.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 9200 2024-06-07 12:49 spdk-test_gen_spec/doc/bdev_module.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 7651 2024-06-07 12:49 spdk-test_gen_spec/doc/bdev_pg.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 2448 2024-06-07 12:49 spdk-test_gen_spec/doc/bdevperf.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 32388 2024-06-07 12:49 spdk-test_gen_spec/doc/blob.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 3712 2024-06-07 12:49 spdk-test_gen_spec/doc/blobfs.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 180 2024-06-07 12:49 spdk-test_gen_spec/doc/ci_tools.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 15453 2024-06-07 12:49 spdk-test_gen_spec/doc/compression.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 199 2024-06-07 12:49 spdk-test_gen_spec/doc/concepts.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 11778 2024-06-07 12:49 spdk-test_gen_spec/doc/concurrency.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 4185 2024-06-07 12:49 spdk-test_gen_spec/doc/containers.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 1461 2024-06-07 12:49 spdk-test_gen_spec/doc/distributions.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 117 2024-06-07 12:49 spdk-test_gen_spec/doc/driver_modules.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 4665 2024-06-07 12:49 spdk-test_gen_spec/doc/event.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 5009 2024-06-07 12:49 spdk-test_gen_spec/doc/fips.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 7 2024-06-07 12:49 spdk-test_gen_spec/doc/footer.html 00:46:58.253 -rw-r--r-- vagrant/vagrant 10991 2024-06-07 12:49 spdk-test_gen_spec/doc/ftl.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 8913 2024-06-07 12:49 spdk-test_gen_spec/doc/gdb_macros.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 135 2024-06-07 12:49 spdk-test_gen_spec/doc/general.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 3185 2024-06-07 12:49 spdk-test_gen_spec/doc/getting_started.md 00:46:58.253 -rw-r--r-- vagrant/vagrant 1024 2024-06-07 12:49 spdk-test_gen_spec/doc/header.html 00:46:58.253 -rw-r--r-- vagrant/vagrant 1048 2024-06-07 12:49 spdk-test_gen_spec/doc/idxd.md 00:46:58.253 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/ 00:46:58.253 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/.github/ 00:46:58.253 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/.github/workflows/ 00:46:58.253 -rw-r--r-- vagrant/vagrant 1623 2024-06-07 12:49 spdk-test_gen_spec/isa-l/.github/workflows/ci.yml 00:46:58.253 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/ 00:46:58.253 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/ 00:46:58.253 -rw-r--r-- vagrant/vagrant 2616 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/Makefile.am 00:46:58.253 -rw-r--r-- vagrant/vagrant 189 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/Makefile.unx 00:46:58.253 -rw-r--r-- vagrant/vagrant 2338 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/igzip_checked_inflate_fuzz_test.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 897 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/igzip_dump_inflate_corpus.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 913 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/igzip_fuzz_inflate.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 489 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/tests/fuzz/igzip_simple_inflate_fuzz_test.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 3512 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/igzip_simple_round_trip_fuzz_test.c 00:46:58.253 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/.deps/ 00:46:58.253 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/.deps/igzip_dump_inflate_corpus.Po 00:46:58.253 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tests/fuzz/.deps/igzip_fuzz_inflate.Po 00:46:58.253 -rw-r--r-- vagrant/vagrant 240037 2024-06-07 12:49 spdk-test_gen_spec/isa-l/Makefile.in 00:46:58.253 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/ 00:46:58.253 -rw-r--r-- vagrant/vagrant 3566 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/Makefile.am 00:46:58.253 -rw-r--r-- vagrant/vagrant 16273 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_01.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 16470 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_02.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 16196 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_by16_10.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 13785 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_by4.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 14602 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_copy_by4.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 14753 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_copy_by4_02.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 2924 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_copy_perf.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 5473 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_copy_test.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 3917 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_op_perf.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 2802 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_perf.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 5848 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc16_t10dif_test.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 9302 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_funcs_test.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 15991 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_gzip_refl_by16_10.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 18436 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_gzip_refl_by8.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 15070 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_gzip_refl_by8_02.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 3118 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_gzip_refl_perf.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 16184 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_ieee_01.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 16389 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_ieee_02.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 16720 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_ieee_by16_10.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 12256 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_ieee_by4.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 2796 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_ieee_perf.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 26431 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_iscsi_00.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 17326 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_iscsi_01.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 15253 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/crc/crc32_iscsi_by16_10.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 2799 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc32_iscsi_perf.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 52351 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_base.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 2577 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_ecma_norm_by16_10.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 2424 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_ecma_norm_by8.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 2577 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_ecma_refl_by16_10.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 2427 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_ecma_refl_by8.asm 00:46:58.253 -rw-r--r-- vagrant/vagrant 2579 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_example.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 3758 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_funcs_perf.c 00:46:58.253 -rw-r--r-- vagrant/vagrant 9550 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_funcs_test.c 00:46:58.254 -rw-r--r-- vagrant/vagrant 13953 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_iso_norm_by16_10.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 15269 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_iso_norm_by8.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 13523 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_iso_refl_by16_10.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 16779 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_iso_refl_by8.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 2578 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_jones_norm_by16_10.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 2410 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_jones_norm_by8.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 2578 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_jones_refl_by16_10.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 2410 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_jones_refl_by8.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 4422 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_multibinary.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 5288 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_ref.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 2576 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_rocksoft_norm_by16_10.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 2408 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_rocksoft_norm_by8.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 2576 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_rocksoft_refl_by16_10.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 2408 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc64_rocksoft_refl_by8.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 14747 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc_base.c 00:46:58.254 -rw-r--r-- vagrant/vagrant 3475 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc_base_aliases.c 00:46:58.254 -rw-r--r-- vagrant/vagrant 8329 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc_multibinary.asm 00:46:58.254 -rw-r--r-- vagrant/vagrant 4096 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc_ref.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 2463 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/crc_simple_test.c 00:46:58.254 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.dirstamp 00:46:58.254 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/ 00:46:58.254 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/ 00:46:58.254 
-rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc16_t10dif_copy_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc16_t10dif_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_gzip_refl_3crc_fold.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_gzip_refl_crc_ext.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_gzip_refl_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_ieee_norm_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_iscsi_3crc_fold.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_iscsi_crc_ext.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_iscsi_refl_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_mix_default.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32_mix_neoverse_n1.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32c_mix_default.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc32c_mix_neoverse_n1.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc64_ecma_norm_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc64_ecma_refl_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc64_iso_norm_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc64_iso_refl_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc64_jones_norm_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc64_jones_refl_pmull.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc64_rocksoft.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc_aarch64_dispatcher.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/.deps/crc_multibinary_arm.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 2661 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/Makefile.am 00:46:58.254 -rw-r--r-- vagrant/vagrant 11971 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc16_t10dif_copy_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 11565 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc16_t10dif_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 7965 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_aarch64_common.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 3684 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_common_crc_ext_cortex_a72.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 13367 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_common_mix_neoverse_n1.S 00:46:58.254 
-rw-r--r-- vagrant/vagrant 3060 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_gzip_refl_3crc_fold.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 2501 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_gzip_refl_crc_ext.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 1873 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_gzip_refl_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 5596 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_gzip_refl_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 1873 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_ieee_norm_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 5588 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_ieee_norm_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 3040 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_iscsi_3crc_fold.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 2485 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_iscsi_crc_ext.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 2250 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_iscsi_refl_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 5600 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_iscsi_refl_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 3063 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_mix_default.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 14828 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_mix_default_common.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 2662 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_mix_neoverse_n1.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 4170 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_norm_common_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 3939 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32_refl_common_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 3106 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32c_mix_default.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 2672 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc32c_mix_neoverse_n1.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 1873 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_ecma_norm_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 8638 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_ecma_norm_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 1873 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_ecma_refl_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 8546 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_ecma_refl_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 1871 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_iso_norm_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 8651 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_iso_norm_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 1871 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_iso_refl_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 8547 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_iso_refl_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 1875 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_jones_norm_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 8651 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_jones_norm_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 1875 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_jones_refl_pmull.S 00:46:58.254 -rw-r--r-- vagrant/vagrant 8547 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/crc/aarch64/crc64_jones_refl_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 4102 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_norm_common_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 3949 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_refl_common_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 2004 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc64_rocksoft.c 00:46:58.254 -rw-r--r-- vagrant/vagrant 6491 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc_aarch64_dispatcher.c 00:46:58.254 -rw-r--r-- vagrant/vagrant 8561 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc_common_pmull.h 00:46:58.254 -rw-r--r-- vagrant/vagrant 2089 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/aarch64/crc_multibinary_arm.S 00:46:58.254 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/ 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc16_t10dif_copy_perf.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc16_t10dif_copy_test.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc16_t10dif_op_perf.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc16_t10dif_perf.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc16_t10dif_test.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc32_funcs_test.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc32_gzip_refl_perf.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc32_ieee_perf.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc32_iscsi_perf.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc64_example.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc64_funcs_perf.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc64_funcs_test.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc_base_aliases.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc_simple_test.Po 00:46:58.254 -rw-r--r-- vagrant/vagrant 1180 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc_base.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 1188 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/crc64_base.Plo 00:46:58.254 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/crc/.deps/.dirstamp 00:46:58.254 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/ 00:46:58.254 -rwxr-xr-x vagrant/vagrant 2866 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/check_format.sh 00:46:58.254 -rw-r--r-- vagrant/vagrant 6408 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/gen_nmake.mk 00:46:58.254 -rwxr-xr-x vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/iindent 00:46:58.254 -rwxr-xr-x vagrant/vagrant 917 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/nasm-filter.sh 00:46:58.255 -rwxr-xr-x vagrant/vagrant 49 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/remove_trailing_whitespace.sh 00:46:58.255 -rwxr-xr-x vagrant/vagrant 1541 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/tools/test_autorun.sh 00:46:58.255 -rwxr-xr-x vagrant/vagrant 2989 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/test_checks.sh 00:46:58.255 -rwxr-xr-x vagrant/vagrant 6918 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/test_extended.sh 00:46:58.255 -rwxr-xr-x vagrant/vagrant 4608 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/test_fuzz.sh 00:46:58.255 -rwxr-xr-x vagrant/vagrant 148 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/test_tools.sh 00:46:58.255 -rwxr-xr-x vagrant/vagrant 807 2024-06-07 12:49 spdk-test_gen_spec/isa-l/tools/yasm-filter.sh 00:46:58.255 -rw-r--r-- vagrant/vagrant 30 2024-06-07 12:49 spdk-test_gen_spec/isa-l/.git 00:46:58.255 -rw-r--r-- vagrant/vagrant 233 2024-06-07 12:49 spdk-test_gen_spec/isa-l/.gitignore 00:46:58.255 -rw-r--r-- vagrant/vagrant 2038 2024-06-07 12:49 spdk-test_gen_spec/isa-l/.travis.yml 00:46:58.255 -rw-r--r-- vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/isa-l/CONTRIBUTING.md 00:46:58.255 -rw-r--r-- vagrant/vagrant 1213 2024-06-07 12:49 spdk-test_gen_spec/isa-l/Doxyfile 00:46:58.255 -rw-r--r-- vagrant/vagrant 1600 2024-06-07 12:49 spdk-test_gen_spec/isa-l/LICENSE 00:46:58.255 -rw-r--r-- vagrant/vagrant 4342 2024-06-07 12:49 spdk-test_gen_spec/isa-l/Makefile.am 00:46:58.255 -rw-r--r-- vagrant/vagrant 11138 2024-06-07 12:49 spdk-test_gen_spec/isa-l/Makefile.nmake 00:46:58.255 -rw-r--r-- vagrant/vagrant 2313 2024-06-07 12:49 spdk-test_gen_spec/isa-l/Makefile.unx 00:46:58.255 -rw-r--r-- vagrant/vagrant 4134 2024-06-07 12:49 spdk-test_gen_spec/isa-l/README.md 00:46:58.255 -rw-r--r-- vagrant/vagrant 11148 2024-06-07 12:49 spdk-test_gen_spec/isa-l/Release_notes.txt 00:46:58.255 -rw-r--r-- vagrant/vagrant 430 2024-06-07 12:49 spdk-test_gen_spec/isa-l/SECURITY.md 00:46:58.255 -rwxr-xr-x vagrant/vagrant 443 2024-06-07 12:49 spdk-test_gen_spec/isa-l/autogen.sh 00:46:58.255 -rw-r--r-- vagrant/vagrant 9218 2024-06-07 12:49 spdk-test_gen_spec/isa-l/configure.ac 00:46:58.255 -rw-r--r-- vagrant/vagrant 3738 2024-06-07 12:49 spdk-test_gen_spec/isa-l/isa-l.def 00:46:58.255 -rw-r--r-- vagrant/vagrant 3733 2024-06-07 12:49 spdk-test_gen_spec/isa-l/isa-l.rc 00:46:58.255 -rw-r--r-- vagrant/vagrant 220 2024-06-07 12:49 spdk-test_gen_spec/isa-l/libisal.pc.in 00:46:58.255 -rw-r--r-- vagrant/vagrant 12642 2024-06-07 12:49 spdk-test_gen_spec/isa-l/make.inc 00:46:58.255 -rw-r--r-- vagrant/vagrant 373164 2024-06-07 12:49 spdk-test_gen_spec/isa-l/aclocal.m4 00:46:58.255 -rwxr-xr-x vagrant/vagrant 339189 2024-06-07 12:49 spdk-test_gen_spec/isa-l/libtool 00:46:58.255 -rw-r--r-- vagrant/vagrant 38934 2024-06-07 12:49 spdk-test_gen_spec/isa-l/config.log 00:46:58.255 -rwxr-xr-x vagrant/vagrant 471494 2024-06-07 12:49 spdk-test_gen_spec/isa-l/configure 00:46:58.255 -rwxr-xr-x vagrant/vagrant 55898 2024-06-07 12:49 spdk-test_gen_spec/isa-l/config.status 00:46:58.255 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/doc/ 00:46:58.255 -rw-r--r-- vagrant/vagrant 2626 2024-06-07 12:49 spdk-test_gen_spec/isa-l/doc/build.md 00:46:58.255 -rw-r--r-- vagrant/vagrant 8143 2024-06-07 12:49 spdk-test_gen_spec/isa-l/doc/functions.md 00:46:58.255 -rw-r--r-- vagrant/vagrant 2292 2024-06-07 12:49 spdk-test_gen_spec/isa-l/doc/test.md 00:46:58.255 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ 00:46:58.255 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/ 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/erasure_code/.deps/ec_base_aliases.Plo 00:46:58.255 -rw-r--r-- vagrant/vagrant 2565 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/ec_base.Plo 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/erasure_code_base_perf.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/erasure_code_base_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/erasure_code_perf.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/erasure_code_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/erasure_code_update_perf.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/erasure_code_update_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gen_rs_matrix_limits.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_inverse_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_dot_prod_1tbl.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_dot_prod_base_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_dot_prod_perf.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_dot_prod_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_mad_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_mul_base_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_mul_perf.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/gf_vect_mul_test.Po 00:46:58.255 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/.dirstamp 00:46:58.255 -rw-r--r-- vagrant/vagrant 2076 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.deps/ec_highlevel_func.Plo 00:46:58.255 -rw-r--r-- vagrant/vagrant 6981 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/Makefile.am 00:46:58.255 -rw-r--r-- vagrant/vagrant 8657 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ec_base.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 411786 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ec_base.h 00:46:58.255 -rw-r--r-- vagrant/vagrant 2716 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ec_base_aliases.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 15782 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ec_highlevel_func.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 4513 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ec_multibinary.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 5349 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/erasure_code_base_perf.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 22104 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/erasure_code_base_test.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 7398 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/erasure_code_perf.c 00:46:58.255 -rw-r--r-- 
vagrant/vagrant 22517 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/erasure_code_test.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 10774 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/erasure_code_update_perf.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 28335 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/erasure_code_update_test.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 2978 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gen_rs_matrix_limits.c 00:46:58.255 -rw-r--r-- vagrant/vagrant 8009 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_dot_prod_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8543 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_dot_prod_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9811 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_dot_prod_avx2_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 6106 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_dot_prod_avx512.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 5017 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_dot_prod_avx512_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8017 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_dot_prod_sse.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 6195 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_mad_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 6581 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_mad_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8319 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_mad_avx2_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 6304 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_mad_avx512.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 4924 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_mad_avx512_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 6304 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_2vect_mad_sse.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9268 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_dot_prod_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9972 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_dot_prod_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8900 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_dot_prod_avx2_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 7056 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_dot_prod_avx512.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 5526 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_dot_prod_avx512_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9217 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_dot_prod_sse.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8193 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_mad_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9106 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_mad_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8047 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_mad_avx2_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 7004 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_mad_avx512.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 5476 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_mad_avx512_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8219 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_3vect_mad_sse.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 
11397 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_dot_prod_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 12154 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_dot_prod_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8404 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_dot_prod_avx512.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 6323 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_dot_prod_avx512_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 11390 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_dot_prod_sse.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9590 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_mad_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 10056 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_mad_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 7075 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_mad_avx2_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 7671 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_mad_avx512.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 5997 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_mad_avx512_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9381 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_4vect_mad_sse.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8289 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_dot_prod_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8706 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_dot_prod_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 9537 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_dot_prod_avx512.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 6944 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_dot_prod_avx512_gfni.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8221 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_dot_prod_sse.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 10830 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_mad_avx.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 11022 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_mad_avx2.asm 00:46:58.255 -rw-r--r-- vagrant/vagrant 8007 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_mad_avx2_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 8365 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_mad_avx512.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 6566 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_mad_avx512_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 10640 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_5vect_mad_sse.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 8720 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_dot_prod_avx.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 9176 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_dot_prod_avx2.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 10198 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_dot_prod_avx512.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 7530 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_dot_prod_avx512_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 8643 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_dot_prod_sse.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 11901 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_mad_avx.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 12493 
2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_mad_avx2.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 9353 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_mad_avx512.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 7090 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_mad_avx512_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 11872 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_6vect_mad_sse.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 5299 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_inverse_test.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 4674 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_1tbl.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 6019 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_avx.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 6285 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_avx2.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 8071 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_avx2_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 6169 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_avx512.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 4568 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_avx512_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 7420 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_base_test.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 5097 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_perf.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 6011 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_sse.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 13768 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_dot_prod_test.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 2697 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_gfni.inc 00:46:58.256 -rw-r--r-- vagrant/vagrant 5071 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mad_avx.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 5298 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mad_avx2.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 6838 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mad_avx2_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 5162 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mad_avx512.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 4436 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mad_avx512_gfni.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 5133 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mad_sse.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 14066 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mad_test.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 4649 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mul_avx.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 4483 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mul_base_test.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 3188 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mul_perf.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 4660 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mul_sse.asm 00:46:58.256 -rw-r--r-- vagrant/vagrant 5717 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/gf_vect_mul_test.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/.dirstamp 00:46:58.256 drwxr-xr-x 
vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/ 00:46:58.256 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/ 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/ec_aarch64_dispatcher.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/ec_aarch64_highlevel_func.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/ec_multibinary_arm.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_2vect_dot_prod_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_2vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_2vect_mad_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_2vect_mad_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_3vect_dot_prod_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_3vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_3vect_mad_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_3vect_mad_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_4vect_dot_prod_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_4vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_4vect_mad_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_4vect_mad_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_5vect_dot_prod_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_5vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_5vect_mad_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_5vect_mad_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_6vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_6vect_mad_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_6vect_mad_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_7vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_8vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_vect_dot_prod_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_vect_dot_prod_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_vect_mad_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_vect_mad_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_vect_mul_neon.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/.deps/gf_vect_mul_sve.Plo 00:46:58.256 -rw-r--r-- vagrant/vagrant 3137 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/Makefile.am 00:46:58.256 -rw-r--r-- vagrant/vagrant 4103 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/ec_aarch64_dispatcher.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 9128 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/ec_aarch64_highlevel_func.c 00:46:58.256 -rw-r--r-- vagrant/vagrant 1901 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/ec_multibinary_arm.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 10239 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_2vect_dot_prod_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 4925 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_2vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 11084 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_2vect_mad_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 4271 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_2vect_mad_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 9164 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_3vect_dot_prod_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 5470 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_3vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 10486 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_3vect_mad_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 4884 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_3vect_mad_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 11008 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_4vect_dot_prod_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 5994 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_4vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 12521 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_4vect_mad_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 5392 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_4vect_mad_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 12756 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_5vect_dot_prod_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 6670 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_5vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 14755 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_5vect_mad_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 6037 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_5vect_mad_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 7285 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_6vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 16985 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_6vect_mad_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 6533 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_6vect_mad_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 7877 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_7vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 8538 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_8vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 7775 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_vect_dot_prod_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 3885 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_vect_dot_prod_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 8575 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_vect_mad_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 3520 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_vect_mad_sve.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 7072 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_vect_mul_neon.S 00:46:58.256 -rw-r--r-- vagrant/vagrant 3632 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/aarch64/gf_vect_mul_sve.S 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/ 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/ 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/ec_base_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_2vect_dot_prod_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_2vect_mad_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_3vect_dot_prod_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_3vect_mad_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_4vect_dot_prod_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_4vect_mad_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_5vect_dot_prod_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_5vect_mad_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_6vect_dot_prod_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_6vect_mad_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_vect_dot_prod_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_vect_mad_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/.deps/gf_vect_mul_vsx.Plo 00:46:58.257 -rw-r--r-- vagrant/vagrant 656 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/Makefile.am 00:46:58.257 -rw-r--r-- vagrant/vagrant 2314 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/ec_base_vsx.c 
00:46:58.257 -rw-r--r-- vagrant/vagrant 14099 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/ec_base_vsx.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 2132 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_2vect_dot_prod_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 1895 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_2vect_mad_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 2844 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_3vect_dot_prod_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 2580 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_3vect_mad_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 3555 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_4vect_dot_prod_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 3265 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_4vect_mad_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 4267 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_5vect_dot_prod_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 3950 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_5vect_mad_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 5024 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_6vect_dot_prod_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 4657 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_6vect_mad_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 2053 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_vect_dot_prod_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 1229 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_vect_mad_vsx.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 1829 2024-06-07 12:49 spdk-test_gen_spec/isa-l/erasure_code/ppc64le/gf_vect_mul_vsx.c 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/autom4te.cache/ 00:46:58.257 -rw-r--r-- vagrant/vagrant 14589 2024-06-07 12:49 spdk-test_gen_spec/isa-l/autom4te.cache/requests 00:46:58.257 -rw-r--r-- vagrant/vagrant 474652 2024-06-07 12:49 spdk-test_gen_spec/isa-l/autom4te.cache/output.0 00:46:58.257 -rw-r--r-- vagrant/vagrant 123248 2024-06-07 12:49 spdk-test_gen_spec/isa-l/autom4te.cache/traces.0 00:46:58.257 -rw-r--r-- vagrant/vagrant 474462 2024-06-07 12:49 spdk-test_gen_spec/isa-l/autom4te.cache/output.1 00:46:58.257 -rw-r--r-- vagrant/vagrant 50225 2024-06-07 12:49 spdk-test_gen_spec/isa-l/autom4te.cache/traces.1 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/crc/ 00:46:58.257 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/crc/Makefile 00:46:58.257 -rw-r--r-- vagrant/vagrant 12528 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/crc/crc_combine_example.c 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/ 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/.deps/ 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/.deps/ec_piggyback_example.Po 00:46:58.257 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/.deps/ec_simple_example.Po 00:46:58.257 -rw-r--r-- vagrant/vagrant 505 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/Makefile 00:46:58.257 -rw-r--r-- vagrant/vagrant 1860 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/Makefile.am 
00:46:58.257 -rw-r--r-- vagrant/vagrant 13870 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/ec_piggyback_example.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 8026 2024-06-07 12:49 spdk-test_gen_spec/isa-l/examples/ec/ec_simple_example.c 00:46:58.257 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/ 00:46:58.257 -rw-r--r-- vagrant/vagrant 5495 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/Makefile.am 00:46:58.257 -rw-r--r-- vagrant/vagrant 6568 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/adler32_avx2_4.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 2264 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/adler32_base.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 2709 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/adler32_perf.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 5234 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/adler32_sse.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 2419 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/bitbuf2.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 4163 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/bitbuf2.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 8417 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/checksum32_funcs_test.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 3849 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/checksum_test_ref.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 10738 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/data_struct2.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 963 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/encode_df.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 1145 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/encode_df.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 14839 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/encode_df_04.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 15380 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/encode_df_06.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 798 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/flatten_ll.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 57 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/flatten_ll.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 15984 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/generate_custom_hufftables.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 7484 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/generate_static_inflate.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 3431 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/heap_macros.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 64234 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/huff_codes.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 5408 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/huff_codes.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 7651 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/huffman.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 8367 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/huffman.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 379753 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/hufftables_c.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 59801 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 6169 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_base.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 5839 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_base_aliases.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 19012 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_body.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 871 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_build_hash_table_perf.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 308 2024-06-07 
12:49 spdk-test_gen_spec/isa-l/igzip/igzip_checksums.h 00:46:58.257 -rw-r--r-- vagrant/vagrant 10827 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_compare_types.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 22135 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_decode_block_stateless.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 61 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_decode_block_stateless_01.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 79 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_decode_block_stateless_04.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 3855 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_deflate_hash.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 3225 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_example.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 9332 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_file_perf.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 9236 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_finish.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 20586 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_gen_icf_map_lh1_04.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 15185 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_gen_icf_map_lh1_06.asm 00:46:58.257 -rw-r--r-- vagrant/vagrant 3950 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_hist_perf.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 10333 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_icf_base.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 10836 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_icf_body.c 00:46:58.257 -rw-r--r-- vagrant/vagrant 21600 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_icf_body_h1_gr_bt.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 8691 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_icf_finish.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 75873 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_inflate.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 2236 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_inflate_multibinary.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 8841 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_inflate_test.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 1100 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_level_buf_structs.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 6308 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_multibinary.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 25981 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_perf.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 82351 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_rand_test.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 9326 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_semi_dyn_file_perf.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 8046 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_set_long_icf_fg_04.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 9836 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_set_long_icf_fg_06.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 2981 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_sync_flush_example.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 14353 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_update_histogram.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 107 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_update_histogram_01.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 125 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_update_histogram_04.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 2194 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/igzip/igzip_wrapper.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 20514 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/igzip_wrapper_hdr_test.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 5879 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/inflate_data_structs.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 57015 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/inflate_std_vects.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 3011 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/lz0a_const.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 2577 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/options.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 3461 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/proc_heap.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 2915 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/proc_heap_base.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 2966 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/repeated_char_result.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 5044 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/rfc1951_lookup.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 142279 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/static_inflate.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 10063 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/stdmac.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.dirstamp 00:46:58.258 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/ 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/adler32_perf.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/checksum32_funcs_test.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/generate_custom_hufftables.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/generate_static_inflate.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_base_aliases.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_build_hash_table_perf.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_example.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_file_perf.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_hist_perf.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_inflate_test.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_perf.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_rand_test.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_semi_dyn_file_perf.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_sync_flush_example.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_wrapper_hdr_test.Po 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/proc_heap_base.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 1207 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/hufftables_c.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 14991 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/igzip/.deps/igzip_base.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 15107 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_icf_base.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 1215 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/adler32_base.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 4436 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/flatten_ll.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 15909 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 15976 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/encode_df.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 15107 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_icf_body.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/.dirstamp 00:46:58.258 -rw-r--r-- vagrant/vagrant 15033 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/huff_codes.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 15137 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/.deps/igzip_inflate.Plo 00:46:58.258 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/ 00:46:58.258 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/ 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/encode_df.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/gen_icf_map.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_decode_huffman_code_block_aarch64.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_deflate_body_aarch64.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_deflate_finish_aarch64.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_deflate_hash_aarch64.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_inflate_multibinary_arm64.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_isal_adler32_neon.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_multibinary_aarch64_dispatcher.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_multibinary_arm64.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/igzip_set_long_icf_fg.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/isal_deflate_icf_body_hash_hist.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/isal_deflate_icf_finish_hash_hist.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/.deps/isal_update_histogram.Plo 00:46:58.258 -rw-r--r-- vagrant/vagrant 2443 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/bitbuf2_aarch64.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 7805 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/data_struct_aarch64.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 5603 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/encode_df.S 00:46:58.258 -rw-r--r-- 
vagrant/vagrant 7661 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/gen_icf_map.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 5865 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/huffman_aarch64.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 21088 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_decode_huffman_code_block_aarch64.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 8496 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_deflate_body_aarch64.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 8509 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_deflate_finish_aarch64.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 3414 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_deflate_hash_aarch64.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 1780 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_inflate_multibinary_arm64.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 4754 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_isal_adler32_neon.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 8054 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_multibinary_aarch64_dispatcher.c 00:46:58.258 -rw-r--r-- vagrant/vagrant 2387 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_multibinary_arm64.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 6007 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/igzip_set_long_icf_fg.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 10201 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/isal_deflate_icf_body_hash_hist.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 10464 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/isal_deflate_icf_finish_hash_hist.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 9705 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/isal_update_histogram.S 00:46:58.258 -rw-r--r-- vagrant/vagrant 3037 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/lz0a_const_aarch64.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 2435 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/options_aarch64.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 2297 2024-06-07 12:49 spdk-test_gen_spec/isa-l/igzip/aarch64/stdmac_aarch64.h 00:46:58.258 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/ 00:46:58.258 -rw-r--r-- vagrant/vagrant 361 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/aarch64_label.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 11083 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/aarch64_multibinary.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 7052 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/crc.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 11206 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/crc64.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 38781 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/erasure_code.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 6157 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/gf_vect_mul.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 44914 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/igzip_lib.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 2397 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/mem_routines.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 21345 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/memcpy.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 13112 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/multibinary.asm 00:46:58.258 -rw-r--r-- vagrant/vagrant 10609 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/raid.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 7874 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/reg_sizes.asm 
00:46:58.258 -rw-r--r-- vagrant/vagrant 8966 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/test.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 5654 2024-06-07 12:49 spdk-test_gen_spec/isa-l/include/unaligned.h 00:46:58.258 -rw-r--r-- vagrant/vagrant 232874 2024-06-07 12:49 spdk-test_gen_spec/isa-l/Makefile 00:46:58.258 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/ 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/ltmain.sh -> /usr/share/libtool/build-aux/ltmain.sh 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/compile -> /usr/share/automake-1.16/compile 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/depcomp -> /usr/share/automake-1.16/depcomp 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/test-driver -> /usr/share/automake-1.16/test-driver 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/config.guess -> /usr/share/automake-1.16/config.guess 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/config.sub -> /usr/share/automake-1.16/config.sub 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/install-sh -> /usr/share/automake-1.16/install-sh 00:46:58.259 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/build-aux/missing -> /usr/share/automake-1.16/missing 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/ 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/.deps/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 3821 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/.deps/mem_zero_detect_base.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/.deps/mem_zero_detect_base_aliases.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/.deps/mem_zero_detect_perf.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/.deps/mem_zero_detect_test.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/.deps/.dirstamp 00:46:58.259 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/.dirstamp 00:46:58.259 -rw-r--r-- vagrant/vagrant 2250 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/Makefile.am 00:46:58.259 -rw-r--r-- vagrant/vagrant 2129 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_multibinary.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 4356 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_zero_detect_avx.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 4324 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_zero_detect_avx2.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 3827 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_zero_detect_avx512.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 2421 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_zero_detect_base.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 1887 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_zero_detect_base_aliases.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 2311 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_zero_detect_perf.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 4142 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/mem_zero_detect_sse.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 6812 2024-06-07 12:49 
spdk-test_gen_spec/isa-l/mem/mem_zero_detect_test.c 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/ 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/.deps/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/.deps/mem_aarch64_dispatcher.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/.deps/mem_multibinary_arm.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/.deps/mem_zero_detect_neon.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 1856 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/Makefile.am 00:46:58.259 -rw-r--r-- vagrant/vagrant 2037 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/mem_aarch64_dispatcher.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 1791 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/mem_multibinary_arm.S 00:46:58.259 -rw-r--r-- vagrant/vagrant 5510 2024-06-07 12:49 spdk-test_gen_spec/isa-l/mem/aarch64/mem_zero_detect_neon.S 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 2007 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/Makefile.am 00:46:58.259 -rw-r--r-- vagrant/vagrant 1980 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/igzip.1 00:46:58.259 -rw-r--r-- vagrant/vagrant 646 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/igzip.1.h2m 00:46:58.259 -rw-r--r-- vagrant/vagrant 30747 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/igzip_cli.c 00:46:58.259 -rwxr-xr-x vagrant/vagrant 7084 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/igzip_cli_check.sh 00:46:58.259 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/.dirstamp 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/.deps/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 7264 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/.deps/igzip_cli.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/programs/.deps/.dirstamp 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/ 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 1953 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/Makefile.am 00:46:58.259 -rw-r--r-- vagrant/vagrant 9235 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/pq_check_neon.S 00:46:58.259 -rw-r--r-- vagrant/vagrant 7925 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/pq_gen_neon.S 00:46:58.259 -rw-r--r-- vagrant/vagrant 2719 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/raid_aarch64_dispatcher.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 1853 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/raid_multibinary_arm.S 00:46:58.259 -rw-r--r-- vagrant/vagrant 7132 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/xor_check_neon.S 00:46:58.259 -rw-r--r-- vagrant/vagrant 7112 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/xor_gen_neon.S 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/.deps/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/.deps/pq_check_neon.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/.deps/pq_gen_neon.Plo 00:46:58.259 -rw-r--r-- 
vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/.deps/raid_aarch64_dispatcher.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/.deps/raid_multibinary_arm.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/.deps/xor_check_neon.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/aarch64/.deps/xor_gen_neon.Plo 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/pq_check_test.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/pq_gen_perf.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/pq_gen_test.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/raid_base_aliases.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/xor_check_test.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/xor_example.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/xor_gen_perf.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/xor_gen_test.Po 00:46:58.259 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/.dirstamp 00:46:58.259 -rw-r--r-- vagrant/vagrant 1899 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.deps/raid_base.Plo 00:46:58.259 -rw-r--r-- vagrant/vagrant 2673 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/Makefile.am 00:46:58.259 -rw-r--r-- vagrant/vagrant 7909 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_check_sse.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 7720 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_check_sse_i32.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 8649 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_check_test.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 7563 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_gen_avx.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 7645 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_gen_avx2.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 6981 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_gen_avx512.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 3072 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_gen_perf.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 7551 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_gen_sse.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 7294 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_gen_sse_i32.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 5281 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/pq_gen_test.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 4009 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/raid_base.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 2094 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/raid_base_aliases.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 3581 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/raid_multibinary.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 2386 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/raid_multibinary_i32.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 6771 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_check_sse.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 8106 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_check_test.c 00:46:58.259 
-rw-r--r-- vagrant/vagrant 2917 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_example.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 5917 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_gen_avx.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 5787 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_gen_avx512.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 3100 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_gen_perf.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 7015 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_gen_sse.asm 00:46:58.259 -rw-r--r-- vagrant/vagrant 4639 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/xor_gen_test.c 00:46:58.259 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isa-l/raid/.dirstamp 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/ 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 131 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/__init__.py 00:46:58.259 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/ 00:46:58.259 -rw-r--r-- vagrant/vagrant 6293 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/__init__.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 3341 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/accel.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 3075 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/app.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 62339 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/bdev.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 1603 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/blobfs.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 10616 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/client.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 905 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/cmd_parser.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 385 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/compressdev.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 1430 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/log.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 10481 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/lvol.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 687 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/mlx5.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 621 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/nbd.py 00:46:58.259 -rw-r--r-- vagrant/vagrant 630 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/notify.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 4404 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/nvme.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 21590 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/nvmf.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 3371 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/sock.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 415 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/subsystem.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 2118 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/trace.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 1255 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/ublk.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 2837 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/vfio_user.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 5738 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/vhost.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 524 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/vmd.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 719 2024-06-07 12:49 
spdk-test_gen_spec/python/spdk/rpc/dpdk_cryptodev.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/dsa.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 327 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/env_dpdk.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 568 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/helpers.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 330 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/iaa.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/ioat.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 1006 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/iobuf.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 19229 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/iscsi.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 652 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/rpc/keyring.py 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 810 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/__init__.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 607 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/common.py 00:46:58.260 -rwxr-xr-x vagrant/vagrant 10031 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/qmp.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 2155 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/qos.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 8382 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/sma.py 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/device/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 314 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/device/__init__.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 1158 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/device/device.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 11553 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/device/nvmf_tcp.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 14052 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/device/nvmf_vfiouser.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 10186 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/device/vhost_blk.py 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/proto/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/proto/.gitignore 00:46:58.260 -rw-r--r-- vagrant/vagrant 106 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/proto/__init__.py 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/volume/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 365 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/volume/__init__.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 2612 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/volume/crypto.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 6720 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/volume/crypto_bdev.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 14016 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/sma/volume/volume.py 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/spdkcli/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 135 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/spdkcli/__init__.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 29955 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/spdkcli/ui_node.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 23083 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/spdkcli/ui_node_iscsi.py 
00:46:58.260 -rw-r--r-- vagrant/vagrant 15975 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/spdkcli/ui_node_nvmf.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 19810 2024-06-07 12:49 spdk-test_gen_spec/python/spdk/spdkcli/ui_root.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 1329 2024-06-07 12:49 spdk-test_gen_spec/python/Makefile 00:46:58.260 -rwxr-xr-x vagrant/vagrant 293 2024-06-07 12:49 spdk-test_gen_spec/python/setup.py 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/docker/ 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/docker/build_base/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 1891 2024-06-07 12:49 spdk-test_gen_spec/docker/build_base/Dockerfile 00:46:58.260 -rwxr-xr-x vagrant/vagrant 713 2024-06-07 12:49 spdk-test_gen_spec/docker/build_base/post-install 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/docker/monitoring/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/docker/monitoring/prometheus.yaml 00:46:58.260 -rw-r--r-- vagrant/vagrant 639 2024-06-07 12:49 spdk-test_gen_spec/docker/monitoring/telegraf.conf 00:46:58.260 -rw-r--r-- vagrant/vagrant 4910 2024-06-07 12:49 spdk-test_gen_spec/docker/README.md 00:46:58.260 -rw-r--r-- vagrant/vagrant 1179 2024-06-07 12:49 spdk-test_gen_spec/docker/docker-compose.monitoring.yaml 00:46:58.260 -rw-r--r-- vagrant/vagrant 1929 2024-06-07 12:49 spdk-test_gen_spec/docker/docker-compose.yaml 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/docker/spdk-app/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/docker/spdk-app/Dockerfile 00:46:58.260 -rwxr-xr-x vagrant/vagrant 801 2024-06-07 12:49 spdk-test_gen_spec/docker/spdk-app/init 00:46:58.260 -rw-r--r-- vagrant/vagrant 1912 2024-06-07 12:49 spdk-test_gen_spec/docker/spdk-app/proxy-container.conf 00:46:58.260 -rw-r--r-- vagrant/vagrant 1924 2024-06-07 12:49 spdk-test_gen_spec/docker/spdk-app/storage-target.conf 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/docker/traffic-generator/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/docker/traffic-generator/Dockerfile 00:46:58.260 -rw-r--r-- vagrant/vagrant 473 2024-06-07 12:49 spdk-test_gen_spec/docker/traffic-generator/conf-nvme 00:46:58.260 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/docker/traffic-generator/conf-virtio 00:46:58.260 -rw-r--r-- vagrant/vagrant 203 2024-06-07 12:49 spdk-test_gen_spec/docker/traffic-generator/fio-nvme.conf 00:46:58.260 -rwxr-xr-x vagrant/vagrant 367 2024-06-07 12:49 spdk-test_gen_spec/docker/traffic-generator/init 00:46:58.260 -rw-r--r-- vagrant/vagrant 224 2024-06-07 12:49 spdk-test_gen_spec/docker/traffic-generator/fio-virtio.conf 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isalbuild/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 632 2024-06-07 12:49 spdk-test_gen_spec/isalbuild/Makefile 00:46:58.260 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isalbuild/isa-l -> ../isa-l/include 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/rpmbuild/ 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1274 2024-06-07 12:49 spdk-test_gen_spec/rpmbuild/rpm-deps.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 5370 2024-06-07 12:49 spdk-test_gen_spec/rpmbuild/rpm.sh 00:46:58.260 -rw-r--r-- vagrant/vagrant 5332 2024-06-07 12:49 spdk-test_gen_spec/rpmbuild/spdk.spec 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/ 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/ 00:46:58.260 -rwxr-xr-x vagrant/vagrant 599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/build-dict.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 3722 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/build-tags.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1574 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-abi-version.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 2248 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-abi.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-doc-vs-code.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 497 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-dup-includes.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1766 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-forbidden-tokens.awk 00:46:58.260 -rwxr-xr-x vagrant/vagrant 9124 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-git-log.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 2892 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-maintainers.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 4947 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-meson.py 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-spdx-tag.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 4833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-symbol-change.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 2934 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/check-symbol-maps.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 15599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/checkpatches.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 2435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/dts-check-format.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 895 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/get-maintainer.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 3018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/git-log-fixes.sh 00:46:58.260 -rw-r--r-- vagrant/vagrant 1128 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/libabigail.abignore 00:46:58.260 -rw-r--r-- vagrant/vagrant 511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/load-devel-config 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1892 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/parse-flow-support.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 3973 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/process-iwyu.py 00:46:58.260 -rwxr-xr-x vagrant/vagrant 10007 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/test-meson-builds.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1042 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/test-null.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1048 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/update-abi.sh 00:46:58.260 -rwxr-xr-x vagrant/vagrant 1649 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/update-patches.py 00:46:58.260 -rwxr-xr-x vagrant/vagrant 7054 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/update_version_map_abi.py 00:46:58.260 -rw-r--r-- vagrant/vagrant 533 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/words-case.txt 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 1798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/mtod-offset.cocci 00:46:58.260 -rw-r--r-- 
vagrant/vagrant 1124 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/namespace_ethdev.cocci 00:46:58.260 -rw-r--r-- vagrant/vagrant 2941 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/nullfree.cocci 00:46:58.260 -rw-r--r-- vagrant/vagrant 5018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/prefix_mbuf_offload_flags.cocci 00:46:58.260 -rw-r--r-- vagrant/vagrant 169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/strlcpy-with-header.cocci 00:46:58.260 -rw-r--r-- vagrant/vagrant 107 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/strlcpy.cocci 00:46:58.260 -rw-r--r-- vagrant/vagrant 260 2024-06-07 12:49 spdk-test_gen_spec/dpdk/devtools/cocci/zero_length_array.cocci 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/ 00:46:58.260 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/logo/ 00:46:58.260 -rw-r--r-- vagrant/vagrant 122755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/logo/DPDK_logo_horizontal_tag.png 00:46:58.261 -rw-r--r-- vagrant/vagrant 16055 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/logo/DPDK_logo_vertical_rev_small.png 00:46:58.261 -rw-r--r-- vagrant/vagrant 362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/meson.build 00:46:58.261 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/api/ 00:46:58.261 -rw-r--r-- vagrant/vagrant 158 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/api/custom.css 00:46:58.261 -rw-r--r-- vagrant/vagrant 7287 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/api/doxy-api-index.md 00:46:58.261 -rw-r--r-- vagrant/vagrant 6193 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/api/doxy-api.conf.in 00:46:58.261 -rwxr-xr-x vagrant/vagrant 683 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/api/generate_doxygen.py 00:46:58.261 -rwxr-xr-x vagrant/vagrant 1121 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/api/generate_examples.py 00:46:58.261 -rw-r--r-- vagrant/vagrant 3862 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/api/meson.build 00:46:58.261 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/ 00:46:58.261 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/ 00:46:58.261 -rw-r--r-- vagrant/vagrant 7300 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/af_xdp_cni.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 1285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/avx512.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 14888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/debug_troubleshoot.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 3037 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/flow_bifurcation.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 456 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/index.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 15573 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/lm_bond_virtio_sriov.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 10502 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/lm_virtio_vhost_user.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 4294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/openwrt.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 4328 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/packet_capture_framework.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 10272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/pvp_reference_benchmark.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 8841 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/rte_flow.rst 
00:46:58.261 -rw-r--r-- vagrant/vagrant 4480 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/telemetry.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 10916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/vfd.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 8248 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/virtio_user_as_exception_path.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 4142 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/virtio_user_for_container_networking.rst 00:46:58.261 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/ 00:46:58.261 -rw-r--r-- vagrant/vagrant 2955 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_consumer_ring.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 2035 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_crypto.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 4587 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_distributor_worker.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 2697 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_mempool.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 5670 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_pdump.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 2955 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_producer_ring.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 3228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_qos_tx.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 2501 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_rx_rate.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 4405 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_rx_tx_drop.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 16002 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_sample_app_model.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 1796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/dtg_service.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 24082 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/flow_bifurcation_overview.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 26834 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/lm_bond_virtio_sriov.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 25603 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/lm_vhost_user.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 19203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/packet_capture_framework.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 22465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/pvp_2nics.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 23114 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/use_models_for_running_dpdk_in_containers.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 18641 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/vf_daemon_overview.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 11833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/virtio_user_as_exception_path.svg 00:46:58.261 -rw-r--r-- vagrant/vagrant 22233 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/howto/img/virtio_user_for_container_networking.svg 00:46:58.261 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ 00:46:58.261 -rw-r--r-- vagrant/vagrant 4257 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/bbdev_app.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 4669 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/cmd_line.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 1657 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/compiling.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 6357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/dist_app.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 12001 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/dma.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 3590 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ethtool.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 5145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/eventdev_pipeline.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 5981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/fips_validation.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 8771 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/flow_filtering.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 2613 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/hello_world.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 1000 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/index.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 4687 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/intro.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 5086 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ip_frag.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 19435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ip_pipeline.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 9246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ip_reassembly.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 36607 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ipsec_secgw.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 9664 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ipv4_multicast.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 4736 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/keep_alive.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 7141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l2_forward_cat.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 12132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l2_forward_crypto.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 17187 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l2_forward_event.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 14341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l2_forward_job_stats.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 3867 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l2_forward_macsec.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 12410 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l2_forward_real_virtual.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 20780 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l3_forward.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 11114 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l3_forward_graph.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 16007 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/l3_forward_power_man.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 11442 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/link_status_intr.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 16212 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/multi_process.rst 
00:46:58.261 -rw-r--r-- vagrant/vagrant 2688 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ntb.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 2261 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/packet_ordering.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 3761 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/pipeline.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 6200 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/ptpclient.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 5802 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/qos_metering.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 10695 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/qos_scheduler.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 4225 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/rxtx_callbacks.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 8097 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/server_node_efd.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 3630 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/service_cores.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 5347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/skeleton.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 25681 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/test_pipeline.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 3803 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/timer.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 3845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/vdpa.rst 00:46:58.261 -rw-r--r-- vagrant/vagrant 7751 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/vhost.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 2281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/vhost_blk.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 2910 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/vhost_crypto.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 30399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/vm_power_management.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 7209 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/vmdq_dcb_forwarding.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 5781 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/vmdq_forwarding.rst 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 192400 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/client_svr_sym_multi_proc_app.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 18749 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/dist_app.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 18460 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/dist_perf.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 4342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/example_rules.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 26583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/exception_path_example.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 35653 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/ipsec_endpoints.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 2791 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/ipv4_acl_rule.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 36245 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/kernel_nic.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 28871 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/l2_fwd_benchmark_setup.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 11053 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/l2_fwd_encrypt_flow.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 86633 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/l2_fwd_virtenv_benchmark_setup.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 11403 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/l2_fwd_vm2vm.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 96131 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/load_bal_app_arch.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 80224 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/overlay_networking.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 16728 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/pipeline_overview.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 23058 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/ptpclient.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 65558 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/qos_sched_app_arch.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 30748 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/quickassist_block_diagram.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 32456 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/ring_pipeline_perf_setup.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 75594 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/server_node_efd.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 198226 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/sym_multi_proc_app.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 67410 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/test_pipeline_app.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 15578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/threads_pipelines.png 00:46:58.262 -rw-r--r-- vagrant/vagrant 66249 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/vm_power_mgr_highlevel.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 36223 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/vm_power_mgr_vm_request_seq.svg 00:46:58.262 -rw-r--r-- vagrant/vagrant 37145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/sample_app_ug/img/vmdq_dcb_example.svg 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 6819 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/acc100.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 8500 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/fpga_5gnr_fec.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 7640 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/fpga_lte_fec.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/index.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 3011 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/la12xx.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 1565 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/null.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/overview.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 6445 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/turbo_sw.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 7409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/vrb1.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 8214 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/vrb2.rst 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/acc100.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/acc101.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/default.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 251 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/fpga_5gnr_fec.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 223 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/fpga_lte_fec.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 217 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/la12xx.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 134 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/null.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/turbo_sw.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 365 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/vrb1.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 368 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/bbdevs/features/vrb2.ini 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 2262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/amd_platform.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 8591 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/build_dpdk.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 8139 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/build_sample_apps.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 11159 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/cross_build_dpdk_for_arm64.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 3431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/cross_build_dpdk_for_loongarch.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 3290 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/cross_build_dpdk_for_riscv.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 2057 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/doc_roadmap.include.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 5947 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/eal_args.include.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 7619 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/enable_func.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 477 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/index.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/intro.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 15535 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/linux_drivers.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 3797 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/linux_eal_parameters.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 6287 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/nic_perf_intel_platform.rst 
00:46:58.262 -rw-r--r-- vagrant/vagrant 9815 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/linux_gsg/sys_reqs.rst 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/testpmd_app_ug/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/testpmd_app_ug/build_app.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 269 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/testpmd_app_ug/index.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 644 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/testpmd_app_ug/intro.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 19738 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/testpmd_app_ug/run_app.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 173987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/testpmd_app_ug/testpmd_funcs.rst 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/index.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 6623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/isal.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 2904 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/mlx5.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 1242 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/nitrox.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 2470 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/octeontx.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 1269 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/overview.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 1590 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/qat_comp.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 1522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/zlib.rst 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 883 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/default.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 413 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/isal.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/mlx5.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/nitrox.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/octeontx.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 432 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/qat.ini 00:46:58.262 -rw-r--r-- vagrant/vagrant 212 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/compressdevs/features/zlib.ini 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mempool/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 3142 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mempool/cnxk.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 315 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mempool/index.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 1569 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mempool/octeontx.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 1834 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mempool/ring.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 2011 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mempool/stack.rst 00:46:58.262 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/ 00:46:58.262 -rw-r--r-- vagrant/vagrant 4479 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/comp_perf.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 18047 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/cryptoperf.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 3802 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/devbind.rst 00:46:58.262 -rw-r--r-- vagrant/vagrant 4023 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/dmaperf.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 29904 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/dts.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 2875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/dumpcap.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 11858 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/flow-perf.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 20029 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/graph.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 1475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/hugepages.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/index.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 4865 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/pdump.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 1972 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/pmdinfo.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 5703 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/proc_info.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 1336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/securityperf.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 20455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/testbbdev.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 29961 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/testeventdev.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 14098 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/testmldev.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 2290 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/testregex.rst 00:46:58.263 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/ 00:46:58.263 -rw-r--r-- vagrant/vagrant 62582 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_order_atq_test.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 65288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_order_queue_test.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 114696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_perf_atq_test.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 93685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_perf_queue_test.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 121005 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_pipeline_atq_test_generic.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 116397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_pipeline_atq_test_internal_port.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 131020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_pipeline_queue_test_generic.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 138442 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/eventdev_pipeline_queue_test_internal_port.svg 00:46:58.263 -rw-r--r-- 
vagrant/vagrant 3990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/graph-usecase-l2fwd.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 9775 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/graph-usecase-l3fwd.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 35347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/mldev_inference_interleave.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 27216 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/mldev_inference_ordered.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 22326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/mldev_model_ops_subtest_a.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 22356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/mldev_model_ops_subtest_b.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 19062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/mldev_model_ops_subtest_c.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 22446 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/tools/img/mldev_model_ops_subtest_d.svg 00:46:58.263 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/ 00:46:58.263 -rw-r--r-- vagrant/vagrant 17288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/abi_policy.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 21428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/abi_versioning.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/cheatsheet.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 41678 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/coding_style.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 11105 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/design.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 25692 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/documentation.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/index.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 2138 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/new_library.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 29407 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/patches.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 6338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/stable.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 16400 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/unit_test.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 11172 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/vulnerability.rst 00:46:58.263 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/img/ 00:46:58.263 -rw-r--r-- vagrant/vagrant 48932 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/img/abi_stability_policy.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 78391 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/img/patch_cheatsheet.svg 00:46:58.263 -rw-r--r-- vagrant/vagrant 17779 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/contributing/img/what_is_an_abi.svg 00:46:58.263 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mldevs/ 00:46:58.263 -rw-r--r-- vagrant/vagrant 14188 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mldevs/cnxk.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 300 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/mldevs/index.rst 00:46:58.263 drwxr-xr-x 
vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/ 00:46:58.263 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/features/ 00:46:58.263 -rw-r--r-- vagrant/vagrant 1256 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/features/default.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/features/ifcvf.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 686 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/features/mlx5.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 191 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/features/nfp.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/features/sfc.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 2958 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/features_overview.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 3462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/ifc.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 369 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/index.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 5512 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/mlx5.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 1980 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/nfp.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 3312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/vdpadevs/sfc.rst 00:46:58.263 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/ 00:46:58.263 -rw-r--r-- vagrant/vagrant 3453 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/aesni_gcm.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 5959 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/aesni_mb.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 1918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/armv8.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 3127 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/bcmfs.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 3995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/caam_jr.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 4399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/ccp.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 3306 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/chacha20_poly1305.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 7715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/cnxk.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 7051 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/dpaa2_sec.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 4318 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/dpaa_sec.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 457 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/index.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 4673 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/kasumi.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 8071 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/mlx5.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 3091 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/mvsam.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 1328 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/nitrox.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 2190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/null.rst 00:46:58.263 -rw-r--r-- 
vagrant/vagrant 3880 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/octeontx.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 2961 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/openssl.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 2785 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/overview.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 34364 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/qat.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 7463 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/scheduler.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 4159 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/snow3g.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 3977 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/uadk.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 2731 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/virtio.rst 00:46:58.263 -rw-r--r-- vagrant/vagrant 4044 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/zuc.rst 00:46:58.263 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/ 00:46:58.263 -rw-r--r-- vagrant/vagrant 948 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/aesni_gcm.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 1738 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/aesni_mb.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 721 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/armv8.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 1230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/bcmfs.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 1069 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/caam_jr.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 1311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/ccp.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 844 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/chacha20_poly1305.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 2342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/cn10k.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 2150 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/cn9k.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 2725 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/default.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 1297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/dpaa2_sec.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 1291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/dpaa_sec.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 702 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/kasumi.ini 00:46:58.263 -rw-r--r-- vagrant/vagrant 821 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/mlx5.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1208 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/mvsam.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 985 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/nitrox.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 627 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/null.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1655 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/octeontx.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1380 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/openssl.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2015 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/qat.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 733 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/snow3g.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 945 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/uadk.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 695 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/virtio.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 682 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/features/zuc.ini 00:46:58.264 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/img/ 00:46:58.264 -rw-r--r-- vagrant/vagrant 18246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/cryptodevs/img/scheduler-overview.svg 00:46:58.264 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ 00:46:58.264 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ 00:46:58.264 -rw-r--r-- vagrant/vagrant 329 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/af_xdp.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 150 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/afpacket.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ark.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/atlantic.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 370 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/avp.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/axgbe.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 447 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/bnx2x.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2492 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/bnxt.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2691 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/cnxk.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2369 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/cnxk_vec.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2368 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/cnxk_vf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1096 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/cpfl.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1485 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/cxgbe.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 724 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/cxgbevf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 5201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/default.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 650 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/dpaa.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1235 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/dpaa2.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/e1000.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 697 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ena.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 472 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/nics/features/enetc.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 318 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/enetfec.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1809 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/enic.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 722 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/failsafe.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 922 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/fm10k.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 725 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/fm10k_vf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 486 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/gve.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/hinic.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1819 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/hns3.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1052 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/hns3_vf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2048 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/i40e.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1956 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/iavf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ice.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 817 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ice_dcf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 658 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/idpf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1124 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/igb.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 723 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/igb_vf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1160 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/igc.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 976 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ionic.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1490 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ipn3ke.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ixgbe.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1006 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ixgbe_vf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/mana.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 322 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/memif.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/mlx4.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 3715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/mlx5.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 448 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/mvneta.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 910 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/mvpp2.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 600 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/netvsc.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/nfb.ini 
00:46:58.264 -rw-r--r-- vagrant/vagrant 1833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/nfp.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1227 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/ngbe.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/octeon_ep.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 779 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/octeontx.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/pcap.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/pfe.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1159 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/qede.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/qede_vf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 2170 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/sfc.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 940 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/tap.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 779 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/thunderx.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1889 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/txgbe.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 1002 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/txgbe_vf.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 349 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/vhost.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/virtio.ini 00:46:58.264 -rw-r--r-- vagrant/vagrant 699 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features/vmxnet3.ini 00:46:58.264 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/ 00:46:58.264 -rw-r--r-- vagrant/vagrant 40850 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/console.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 355905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/fast_pkt_proc.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 8849 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/forward_stats.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 16487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/host_vm_comms.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 15383 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/host_vm_comms_qemu.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 54061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/ice_dcf.svg 00:46:58.264 -rw-r--r-- vagrant/vagrant 25386 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/intel_perf_test_setup.svg 00:46:58.264 -rw-r--r-- vagrant/vagrant 370244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/inter_vm_comms.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 4992 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/mvpp2_tm.svg 00:46:58.264 -rw-r--r-- vagrant/vagrant 392248 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/perf_benchmark.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 425314 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/single_port_nic.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 172288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/vm_vm_comms.png 00:46:58.264 
-rw-r--r-- vagrant/vagrant 107542 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/vmxnet3_int.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 123082 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/img/vswitch_vm.png 00:46:58.264 -rw-r--r-- vagrant/vagrant 3060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/af_packet.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 8291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/af_xdp.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 15125 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ark.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 1092 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/atlantic.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 3196 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/avp.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 2133 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/axgbe.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 7062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/bnx2x.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 34761 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/bnxt.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 3992 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/build_and_test.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 25646 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/cnxk.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 6843 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/cpfl.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 29450 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/cxgbe.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 13017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/dpaa.rst 00:46:58.264 -rw-r--r-- vagrant/vagrant 22669 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/dpaa2.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/e1000em.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 11027 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ena.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3616 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/enetc.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 4609 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/enetfec.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 25519 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/enic.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 8981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/fail_safe.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 31785 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/features.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5758 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/fm10k.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3795 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/gve.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1613 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/hinic.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 11408 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/hns3.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 39354 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/i40e.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 20470 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ice.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3094 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/idpf.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 752 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/igb.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3469 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/igc.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 838 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/index.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 33164 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/intel_vf.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1751 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ionic.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3645 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ipn3ke.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 15495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ixgbe.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 2540 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/mana.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 13942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/memif.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 15849 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/mlx4.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 114531 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/mlx5.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3964 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/mvneta.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 22951 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/mvpp2.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5771 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/netvsc.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 4885 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/nfb.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 17225 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/nfp.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1778 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/ngbe.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1185 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/null.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1683 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/octeon_ep.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5146 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/octeontx.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/overview.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 12296 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/pcap_ring.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 6566 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/pfe.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 11981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/qede.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 16532 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/sfc_efx.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 15313 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/softnic.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 12803 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/tap.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 14271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/thunderx.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5559 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/txgbe.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/vdev_netvsc.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3486 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/vhost.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 19012 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/nics/virtio.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 7326 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/nics/vmxnet3.rst 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/windows_gsg/ 00:46:58.265 -rw-r--r-- vagrant/vagrant 4436 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/windows_gsg/build_dpdk.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/windows_gsg/index.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 894 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/windows_gsg/intro.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 2796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/windows_gsg/run_apps.rst 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/ 00:46:58.265 -rw-r--r-- vagrant/vagrant 3239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/cnxk.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1977 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/dpaa.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 2163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/dpaa2.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 1247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/hisilicon.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 10248 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/idxd.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/index.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5057 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/dmadevs/ioat.rst 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/ 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/img/ 00:46:58.265 -rw-r--r-- vagrant/vagrant 121341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/img/cnxk_packet_flow_hw_accelerators.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 108446 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/img/cnxk_resource_virtualization.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 3543 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/bluefield.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 23372 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/cnxk.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 4197 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/dpaa.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3555 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/dpaa2.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/index.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 20966 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/mlx5.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/platform/octeontx.rst 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/ 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/ 00:46:58.265 -rw-r--r-- vagrant/vagrant 937 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/cnxk.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 1498 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/default.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 596 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/dlb2.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 720 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/dpaa.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 783 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/dpaa2.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 612 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/dsw.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/octeontx.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 499 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/opdl.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/features/sw.ini 00:46:58.265 -rw-r--r-- vagrant/vagrant 8198 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/cnxk.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 22548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/dlb2.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 2005 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/dpaa.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 2144 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/dpaa2.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 2770 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/dsw.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 372 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/index.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 3507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/octeontx.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 4209 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/opdl.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 977 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/overview.rst 00:46:58.265 -rw-r--r-- vagrant/vagrant 5040 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/eventdevs/sw.rst 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/ 00:46:58.265 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ 00:46:58.265 -rw-r--r-- vagrant/vagrant 98243 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/anatomy_of_a_node.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 36963 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/architecture-overview.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 55303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/blk_diag_dropper.png 00:46:58.265 -rw-r--r-- vagrant/vagrant 23055 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/bond-mode-0.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 25875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/bond-mode-1.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 25640 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/bond-mode-2.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 25845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/bond-mode-3.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 28182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/bond-mode-4.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 23436 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/bond-mode-5.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 6680 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/bond-overview.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 4533 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/crypto_op.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 11034 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/crypto_xform_chain.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 58769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/data_struct_per_port.png 00:46:58.265 -rw-r--r-- vagrant/vagrant 10728 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/dir_24_8_alg.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 10827 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/dmadev.svg 00:46:58.265 -rw-r--r-- vagrant/vagrant 3205 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/drop_probability_eq3.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 2737 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/drop_probability_eq4.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 62349 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/drop_probability_graph.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 6753 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 25027 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i10.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 20605 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i11.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 65567 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i12.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 15502 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 34794 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i3.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 11625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i4.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 10038 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i5.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 75594 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i6.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 51707 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i7.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 11561 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i8.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 26016 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/efd_i9.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 1614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/eq2_expression.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/eq2_factor.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 44358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/event_crypto_adapter_op_forward.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 43511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/event_crypto_adapter_op_new.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 44450 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/event_dma_adapter_op_forward.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 43890 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/event_dma_adapter_op_new.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 33218 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/eventdev_usage.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 840 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ewma_filter_eq_1.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 1462 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ewma_filter_eq_2.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 32578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ex_data_flow_tru_dropper.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 11603 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/figure32.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 65216 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/figure33.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 11581 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/figure34.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 75012 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/figure35.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 6934 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/figure37.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 7372 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/figure38.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 55986 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/figure39.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 30870 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/flow_tru_dropper.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 13254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/graph_inbuilt_node_flow.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 285798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/graph_mem_layout.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 12391 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/gro-key-algorithm.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 23272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/gso-output-segment-format.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 36328 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/gso-three-seg-mbuf.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 36328 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/hier_sched_blk.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 185839 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/kernel_nic_intf.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 366308 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/kni_traffic_flow.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 159847 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/link_the_nodes.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 27656 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/linuxapp_launch.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 1261 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/m_definition.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 16393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/malloc_heap.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 20020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/mbuf1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 43866 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/mbuf2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 103228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/member_i1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 2270 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/member_i2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 7324 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/member_i3.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 23186 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/member_i4.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 8766 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/member_i5.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 18366 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/member_i6.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 21584 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/member_i7.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 67930 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/memory-management.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 72894 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/memory-management2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 86066 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/mempool.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 35767 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/mldev_flow.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 23732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/multi_process_memory.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 99482 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/packet_distributor1.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 102867 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/packet_distributor2.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 8132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/pdcp_functional_overview.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 71898 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/pipe_prefetch_sm.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 46368 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/pkt_drop_probability.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 51088 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/pkt_flow_kni.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 93198 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/pkt_proc_pipeline_qos.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 35642 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/predictable_snat_1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 35646 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/predictable_snat_2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 56358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/prefetch_pipeline.png 00:46:58.266 -rw-r--r-- vagrant/vagrant 31629 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/rcu_general_info.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 9014 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/rib_internals.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 9615 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/rib_pic.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 24444 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-dequeue1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 23174 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-dequeue2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 22912 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-dequeue3.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 21264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-enqueue1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 23015 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-enqueue2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 
22750 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-enqueue3.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 28418 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-modulo1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 29784 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-modulo2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 26876 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-mp-enqueue1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 28524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-mp-enqueue2.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 35020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-mp-enqueue3.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 34890 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-mp-enqueue4.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 29446 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring-mp-enqueue5.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 12110 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/ring1.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 35826 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/rss_queue_assign.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 5597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/rte_flow_async_init.svg 00:46:58.266 -rw-r--r-- vagrant/vagrant 10851 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/rte_flow_async_usage.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 128054 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/rte_mtr_meter_chaining.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 26949 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/sched_hier_per_port.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 6662 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/stateful-op.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 7303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/stateless-op-shared.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 8267 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/stateless-op.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 95193 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/tbl24_tbl8.png 00:46:58.267 -rw-r--r-- vagrant/vagrant 114003 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/tbl24_tbl8_tbl8.png 00:46:58.267 -rw-r--r-- vagrant/vagrant 66727 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/turbo_tb_decode.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 89461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/turbo_tb_encode.svg 00:46:58.267 -rw-r--r-- vagrant/vagrant 251431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/img/vhost_net_arch.png 00:46:58.267 -rw-r--r-- vagrant/vagrant 6305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/argparse_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 3766 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/asan.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 69612 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/bbdev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 2543 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/bpf_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 8603 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/build-sdk-meson.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 1400 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/prog_guide/build_app.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 19146 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/cmdline.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 27153 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/compressdev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 47575 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/cryptodev_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 15914 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/dispatcher_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 5380 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/dmadev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 18172 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/efd_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 54036 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/env_abstraction_layer.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 15515 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/event_crypto_adapter.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 11057 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/event_dma_adapter.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 14063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/event_ethernet_rx_adapter.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 9831 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/event_ethernet_tx_adapter.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 12800 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/event_timer_adapter.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 17994 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/eventdev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 5338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/fib_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 8209 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/generic_receive_offload_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 10493 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/generic_segmentation_offload_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4270 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/glossary.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 9307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/gpudev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 21824 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/graph_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 20055 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/hash_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 1486 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/index.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 2454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/intro.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4956 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/ip_fragment_reassembly_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 10134 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/ipsec_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 24660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/link_bonding_poll_mode_drv_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4442 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/log_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 10500 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/lpm6_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 10692 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/lpm_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 1375 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/lto.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 11847 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/mbuf_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 22319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/member_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 7044 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/mempool_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 2941 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/meson_ut.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 10471 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/metrics_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 8491 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/mldev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 18189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/multi_proc_support.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 6472 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/overview.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 20509 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/packet_classif_access_ctrl.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4948 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/packet_distrib_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 93961 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/packet_framework.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 1734 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/pcapng_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 10013 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/pdcp_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/pdump_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/perf_opt_guidelines.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 33854 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/poll_mode_drv.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 9382 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/power_man.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 3727 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/profile_app.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 120115 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/qos_framework.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/rawdev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 13184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/rcu_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 6834 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/regexdev.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 3758 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/reorder_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 5868 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/rib_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 16147 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/ring_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 171294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/rte_flow.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 37462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/rte_security.rst 
00:46:58.267 -rw-r--r-- vagrant/vagrant 3084 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/service_cores.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 1635 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/source_org.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 3488 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/stack_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 32518 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/switch_representation.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 5430 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/telemetry_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/thread_safety_dpdk_functions.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 3994 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/timer_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 11581 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/toeplitz_hash_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 12539 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/trace_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 11576 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/traffic_management.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 6475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/traffic_metering_and_policing.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 19219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/vhost_lib.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 12241 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/prog_guide/writing_efficient_code.rst 00:46:58.267 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/faq/ 00:46:58.267 -rw-r--r-- vagrant/vagrant 11566 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/faq/faq.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 243 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/faq/index.rst 00:46:58.267 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rawdevs/ 00:46:58.267 -rw-r--r-- vagrant/vagrant 7101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rawdevs/cnxk_bphy.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 6269 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rawdevs/cnxk_gpio.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 2023 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rawdevs/dpaa2_cmdif.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 14845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rawdevs/ifpga.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 309 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rawdevs/index.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 6213 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rawdevs/ntb.rst 00:46:58.267 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/freebsd_gsg/ 00:46:58.267 -rw-r--r-- vagrant/vagrant 8950 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/freebsd_gsg/build_dpdk.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 4539 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/freebsd_gsg/build_sample_apps.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/freebsd_gsg/freebsd_eal_parameters.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/freebsd_gsg/index.rst 00:46:58.267 -rw-r--r-- vagrant/vagrant 5163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/freebsd_gsg/install_from_ports.rst 
00:46:58.267 -rw-r--r-- vagrant/vagrant 1048 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/freebsd_gsg/intro.rst 00:46:58.267 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/ 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/features/ 00:46:58.268 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/features/cn9k.ini 00:46:58.268 -rw-r--r-- vagrant/vagrant 1004 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/features/default.ini 00:46:58.268 -rw-r--r-- vagrant/vagrant 187 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/features/mlx5.ini 00:46:58.268 -rw-r--r-- vagrant/vagrant 1687 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/cn9k.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 1843 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/features_overview.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/index.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 1529 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/regexdevs/mlx5.rst 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/gpus/ 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/gpus/features/ 00:46:58.268 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/gpus/features/cuda.ini 00:46:58.268 -rw-r--r-- vagrant/vagrant 501 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/gpus/features/default.ini 00:46:58.268 -rw-r--r-- vagrant/vagrant 7351 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/gpus/cuda.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 259 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/gpus/index.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/gpus/overview.rst 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/ 00:46:58.268 -rw-r--r-- vagrant/vagrant 7478 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/deprecation.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 801 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/index.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 30651 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/known_issues.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 21327 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_16_04.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 14694 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_16_07.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 14826 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_16_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 18708 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_17_02.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 22951 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_17_05.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 17295 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_17_08.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 25008 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_17_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 17642 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_18_02.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 34708 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_18_05.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 14751 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_18_08.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 28229 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_18_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 18811 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_19_02.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 19094 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_19_05.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 20149 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_19_08.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 27387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_19_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 1006 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_1_8.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 16187 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_20_02.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 20469 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_20_05.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 17906 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_20_08.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 35182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_20_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 17602 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_21_02.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 20245 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_21_05.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 13291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_21_08.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 32473 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_21_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 17014 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_22_03.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 16216 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_22_07.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 29425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_22_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 15870 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_23_03.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 17898 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_23_07.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 24369 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_23_11.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 19097 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_24_03.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 2941 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_2_0.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 29451 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_2_1.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 21037 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/rel_notes/release_2_2.rst 00:46:58.268 -rw-r--r-- vagrant/vagrant 16635 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/conf.py 00:46:58.268 -rw-r--r-- vagrant/vagrant 299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/custom.css 00:46:58.268 -rw-r--r-- vagrant/vagrant 597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/index.rst 00:46:58.268 -rw-r--r-- 
vagrant/vagrant 920 2024-06-07 12:49 spdk-test_gen_spec/dpdk/doc/guides/meson.build 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/ 00:46:58.268 -rw-r--r-- vagrant/vagrant 123 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/meson.build 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/ 00:46:58.268 -rw-r--r-- vagrant/vagrant 28811 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_dev.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 3644 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_dev.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 25269 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_model.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 11145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_model.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 14587 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_ocm.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 2253 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_ocm.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 48012 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_ops.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 9205 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cn10k_ml_ops.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 661 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_dev.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 2508 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_dev.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 3270 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_io.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 1974 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_io.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 3172 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_model.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 4208 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_model.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 44742 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_ops.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 1924 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_ops.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_utils.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 318 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_utils.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 3594 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/cnxk_ml_xstats.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 2293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/meson.build 00:46:58.268 -rw-r--r-- vagrant/vagrant 4689 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_dev.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 879 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_dev.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 13507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_model.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 2263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_model.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 20148 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_ops.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 2613 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_ops.h 00:46:58.268 -rw-r--r-- 
vagrant/vagrant 2649 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_stubs.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 1660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/ml/cnxk/mvtvm_ml_stubs.h 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/ 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/turbo_sw/ 00:46:58.268 -rw-r--r-- vagrant/vagrant 56463 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/turbo_sw/bbdev_turbo_software.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 1219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/turbo_sw/meson.build 00:46:58.268 -rw-r--r-- vagrant/vagrant 312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/meson.build 00:46:58.268 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/ 00:46:58.268 -rw-r--r-- vagrant/vagrant 6912 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/acc100_pf_enum.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 6607 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/acc100_pmd.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 2740 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/acc100_vf_enum.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/acc_common.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 43582 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/acc_common.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 1221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/meson.build 00:46:58.268 -rw-r--r-- vagrant/vagrant 145166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/rte_acc100_pmd.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 1125 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/rte_acc_cfg.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 3000 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/rte_acc_common_cfg.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 159282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/rte_vrb_pmd.c 00:46:58.268 -rw-r--r-- vagrant/vagrant 91 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/version.map 00:46:58.268 -rw-r--r-- vagrant/vagrant 5160 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/vrb1_pf_enum.h 00:46:58.268 -rw-r--r-- vagrant/vagrant 3310 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/vrb1_vf_enum.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 5855 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/vrb2_pf_enum.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 5093 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/vrb2_vf_enum.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 1247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/vrb_cfg.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 13996 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/acc/vrb_pmd.h 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 10017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/agx100_pmd.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 9403 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/fpga_5gnr_fec.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 210 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/meson.build 00:46:58.269 -rw-r--r-- vagrant/vagrant 108033 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/rte_fpga_5gnr_fec.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1883 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/rte_pmd_fpga_5gnr_fec.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 3231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/vc_5gnr_pmd.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 102 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_5gnr_fec/version.map 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_lte_fec/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 76897 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_lte_fec/fpga_lte_fec.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_lte_fec/fpga_lte_fec.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 160 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_lte_fec/meson.build 00:46:58.269 -rw-r--r-- vagrant/vagrant 100 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/fpga_lte_fec/version.map 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/la12xx/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 29090 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/la12xx/bbdev_la12xx.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/la12xx/bbdev_la12xx.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 7680 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/la12xx/bbdev_la12xx_ipc.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 799 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/la12xx/bbdev_la12xx_pmd_logs.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 131 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/la12xx/meson.build 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/null/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 9204 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/null/bbdev_null.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 155 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/baseband/null/meson.build 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_xdp/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_xdp/af_xdp_deps.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 2198 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_xdp/compat.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 2486 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_xdp/meson.build 00:46:58.269 -rw-r--r-- vagrant/vagrant 62510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_xdp/rte_eth_af_xdp.c 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 4640 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/clip_tbl.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1014 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/clip_tbl.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 4494 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 5458 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_compat.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 49919 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_ethdev.c 
00:46:58.269 -rw-r--r-- vagrant/vagrant 38792 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_filter.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 8672 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_filter.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 38042 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_flow.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1213 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_flow.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 64376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_main.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 2379 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_ofld.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 3193 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbe_pfvf.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 6185 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbevf_ethdev.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 8279 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/cxgbevf_main.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 5604 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/l2t.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1876 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/l2t.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 534 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/meson.build 00:46:58.269 -rw-r--r-- vagrant/vagrant 7600 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/mps_tcam.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/mps_tcam.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 73443 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/sge.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 5804 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/smt.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 894 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/smt.h 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 24604 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/adapter.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 20705 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/common.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 1619 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_chip_type.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 155011 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_hw.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 3922 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_hw.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 13872 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_msg.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 7866 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_pci_id_tbl.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 31939 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_regs.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 4376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_regs_values.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 1168 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4_tcb.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 75162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4fw_interface.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 24163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4vf_hw.c 00:46:58.269 -rw-r--r-- 
vagrant/vagrant 399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cxgbe/base/t4vf_hw.h 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 32022 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_ethdev.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 11870 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_ethdev.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_logs.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 6060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_rss.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 2981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_rss.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 12574 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_rx.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 10740 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_rx_dqo.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 18388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_tx.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 11048 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_tx_dqo.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_version.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/gve_version.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 423 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/meson.build 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 1450 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/gve.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 29792 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/gve_adminq.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 12161 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/gve_adminq.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 4559 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/gve_desc.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 5875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/gve_desc_dqo.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 4103 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/gve_osdep.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 657 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/gve/base/gve_register.h 00:46:58.269 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ 00:46:58.269 -rw-r--r-- vagrant/vagrant 7537 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_82599_bypass.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 9222 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_bypass.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 1341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_bypass.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 7097 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_bypass_api.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 3485 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_bypass_defines.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 241872 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_ethdev.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 26313 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_ethdev.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 45362 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_fdir.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 99821 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_flow.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 21316 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_ipsec.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 3192 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_ipsec.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 1161 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_logs.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 26769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_pf.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 4265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_recycle_mbufs_vec_common.c 00:46:58.269 -rw-r--r-- vagrant/vagrant 10951 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_regs.h 00:46:58.269 -rw-r--r-- vagrant/vagrant 182163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_rxtx.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 11942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_rxtx.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 7298 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_rxtx_vec_common.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 20407 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_rxtx_vec_neon.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 25888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_rxtx_vec_sse.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 33921 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_testpmd.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 28515 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_tm.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 6974 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/ixgbe_vf_representor.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/meson.build 00:46:58.270 -rw-r--r-- vagrant/vagrant 25273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/rte_pmd_ixgbe.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 20561 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/rte_pmd_ixgbe.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 1300 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/version.map 00:46:58.270 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ 00:46:58.270 -rw-r--r-- vagrant/vagrant 1169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/README 00:46:58.270 -rw-r--r-- vagrant/vagrant 40201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_82598.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 1099 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_82598.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 77659 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_82599.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 1713 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_82599.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 48931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_api.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 9405 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_api.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 150521 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_common.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 7838 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_common.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 20047 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_dcb.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 4718 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_dcb.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 9498 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_dcb_82598.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 2569 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_dcb_82598.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 16480 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_dcb_82599.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 4139 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_dcb_82599.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 5990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_hv_vf.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 243 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_hv_vf.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 18566 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_mbx.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 5970 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_mbx.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 4425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_osdep.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 73735 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_phy.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 7568 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_phy.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 174716 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_type.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 20992 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_vf.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 3940 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_vf.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 29746 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_x540.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 1588 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_x540.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 131135 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_x550.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 4732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/ixgbe_x550.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 836 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ixgbe/base/meson.build 00:46:58.270 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ 00:46:58.270 -rw-r--r-- vagrant/vagrant 407 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/meson.build 00:46:58.270 -rw-r--r-- vagrant/vagrant 86245 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_ethdev.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 10648 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_ethdev.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 1221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_logs.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 19587 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_pf.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 8297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_ptypes.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 7066 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_ptypes.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 1082 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_regs_group.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 89553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_rxtx.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 15019 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/ngbe_rxtx.h 00:46:58.270 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ 00:46:58.270 -rw-r--r-- vagrant/vagrant 7265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_mbx.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 3606 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_mbx.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 10153 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_mng.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 2544 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_mng.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 5495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_osdep.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 10481 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 1944 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 9649 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy_mvl.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 3825 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy_mvl.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 10358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy_rtl.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 3115 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy_rtl.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 16072 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy_yt.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 3697 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_phy_yt.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 58107 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_regs.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 3147 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_status.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 12488 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_type.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 544 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/meson.build 00:46:58.270 -rw-r--r-- vagrant/vagrant 302 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 3158 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_devids.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 10573 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_dummy.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 7055 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_eeprom.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 932 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_eeprom.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 55531 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_hw.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 3446 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ngbe/base/ngbe_hw.h 00:46:58.270 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/ 00:46:58.270 -rw-r--r-- vagrant/vagrant 1196 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/meson.build 00:46:58.270 -rw-r--r-- vagrant/vagrant 68883 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/rte_eth_tap.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 4118 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/rte_eth_tap.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 2483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_bpf.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 4182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_bpf_api.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 74690 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_bpf_insns.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 60532 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_flow.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 1968 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_flow.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 2888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_intr.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_log.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 9958 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_netlink.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 1257 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_netlink.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 1063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_rss.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 7086 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_tcmsgs.c 00:46:58.270 -rw-r--r-- vagrant/vagrant 1108 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/tap_tcmsgs.h 00:46:58.270 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/bpf/ 00:46:58.270 -rw-r--r-- vagrant/vagrant 464 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/bpf/Makefile 00:46:58.270 -rw-r--r-- vagrant/vagrant 7483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/bpf/bpf_api.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 1287 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/bpf/bpf_elf.h 00:46:58.270 -rw-r--r-- vagrant/vagrant 2553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/bpf/bpf_extract.py 00:46:58.270 -rw-r--r-- vagrant/vagrant 6654 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/tap/bpf/tap_bpf_program.c 00:46:58.270 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ 00:46:58.270 -rw-r--r-- vagrant/vagrant 1842 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ddm.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2985 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ddm.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 27040 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ethdev.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 16424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ethdev_rx.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 1032 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ethdev_rx.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 11408 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ethdev_tx.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 783 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ethdev_tx.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 7954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_ext.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 4196 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_global.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 779 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_logs.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 2627 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_mpu.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_mpu.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 11187 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_pktchkr.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2775 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_pktchkr.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 1176 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_pktdir.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 1020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_pktdir.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 11769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_pktgen.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2726 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_pktgen.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 2669 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_udm.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 3211 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/ark_udm.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ark/meson.build 00:46:58.271 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/ 00:46:58.271 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/ 00:46:58.271 -rw-r--r-- vagrant/vagrant 1532 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/dpaa_integration.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 15350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/fm_ext.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 12245 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/fm_lib.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 203764 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/fm_pcd_ext.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 111666 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/fm_port_ext.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 3348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/fm_vsp.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 4175 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/fm_vsp_ext.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 4879 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/ncsw_ext.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 16202 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/fmlib/net_ext.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 62333 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/dpaa_ethdev.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 6749 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/dpaa_ethdev.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 27584 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/dpaa_flow.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 687 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/dpaa_flow.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 13207 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/dpaa_fmc.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 34183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/dpaa_rxtx.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 9980 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/dpaa_rxtx.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/meson.build 
00:46:58.271 -rw-r--r-- vagrant/vagrant 654 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/rte_pmd_dpaa.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa/version.map 00:46:58.271 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/ 00:46:58.271 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/ 00:46:58.271 -rw-r--r-- vagrant/vagrant 5846 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_compat.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 4338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_csr.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 27441 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_api_cmd.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 8335 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_api_cmd.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 6776 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_cfg.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 3175 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_cfg.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 12210 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_cmd.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 23110 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_cmdq.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 4186 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_cmdq.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 12445 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_eqs.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2036 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_eqs.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 44881 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_hwdev.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 9456 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_hwdev.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 14575 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_hwif.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 3189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_hwif.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 26995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_mbox.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2143 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_mbox.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 21928 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_mgmt.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 3164 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_mgmt.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 58775 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_niccfg.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 20094 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_niccfg.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 22579 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_nicio.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 7029 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_nicio.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 4059 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_wq.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 3913 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/hinic/base/hinic_pmd_wq.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/base/meson.build 00:46:58.271 -rw-r--r-- vagrant/vagrant 89537 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/hinic_pmd_ethdev.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 8714 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/hinic_pmd_ethdev.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 96465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/hinic_pmd_flow.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 28259 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/hinic_pmd_rx.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2753 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/hinic_pmd_rx.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 38145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/hinic_pmd_tx.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 2864 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/hinic_pmd_tx.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hinic/meson.build 00:46:58.271 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/ 00:46:58.271 -rw-r--r-- vagrant/vagrant 10145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/gdma.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 37440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/mana.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 15601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/mana.h 00:46:58.271 -rw-r--r-- vagrant/vagrant 1311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/meson.build 00:46:58.271 -rw-r--r-- vagrant/vagrant 7510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/mp.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 8098 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/mr.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 16448 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/rx.c 00:46:58.271 -rw-r--r-- vagrant/vagrant 13285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mana/tx.c 00:46:58.271 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/null/ 00:46:58.271 -rw-r--r-- vagrant/vagrant 142 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/null/meson.build 00:46:58.271 -rw-r--r-- vagrant/vagrant 19203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/null/rte_eth_null.c 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/ 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/ 00:46:58.272 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/meson.build 00:46:58.272 -rw-r--r-- vagrant/vagrant 700 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_bsvf.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 853 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_bsvf.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 25141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_hw.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 6394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_hw.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 28477 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_hw_defs.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 13054 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_mbox.c 
00:46:58.272 -rw-r--r-- vagrant/vagrant 7449 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_mbox.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 2162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/base/nicvf_plat.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 574 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/meson.build 00:46:58.272 -rw-r--r-- vagrant/vagrant 64715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_ethdev.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 3539 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_ethdev.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 1209 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_logs.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 20443 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_rxtx.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 3406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_rxtx.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 2602 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_struct.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 729 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_svf.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 592 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/thunderx/nicvf_svf.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 1285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/meson.build 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/ 00:46:58.272 -rw-r--r-- vagrant/vagrant 2338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_common.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 50561 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_ethdev.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 3749 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_ethdev.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 1132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_hw_regs.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 1406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_hw_regs.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_logs.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 30343 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_rxtx.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 4495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/atl_types.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/meson.build 00:46:58.272 -rw-r--r-- vagrant/vagrant 1943 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/rte_pmd_atlantic.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 3791 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/rte_pmd_atlantic.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 260 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/version.map 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/ 00:46:58.272 -rw-r--r-- vagrant/vagrant 13862 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_b0.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 1458 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_b0.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 4505 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_b0_internal.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 45613 2024-06-07 
12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_llh.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 23772 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_llh.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 96896 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_llh_internal.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 22811 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_utils.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 14355 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_utils.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 18491 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/atlantic/hw_atl/hw_atl_utils_fw2x.c 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/ 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/base/ 00:46:58.272 -rw-r--r-- vagrant/vagrant 13517 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/base/dpaa2_hw_dpni.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 11076 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/base/dpaa2_hw_dpni_annot.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 3191 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/base/dpaa2_tlu_hash.c 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/ 00:46:58.272 -rw-r--r-- vagrant/vagrant 33072 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/dpdmux.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 2123 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/dpkg.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 96621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/dpni.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 16322 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/dprtc.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 15525 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_dpdmux.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 6050 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_dpdmux_cmd.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 5379 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_dpkg.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 61431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_dpni.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 21234 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_dpni_cmd.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 2833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_dprtc.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 2921 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_dprtc_cmd.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 21954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/mc/fsl_net.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 81178 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_ethdev.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 9180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_ethdev.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 107987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_flow.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 10058 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_mux.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 1265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_pmd_logs.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 3819 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_ptp.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 20927 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_recycle.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 59521 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_rxtx.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 11140 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_sparser.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 6220 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_sparser.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 28632 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_tm.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 883 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/dpaa2_tm.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/meson.build 00:46:58.272 -rw-r--r-- vagrant/vagrant 2492 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/rte_pmd_dpaa2.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/dpaa2/version.map 00:46:58.272 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/ 00:46:58.272 -rw-r--r-- vagrant/vagrant 22667 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_cmd.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 29614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_cmd.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 28174 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_common.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 2186 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_common.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 44417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_dcb.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 5521 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_dcb.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 28378 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_dump.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 523 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_dump.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 179493 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_ethdev.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 32279 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_ethdev.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 65316 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_ethdev_vf.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 30790 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_fdir.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 4929 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_fdir.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 77939 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_flow.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 1823 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_flow.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 73588 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_intr.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 5878 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_intr.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 1598 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_logs.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 14524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_mbx.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 6642 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_mbx.h 00:46:58.272 
-rw-r--r-- vagrant/vagrant 6982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_mp.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 695 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_mp.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 7390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_ptp.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 1873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_ptp.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 14007 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_regs.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 4516 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_regs.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 33042 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rss.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 6144 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rss.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 135314 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rxtx.c 00:46:58.272 -rw-r--r-- vagrant/vagrant 27232 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rxtx.h 00:46:58.272 -rw-r--r-- vagrant/vagrant 5298 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rxtx_vec.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 3938 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rxtx_vec.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 8984 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rxtx_vec_neon.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 10263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_rxtx_vec_sve.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 44973 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_stats.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 5622 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_stats.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 38618 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_tm.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 3215 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/hns3_tm.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 1904 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/hns3/meson.build 00:46:58.273 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/memif/ 00:46:58.273 -rw-r--r-- vagrant/vagrant 5083 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/memif/memif.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 29497 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/memif/memif_socket.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 2851 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/memif/memif_socket.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/memif/meson.build 00:46:58.273 -rw-r--r-- vagrant/vagrant 55357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/memif/rte_eth_memif.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 6330 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/memif/rte_eth_memif.h 00:46:58.273 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/ 00:46:58.273 -rw-r--r-- vagrant/vagrant 4404 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_rx.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 5266 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_rx.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 4130 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_rx_avx.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 5058 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_rx_neon.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 4481 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_rx_sse.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 6743 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_tx.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 12530 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_vf.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 7276 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/cnxk_ep_vf.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 1202 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/meson.build 00:46:58.273 -rw-r--r-- vagrant/vagrant 17296 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx2_ep_vf.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 6522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx2_ep_vf.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 15887 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_common.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 24810 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_ethdev.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 8960 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_mbox.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 4036 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_mbox.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 24194 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_rxtx.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 2166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_rxtx.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 11623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_vf.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 4829 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeon_ep/otx_ep_vf.h 00:46:58.273 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/ 00:46:58.273 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/ 00:46:58.273 -rw-r--r-- vagrant/vagrant 621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/meson.build 00:46:58.273 -rw-r--r-- vagrant/vagrant 356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 10518 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_dcb.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 3691 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_dcb.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 7763 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_dcb_hw.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 848 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_dcb_hw.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 1611 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_devids.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 20469 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_dummy.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 10730 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_eeprom.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 2659 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_eeprom.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 102449 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_hw.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 4942 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_hw.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 15727 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_mbx.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 4682 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_mbx.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 11880 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_mng.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 4856 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_mng.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 5615 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_osdep.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 82400 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_phy.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 22730 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_phy.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 83272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_regs.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 3875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_status.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 23247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_type.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 17724 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_vf.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 1807 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/base/txgbe_vf.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/meson.build 00:46:58.273 -rw-r--r-- vagrant/vagrant 1061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/rte_pmd_txgbe.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 151058 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_ethdev.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 22742 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_ethdev.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 37038 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_ethdev_vf.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 26763 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_fdir.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 90899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_flow.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 20282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_ipsec.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 2215 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_ipsec.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 1782 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_logs.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 23842 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_pf.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 26081 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_ptypes.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 13576 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_ptypes.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 1099 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_regs_group.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 141061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_rxtx.c 00:46:58.273 -rw-r--r-- vagrant/vagrant 17849 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_rxtx.h 00:46:58.273 -rw-r--r-- vagrant/vagrant 27046 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/txgbe/txgbe_tm.c 
00:46:58.273 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/avp/
00:46:58.273 -rw-r--r-- vagrant/vagrant 63676 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/avp/avp_ethdev.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 815 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/avp/avp_logs.h
00:46:58.273 -rw-r--r-- vagrant/vagrant 244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/avp/meson.build
00:46:58.273 -rw-r--r-- vagrant/vagrant 11061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/avp/rte_avp_common.h
00:46:58.273 -rw-r--r-- vagrant/vagrant 2522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/avp/rte_avp_fifo.h
00:46:58.273 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/
00:46:58.273 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/
00:46:58.273 -rw-r--r-- vagrant/vagrant 1284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/README
00:46:58.273 -rw-r--r-- vagrant/vagrant 42162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_80003es2lan.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 2556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_80003es2lan.h
00:46:58.273 -rw-r--r-- vagrant/vagrant 18965 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82540.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 34380 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82541.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 1980 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82541.h
00:46:58.273 -rw-r--r-- vagrant/vagrant 14510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82542.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 43080 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82543.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 772 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82543.h
00:46:58.273 -rw-r--r-- vagrant/vagrant 55864 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82571.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 1084 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82571.h
00:46:58.273 -rw-r--r-- vagrant/vagrant 97114 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82575.c
00:46:58.273 -rw-r--r-- vagrant/vagrant 16795 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_82575.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 37518 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_api.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 6186 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_api.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 5319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_base.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 4649 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_base.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 65778 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_defines.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 26390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_hw.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 25875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_i210.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 2685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_i210.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 175552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_ich8lan.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 12552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_ich8lan.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 67260 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_mac.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 3262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_mac.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 15055 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_manage.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 2484 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_manage.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 19328 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_mbx.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 3692 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_mbx.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 35248 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_nvm.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 2350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_nvm.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 843 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_osdep.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 4836 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_osdep.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 119634 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_phy.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 13584 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_phy.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 37513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_regs.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 15511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_vf.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 7288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/e1000_vf.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 971 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/base/meson.build
00:46:58.274 -rw-r--r-- vagrant/vagrant 18630 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/e1000_ethdev.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/e1000_logs.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 1273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/e1000_logs.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 52877 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/em_ethdev.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 59788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/em_rxtx.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 155762 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/igb_ethdev.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 53264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/igb_flow.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 13855 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/igb_pf.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 5451 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/igb_regs.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 87461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/igb_rxtx.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 330 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/e1000/meson.build
00:46:58.274 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/
00:46:58.274 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/
00:46:58.274 -rw-r--r-- vagrant/vagrant 1072 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/README
00:46:58.274 -rw-r--r-- vagrant/vagrant 35139 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_adminq.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 4019 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_adminq.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 92008 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_adminq_cmd.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 1184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_alloc.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 255789 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_common.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 40528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_dcb.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 6299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_dcb.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 2126 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_devids.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 4220 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_diag.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 895 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_diag.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 10219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_hmc.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 7295 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_hmc.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 41228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_lan_hmc.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 4918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_lan_hmc.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 53135 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_nvm.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 6662 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_osdep.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 31151 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_prototype.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 404706 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_register.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 2497 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_status.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 63431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/i40e_type.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 779 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/meson.build
00:46:58.274 -rw-r--r-- vagrant/vagrant 26359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/base/virtchnl.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 364840 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_ethdev.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 56123 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_ethdev.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 60341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_fdir.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 120539 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_flow.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 40319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_hash.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 820 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_hash.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 1149 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_logs.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 44623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_pf.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 1122 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_pf.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 4350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_recycle_mbufs_vec_common.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 53877 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_regs.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 103649 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 32510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 6867 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx_common_avx.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 19490 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx_vec_altivec.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 31049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx_vec_avx2.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 33247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx_vec_avx512.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 6671 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx_vec_common.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 25078 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx_vec_neon.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 26663 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_rxtx_vec_sse.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 81654 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_testpmd.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 27925 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_tm.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 14951 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/i40e_vf_representor.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 2570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/meson.build
00:46:58.274 -rw-r--r-- vagrant/vagrant 82154 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/rte_pmd_i40e.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 30022 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/rte_pmd_i40e.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 1406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/i40e/version.map
00:46:58.274 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/
00:46:58.274 -rw-r--r-- vagrant/vagrant 4135 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/meson.build
00:46:58.274 -rw-r--r-- vagrant/vagrant 36874 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 8708 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 23620 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_ethdev.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 43952 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_flow.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 1937 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_flow.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 6659 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_glue.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 3312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_glue.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 9772 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_intr.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 9832 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_mp.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 41338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_mr.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 3847 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_mr.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 4666 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_prm.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 24040 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_rxq.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 40433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_rxtx.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 8737 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_rxtx.h
00:46:58.274 -rw-r--r-- vagrant/vagrant 13897 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_txq.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 4767 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_utils.c
00:46:58.274 -rw-r--r-- vagrant/vagrant 3181 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx4/mlx4_utils.h
00:46:58.274 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/
00:46:58.275 -rw-r--r-- vagrant/vagrant 440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/meson.build
00:46:58.275 -rw-r--r-- vagrant/vagrant 48159 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/octeontx_ethdev.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 5100 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/octeontx_ethdev.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 7892 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/octeontx_ethdev_ops.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 1170 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/octeontx_logs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 1812 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/octeontx_rxtx.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 16304 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/octeontx_rxtx.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 1139 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/octeontx_stats.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 60 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/version.map
00:46:58.275 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/
00:46:58.275 -rw-r--r-- vagrant/vagrant 564 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/meson.build
00:46:58.275 -rw-r--r-- vagrant/vagrant 9062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_bgx.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 5531 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_bgx.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 3336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_io.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 5661 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_pki_var.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 5005 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_pkivf.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 8536 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_pkivf.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 13224 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_pkovf.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 2664 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/octeontx/base/octeontx_pkovf.h
00:46:58.275 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vdev_netvsc/
00:46:58.275 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vdev_netvsc/meson.build
00:46:58.275 -rw-r--r-- vagrant/vagrant 22475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vdev_netvsc/vdev_netvsc.c
00:46:58.275 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/
00:46:58.275 -rw-r--r-- vagrant/vagrant 58287 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_common.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 35533 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_dev.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 73107 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_ethdev.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 18685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_ethdev.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 8125 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_i2c.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 697 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_logs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 31549 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_mdio.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 7687 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_phy.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 61489 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_phy_impl.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 7626 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_regs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 32624 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_rxtx.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 6358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_rxtx.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 2725 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/axgbe_rxtx_vec_sse.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 463 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/axgbe/meson.build
00:46:58.275 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/
00:46:58.275 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/
00:46:58.275 -rw-r--r-- vagrant/vagrant 101830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_com.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 38793 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_com.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 19796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_eth_com.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 6988 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_eth_com.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 227 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_plat.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 11915 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_plat_dpdk.h
00:46:58.275 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_defs/
00:46:58.275 -rw-r--r-- vagrant/vagrant 61724 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_defs/ena_admin_defs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 484 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_defs/ena_common_defs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 38619 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_defs/ena_eth_io_defs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_defs/ena_gen_info.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_defs/ena_includes.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 7116 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/base/ena_defs/ena_regs_defs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 123839 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/ena_ethdev.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 9775 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/ena_ethdev.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 1019 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/ena_logs.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 463 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/ena_platform.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 16333 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/ena_rss.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ena/meson.build
00:46:58.275 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/
00:46:58.275 -rw-r--r-- vagrant/vagrant 17414 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 86067 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_ethdev.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 50952 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_fdir.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 21145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_fsub.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 60134 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_generic_flow.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 23063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_generic_flow.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 55688 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_hash.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 55889 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_ipsec_crypto.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 4427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_ipsec_crypto.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 6954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_ipsec_crypto_capabilities.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 1198 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_log.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 147972 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_rxtx.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 28863 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_rxtx.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 63629 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_rxtx_vec_avx2.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 84398 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_rxtx_vec_avx512.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 16531 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_rxtx_vec_common.h
00:46:58.275 -rw-r--r-- vagrant/vagrant 13865 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_rxtx_vec_neon.c
00:46:58.275 -rw-r--r-- vagrant/vagrant 49295 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_rxtx_vec_sse.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 2183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_testpmd.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 26629 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_tm.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 62448 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/iavf_vchnl.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 2198 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/meson.build
00:46:58.276 -rw-r--r-- vagrant/vagrant 6717 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/rte_pmd_iavf.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 453 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/iavf/version.map
00:46:58.276 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/
00:46:58.276 -rw-r--r-- vagrant/vagrant 2059 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/meson.build
00:46:58.276 -rw-r--r-- vagrant/vagrant 103092 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 96693 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 6844 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_defs.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 49353 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_devx.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 546 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_devx.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 21487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_ethdev.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 355388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 116727 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 57978 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_aso.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 618358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_dv.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 45390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_flex.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 28533 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_geneve.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 442016 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_hw.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 97601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_meter.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 22908 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_quota.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 63291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_flow_verbs.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 37467 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_hws_cnt.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 22996 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_hws_cnt.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 5397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_mac.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 5551 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rss.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 44316 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rx.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 23815 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rx.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 3556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxmode.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 91813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxq.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 13715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1663 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 17332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx_vec.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 2514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx_vec.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 48078 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx_vec_altivec.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 32942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx_vec_neon.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx_vec_null.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 31377 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_rxtx_vec_sse.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 8620 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_stats.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 39005 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_testpmd.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 632 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_testpmd.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 673 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_trace.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_trace.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 50606 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_trigger.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 22340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_tx.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 121255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_tx.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 2271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_tx_empw.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 988 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_tx_mpw.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1824 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_tx_nompw.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_tx_txpp.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 35530 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_txpp.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 39599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_txq.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 32811 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_utils.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 19621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_utils.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 4198 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/mlx5_vlan.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 11529 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/rte_pmd_mlx5.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/version.map
00:46:58.276 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/
00:46:58.276 -rw-r--r-- vagrant/vagrant 591 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/meson.build
00:46:58.276 -rw-r--r-- vagrant/vagrant 29583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 120357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_action.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 9412 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_action.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 3953 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_buddy.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 566 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_buddy.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 44972 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_cmd.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 8959 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_cmd.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 6710 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_context.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1970 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_context.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 6789 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_crc32.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_crc32.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 15835 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_debug.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1193 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_debug.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 118510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_definer.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 20064 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_definer.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 2342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_internal.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 48285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_matcher.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 4588 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_matcher.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 13092 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_pat_arg.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 2656 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_pat_arg.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 16835 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_pool.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 4337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_pool.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 31448 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_rule.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1972 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_rule.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 28545 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_send.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 6988 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_send.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 16144 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_table.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1884 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/hws/mlx5dr_table.h
00:46:58.276 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/
00:46:58.276 -rw-r--r-- vagrant/vagrant 316 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/meson.build
00:46:58.276 -rw-r--r-- vagrant/vagrant 48852 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_ethdev_os.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 2685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_flow_os.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 12818 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_flow_os.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 9823 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_mp_os.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 93990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_os.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 1323 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_os.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 5477 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_socket.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 34141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_verbs.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 501 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_verbs.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 4272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/linux/mlx5_vlan_os.c
00:46:58.276 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/tools/
00:46:58.276 -rwxr-xr-x vagrant/vagrant 8385 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/tools/mlx5_trace.py
00:46:58.276 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/
00:46:58.276 -rw-r--r-- vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/meson.build
00:46:58.276 -rw-r--r-- vagrant/vagrant 10285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/mlx5_ethdev_os.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 11308 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/mlx5_flow_os.c
00:46:58.276 -rw-r--r-- vagrant/vagrant 10743 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/mlx5_flow_os.h
00:46:58.276 -rw-r--r-- vagrant/vagrant 1552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/mlx5_mp_os.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 24287 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/mlx5_os.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 693 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/mlx5_os.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 1662 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mlx5/windows/mlx5_vlan_os.c
00:46:58.277 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pcap/
00:46:58.277 -rw-r--r-- vagrant/vagrant 410 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pcap/meson.build
00:46:58.277 -rw-r--r-- vagrant/vagrant 41622 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pcap/pcap_ethdev.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 438 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pcap/pcap_osdep.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 1061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pcap/pcap_osdep_freebsd.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 801 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pcap/pcap_osdep_linux.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 2836 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pcap/pcap_osdep_windows.c
00:46:58.277 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vhost/
00:46:58.277 -rw-r--r-- vagrant/vagrant 262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vhost/meson.build
00:46:58.277 -rw-r--r-- vagrant/vagrant 43075 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vhost/rte_eth_vhost.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 1251 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vhost/rte_eth_vhost.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vhost/version.map
00:46:58.277 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/
00:46:58.277 -rw-r--r-- vagrant/vagrant 324661 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 68755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 23580 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_ethdev.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 1939 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_ethdev.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 1661 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_logs.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 664 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_osal.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 13723 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_rxtx.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 4115 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_rxtx.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 50281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_stats.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 22112 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_stats.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 20318 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_vfpf.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 8212 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/bnx2x_vfpf.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 16849 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_fw_defs.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 226927 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_hsi.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 23918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_init.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 25549 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_init_ops.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 7185 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_mfw_req.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 293363 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_reg.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 154971 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_sp.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 56798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/ecore_sp.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 483244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/elink.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 24509 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/elink.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 541 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnx2x/meson.build
00:46:58.277 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/
00:46:58.277 -rw-r--r-- vagrant/vagrant 2884 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/enetc.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 25537 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/enetc_ethdev.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 1330 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/enetc_logs.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 11088 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/enetc_rxtx.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/meson.build
00:46:58.277 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/base/
00:46:58.277 -rw-r--r-- vagrant/vagrant 8694 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetc/base/enetc_hw.h
00:46:58.277 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/
00:46:58.277 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/
00:46:58.277 -rw-r--r-- vagrant/vagrant 588 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/README
00:46:58.277 -rw-r--r-- vagrant/vagrant 18057 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_acl.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 7079 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_acl.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 35278 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_acl_ctrl.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 112102 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_adminq_cmd.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 371 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_alloc.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 14932 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_bitops.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 9399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_bst_tcam.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 895 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_bst_tcam.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 2544 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_cgu_regs.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 180071 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_common.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 12022 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_common.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 35995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_controlq.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 2729 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_controlq.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 44611 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_dcb.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 7299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_dcb.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 69050 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_ddp.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 13452 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_ddp.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 1217 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_defs.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 3286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_devids.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 188523 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_fdir.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 9543 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_fdir.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 120616 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_flex_pipe.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 3305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_flex_pipe.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 20331 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_flex_type.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 1889 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_flg_rd.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 437 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_flg_rd.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 150575 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_flow.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 20954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_flow.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 503830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_hw_autogen.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 8459 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_imem.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 1950 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_imem.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 70198 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_lan_tx_rx.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 5286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_metainit.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 800 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_metainit.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 1363 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_mk_grp.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_mk_grp.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 41576 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_nvm.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 4021 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_nvm.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 10915 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_osdep.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 15425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_parser.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 4366 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_parser.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 21808 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_parser_rt.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 1399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_parser_rt.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 922 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_parser_util.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 11205 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_pg_cam.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 1657 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_pg_cam.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 2878 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_proto_grp.c
00:46:58.277 -rw-r--r-- vagrant/vagrant 594 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_proto_grp.h
00:46:58.277 -rw-r--r-- vagrant/vagrant 10421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_protocol_type.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 8129 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_ptp_consts.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 147415 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_ptp_hw.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 19618 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_ptp_hw.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 2007 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_ptype_mk.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_ptype_mk.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_sbq_cmd.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 167429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_sched.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 9652 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_sched.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1218 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_status.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 295268 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_switch.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 18625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_switch.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 751 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_tmatch.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 49345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_type.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 12970 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_vlan_mode.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 366 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_vlan_mode.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 6382 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_xlt_kb.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 860 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/ice_xlt_kb.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/base/meson.build
00:46:58.278 -rw-r--r-- vagrant/vagrant 27757 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_acl_filter.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 36203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_dcf.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 5017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_dcf.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 56956 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_dcf_ethdev.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 2444 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_dcf_ethdev.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 13190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_dcf_parent.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 24506 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_dcf_sched.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 13374 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_dcf_vf_representor.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 24876 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_diagnose.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 185788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_ethdev.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 23129 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_ethdev.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 70566 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_fdir_filter.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 67076 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_generic_flow.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 22023 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_generic_flow.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 44761 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_hash.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 1135 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_logs.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 139302 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_rxtx.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 14481 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_rxtx.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 7409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_rxtx_common_avx.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 32570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_rxtx_vec_avx2.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 36628 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_rxtx_vec_avx512.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 10516 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_rxtx_vec_common.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 25886 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_rxtx_vec_sse.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 63484 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_switch_filter.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 6326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_testpmd.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 24085 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/ice_tm.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 2231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/meson.build
00:46:58.278 -rw-r--r-- vagrant/vagrant 189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ice/version.map
00:46:58.278 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvneta/
00:46:58.278 -rw-r--r-- vagrant/vagrant 538 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvneta/meson.build
00:46:58.278 -rw-r--r-- vagrant/vagrant 22455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvneta/mvneta_ethdev.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 2392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvneta/mvneta_ethdev.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 24739 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvneta/mvneta_rxtx.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 1203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvneta/mvneta_rxtx.h
00:46:58.278 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/
00:46:58.278 -rw-r--r-- vagrant/vagrant 401 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/meson.build
00:46:58.278 -rw-r--r-- vagrant/vagrant 1634 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_eth.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 27729 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_ethdev.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 16125 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_hal.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 21256 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_hif.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 3361 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_hif.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 13898 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_hif_lib.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 5370 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_hif_lib.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 881 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_logs.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1527 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/pfe_mod.h
00:46:58.278 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/
00:46:58.278 -rw-r--r-- vagrant/vagrant 2132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 12387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/pfe.h
00:46:58.278 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/
00:46:58.278 -rw-r--r-- vagrant/vagrant 946 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/bmu.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 11614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/class_csr.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 10117 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/emac_mtip.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1914 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/gpi.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 2680 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/hif.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1742 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/hif_nocpy.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 6526 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/tmu_csr.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1701 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/pfe/base/cbus/util_csr.h
00:46:58.278 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/
00:46:58.278 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/
00:46:58.278 -rw-r--r-- vagrant/vagrant 3493 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/vhost.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 15522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/vhost_kernel.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 4332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/vhost_kernel_tap.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 1481 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/vhost_kernel_tap.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 23530 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/vhost_user.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 17639 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/vhost_vdpa.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 35570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/virtio_user_dev.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 2873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user/virtio_user_dev.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 2215 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/meson.build
00:46:58.278 -rw-r--r-- vagrant/vagrant 1708 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 9553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 6134 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_cvq.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 4136 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_cvq.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 75332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_ethdev.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 3827 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_ethdev.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1078 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_logs.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 18319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_pci.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 4620 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_pci.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 5609 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_pci_ethdev.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 5422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_ring.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 52166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 1113 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 3147 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_packed.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 9336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_packed.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 7097 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_packed_avx.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 9197 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_packed_neon.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 1411 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_simple.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 1529 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_simple.h
00:46:58.278 -rw-r--r-- vagrant/vagrant 6550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_simple_altivec.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 6678 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_simple_neon.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 5981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_rxtx_simple_sse.c
00:46:58.278 -rw-r--r-- vagrant/vagrant 21500 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtio_user_ethdev.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 12558 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtqueue.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 24408 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/virtio/virtqueue.h
00:46:58.279 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/
00:46:58.279 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/
00:46:58.279 -rw-r--r-- vagrant/vagrant 1831 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/bnxt_tf_common.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 15487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/bnxt_tf_pmd_shim.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 2402 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/bnxt_tf_pmd_shim.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 69677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/bnxt_ulp.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 12098 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/bnxt_ulp.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 26990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/bnxt_ulp_flow.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 24238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/bnxt_ulp_meter.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 741 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/meson.build
00:46:58.279 -rw-r--r-- vagrant/vagrant 15732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_def_rules.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 20843 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_fc_mgr.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 4793 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_fc_mgr.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 51774 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_flow_db.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 11348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_flow_db.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 10344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_gen_hash.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 4429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_gen_hash.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 12038 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_gen_tbl.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 4659 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_gen_tbl.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 18934 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_ha_mgr.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 1409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_ha_mgr.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 126186 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_mapper.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 3788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_mapper.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 8277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_mark_mgr.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 2736 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_mark_mgr.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 4018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_matcher.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 739 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_matcher.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 20384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_port_db.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 9068 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_port_db.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 15475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_rte_handler_tbl.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 95093 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_rte_parser.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 10810 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_rte_parser.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 12827 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_template_struct.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 6462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_tun.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 1418 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_tun.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 26327 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_utils.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 13846 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/ulp_utils.h
00:46:58.279 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/
00:46:58.279 -rw-r--r-- vagrant/vagrant 426 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/meson.build
00:46:58.279 -rw-r--r-- vagrant/vagrant 176103 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_act.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 1244570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_class.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 140892 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_enum.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 77534 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_field.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 663976 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_tbl.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 3245 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_tbl.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 261450 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_thor_act.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 1105160 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_thor_class.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 189625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_wh_plus_act.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 430011 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_ulp/generic_templates/ulp_template_db_wh_plus_class.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 40095 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 12710 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_cpr.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 4199 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_cpr.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 175995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_ethdev.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 5180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_filter.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 7105 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_filter.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 64243 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_flow.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 219465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_hwrm.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 17055 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_hwrm.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 4310 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_irq.c
00:46:58.279 -rw-r--r-- vagrant/vagrant 537 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_irq.h
00:46:58.279 -rw-r--r-- vagrant/vagrant 1969 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_nvm_defs.h
00:46:58.279 -rw-r--r-- vagrant/vagrant
23397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_reps.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 2062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_reps.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 24854 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_ring.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 3805 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_ring.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 17146 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxq.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 2467 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxq.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 46218 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxr.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 18107 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxr.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 30397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxtx_vec_avx2.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 4723 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxtx_vec_common.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 14637 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxtx_vec_neon.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 22857 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_rxtx_vec_sse.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 43358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_stats.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_stats.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 5190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_txq.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 1500 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_txq.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 19740 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_txr.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 3528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_txr.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 786 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_util.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 492 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_util.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 38906 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_vnic.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 5831 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/bnxt_vnic.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 2239194 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hsi_struct_def_dpdk.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 1693 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/meson.build 00:46:58.279 -rw-r--r-- vagrant/vagrant 20391 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/rte_pmd_bnxt.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 8691 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/rte_pmd_bnxt.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 564 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/version.map 00:46:58.279 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/ 00:46:58.279 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/ 00:46:58.279 -rw-r--r-- vagrant/vagrant 3558 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/hcapi_cfa.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 3424 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/hcapi_cfa_common.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 9529 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/hcapi_cfa_defs.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 6464 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/hcapi_cfa_p4.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 5354 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/hcapi_cfa_p4.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 2447 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/hcapi_cfa_p58.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 4073 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/hcapi_cfa_p58.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/hcapi/cfa/meson.build 00:46:58.279 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/ 00:46:58.279 -rw-r--r-- vagrant/vagrant 8400 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/bitalloc.c 00:46:58.279 -rw-r--r-- vagrant/vagrant 3569 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/bitalloc.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 10415 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_resource_types.h 00:46:58.279 -rw-r--r-- vagrant/vagrant 59518 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 11325 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 2997 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_device.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 5455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_hwop_msg.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_hwop_msg.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 33537 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_p4.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_p4.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 33527 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_p58.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_p58.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 3724 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_sbmp.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 10128 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_session.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 1363 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/cfa_tcam_mgr_session.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 7899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/dpool.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 6448 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/dpool.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 1070 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/ll.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 1651 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/ll.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 6042 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/lookup3.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 1021 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/meson.build 00:46:58.280 -rw-r--r-- vagrant/vagrant 804 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/rand.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 532 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/rand.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 1750 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/stack.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 2100 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/stack.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 975 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_common.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 45744 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_core.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 58678 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_core.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 13976 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_device.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 24508 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_device.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 15706 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_device_p4.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 5489 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_device_p4.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 25732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_device_p58.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 5359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_device_p58.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 11869 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_em.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 29303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_em_common.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 4228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_em_common.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 3705 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_em_hash_internal.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 12819 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_em_host.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 8352 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_em_internal.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 6119 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_ext_flow_handle.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 4399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_global_cfg.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 2903 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_global_cfg.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 3872 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_hash.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 499 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_hash.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 6569 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_identifier.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 3850 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_identifier.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 4015 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_if_tbl.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 4116 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_if_tbl.h 00:46:58.280 
-rw-r--r-- vagrant/vagrant 63898 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_msg.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 16356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_msg.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 1244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_msg_common.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 407 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_project.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 258 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_resources.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 34516 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_rm.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 13361 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_rm.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 25947 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_session.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 16151 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_session.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 21612 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_sram_mgr.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 6261 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_sram_mgr.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 14359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tbl.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 6017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tbl.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 18130 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tbl_sram.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 3220 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tbl_sram.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 16589 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tcam.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 6482 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tcam.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 7556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tcam_mgr_msg.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 1326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tcam_mgr_msg.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 3265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tcam_shared.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 3636 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_tcam_shared.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 4005 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_util.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 2158 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tf_util.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 2969 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tfp.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 5415 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bnxt/tf_core/tfp.h 00:46:58.280 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/ 00:46:58.280 -rw-r--r-- vagrant/vagrant 19210 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/enet_ethdev.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 3623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/enet_ethdev.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 976 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/enet_pmd_logs.h 00:46:58.280 
-rw-r--r-- vagrant/vagrant 5372 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/enet_regs.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 7322 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/enet_rxtx.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 8041 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/enet_uio.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 1838 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/enet_uio.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enetfec/meson.build 00:46:58.280 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/ 00:46:58.280 -rw-r--r-- vagrant/vagrant 38643 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/idpf_ethdev.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 2268 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/idpf_ethdev.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 724 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/idpf_logs.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 25483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/idpf_rxtx.c 00:46:58.280 -rw-r--r-- vagrant/vagrant 1739 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/idpf_rxtx.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 2896 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/idpf_rxtx_vec_common.h 00:46:58.280 -rw-r--r-- vagrant/vagrant 875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/idpf/meson.build 00:46:58.280 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/ 00:46:58.280 -rw-r--r-- vagrant/vagrant 599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/meson.build 00:46:58.281 -rw-r--r-- vagrant/vagrant 79153 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_ethdev.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 6197 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_ethdev.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 52712 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_flow.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_flow.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 12992 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_mtr.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_mtr.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 32933 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_qos.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 2918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_qos.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 29459 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_tm.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 331 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/mvpp2/mrvl_tm.h 00:46:58.281 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/ 00:46:58.281 -rw-r--r-- vagrant/vagrant 503 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/meson.build 00:46:58.281 -rw-r--r-- vagrant/vagrant 229390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_debug.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 26277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_debug.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 83338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_ethdev.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 9937 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_ethdev.h 
00:46:58.281 -rw-r--r-- vagrant/vagrant 25971 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_filter.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 6506 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_if.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 2153 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_logs.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 20753 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_main.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 8294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_regs.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 80677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_rxtx.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 8930 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_rxtx.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 5506 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_sriov.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 613 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/qede_sriov.h 00:46:58.281 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ 00:46:58.281 -rw-r--r-- vagrant/vagrant 8823 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/bcm_osal.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 15346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/bcm_osal.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 63094 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/common_hsi.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 28551 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 337629 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_attn_values.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 23068 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_chain.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 65651 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_cxt.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 8605 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_cxt.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 716 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_cxt_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 48360 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_dcbx.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 1755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_dcbx.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 6388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_dcbx_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 196938 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_dev.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 17528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_dev_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 1737 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_gtt_reg_addr.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 972 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_gtt_values.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 94909 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hsi_common.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 30690 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hsi_debug_tools.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 100790 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hsi_eth.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 415 2024-06-07 
12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hsi_func_common.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 3367 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hsi_init_func.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 11099 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hsi_init_tool.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 32084 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hw.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 8229 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hw.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 1668 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_hw_defs.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 65987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_init_fw_funcs.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 19547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_init_fw_funcs.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 15685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_init_ops.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 2036 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_init_ops.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 84347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_int.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 6677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_int.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 9479 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_int_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 19155 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_iov_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 12522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_iro.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 9059 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_iro_values.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 72269 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_l2.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 4495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_l2.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 13959 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_l2_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 132397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_mcp.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 15562 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_mcp.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 32107 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_mcp_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 47142 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_mng_tlv.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 2857 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_proto_if.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 32384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_rt_defs.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 1576 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_sp_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 21049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_sp_commands.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 4429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_sp_commands.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 31029 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_spq.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 7640 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_spq.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 145001 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_sriov.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 7477 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_sriov.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 676 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_status.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 1171 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_utils.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 56860 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_vf.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 8940 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_vf.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 4507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_vf_api.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 17983 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/ecore_vfpf_if.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 25618 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/eth_common.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 75589 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/mcp_public.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 1571 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/meson.build 00:46:58.281 -rw-r--r-- vagrant/vagrant 150520 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/nvm_cfg.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 95934 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/qede/base/reg_addr.h 00:46:58.281 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/ 00:46:58.281 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/base/ 00:46:58.281 -rw-r--r-- vagrant/vagrant 597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/base/README 00:46:58.281 -rw-r--r-- vagrant/vagrant 2547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/base/upt1_defs.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 89 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/base/vmware_pack_begin.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 89 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/base/vmware_pack_end.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 29764 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/base/vmxnet3_defs.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 378 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/base/vmxnet3_osdep.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 432 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/meson.build 00:46:58.281 -rw-r--r-- vagrant/vagrant 58733 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/vmxnet3_ethdev.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 8012 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/vmxnet3_ethdev.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 1297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/vmxnet3_logs.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 4277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/vmxnet3_ring.h 00:46:58.281 -rw-r--r-- vagrant/vagrant 40500 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/vmxnet3/vmxnet3_rxtx.c 00:46:58.281 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/ 
00:46:58.281 -rw-r--r-- vagrant/vagrant 25755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/bonding_testpmd.c 00:46:58.281 -rw-r--r-- vagrant/vagrant 8940 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/eth_bond_8023ad_private.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 9221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/eth_bond_private.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/meson.build 00:46:58.282 -rw-r--r-- vagrant/vagrant 10754 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 49017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_8023ad.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 8824 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_8023ad.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 8501 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_alb.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 3214 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_alb.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 30839 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_api.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 6387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_args.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 6693 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_flow.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 120246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/rte_eth_bond_pmd.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 945 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/bonding/version.map 00:46:58.282 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/ 00:46:58.282 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/ 00:46:58.282 -rw-r--r-- vagrant/vagrant 2915 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/cq_desc.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 9827 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/cq_enet_desc.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 1379 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/rq_enet_desc.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 2289 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_cq.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 2169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_cq.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 32614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_dev.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 8329 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_dev.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 36484 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_devcmd.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 3276 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_enet.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 9166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_flowman.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 1254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_intr.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 2511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_intr.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 2351 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_nic.h 00:46:58.282 -rw-r--r-- 
vagrant/vagrant 2273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_resource.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 3653 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_rq.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 3586 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_rq.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_rss.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 1314 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_stats.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 4104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_wq.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 4916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/vnic_wq.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 3452 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/base/wq_enet_desc.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 14422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 1878 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_compat.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 39320 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_ethdev.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 50615 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_flow.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 101440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_fm_flow.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 54788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_main.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 9944 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_res.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 2403 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_res.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 19066 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_rxtx.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 8856 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_rxtx_common.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 28483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_rxtx_vec_avx2.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 20302 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/enic_vf_representor.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 1195 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/enic/meson.build 00:46:58.282 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/ 00:46:58.282 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/ 00:46:58.282 -rw-r--r-- vagrant/vagrant 968 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/README 00:46:58.282 -rw-r--r-- vagrant/vagrant 1048 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_82571.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 13383 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_82575.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 46904 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_api.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 4803 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_api.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 5130 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_base.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 4532 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_base.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 72649 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_defines.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 26191 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_hw.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 37065 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_i225.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 4713 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_i225.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 12167 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_ich8lan.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 60997 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_mac.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 2995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_mac.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 14617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_manage.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 2372 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_manage.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 33398 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_nvm.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 2234 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_nvm.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_osdep.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 4409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_osdep.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 120397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_phy.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 14067 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_phy.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 38124 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/igc_regs.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/base/meson.build 00:46:58.282 -rw-r--r-- vagrant/vagrant 81187 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_ethdev.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 8065 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_ethdev.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 9224 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_filter.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 1035 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_filter.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 25173 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_flow.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 466 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_flow.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_logs.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 1183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_logs.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 63796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_txrx.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 7045 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/igc_txrx.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/igc/meson.build 00:46:58.282 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/ 00:46:58.282 -rw-r--r-- vagrant/vagrant 37147 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_ethdev.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 991 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_logs.h 
00:46:58.282 -rw-r--r-- vagrant/vagrant 12477 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_nvs.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 5943 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_nvs.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 29989 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_rndis.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 1268 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_rndis.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 41379 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_rxtx.c 00:46:58.282 -rw-r--r-- vagrant/vagrant 8264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_var.h 00:46:58.282 -rw-r--r-- vagrant/vagrant 20004 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/hn_vf.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 324 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/meson.build 00:46:58.283 -rw-r--r-- vagrant/vagrant 11548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/ndis.h 00:46:58.283 -rw-r--r-- vagrant/vagrant 11798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/netvsc/rndis.h 00:46:58.283 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ring/ 00:46:58.283 -rw-r--r-- vagrant/vagrant 176 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ring/meson.build 00:46:58.283 -rw-r--r-- vagrant/vagrant 19731 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ring/rte_eth_ring.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 1434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ring/rte_eth_ring.h 00:46:58.283 -rw-r--r-- vagrant/vagrant 76 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ring/version.map 00:46:58.283 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/ 00:46:58.283 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/ 00:46:58.283 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/ 00:46:58.283 -rw-r--r-- vagrant/vagrant 418 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_0_15.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_0_15_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 426 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_0_15_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_0_15_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_112_127.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_112_127_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_112_127_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 457 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_112_127_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_16_31.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_16_31_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_16_31_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_16_31_vec_mseg.c 
00:46:58.283 -rw-r--r-- vagrant/vagrant 419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_32_47.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_32_47_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_32_47_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_32_47_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_48_63.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_48_63_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_48_63_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_48_63_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_64_79.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_64_79_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_64_79_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 449 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_64_79_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_80_95.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_80_95_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_80_95_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_80_95_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_96_111.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 430 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_96_111_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_96_111_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 456 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn10k/rx_96_111_vec_mseg.c 00:46:58.283 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/ 00:46:58.283 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_64_79.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_64_79_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_64_79_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_64_79_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_80_95.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_80_95_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_80_95_vec.c 00:46:58.283 -rw-r--r-- 
vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_80_95_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_96_111.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 241 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_96_111_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_96_111_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 308 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_96_111_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 229 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_0_15.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_0_15_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_0_15_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 306 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_0_15_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_112_127.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 242 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_112_127_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_112_127_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 309 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_112_127_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_16_31.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_16_31_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_16_31_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_16_31_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_32_47.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_32_47_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_32_47_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_32_47_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_48_63.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_48_63_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_48_63_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/rx/cn9k/rx_48_63_vec_mseg.c 00:46:58.283 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/ 00:46:58.283 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/ 00:46:58.283 -rw-r--r-- vagrant/vagrant 272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_0_15.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_0_15_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 335 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_0_15_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_0_15_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_112_127.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_112_127_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_112_127_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_112_127_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_16_31.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_16_31_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_16_31_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_16_31_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_32_47.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_32_47_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_32_47_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_32_47_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_48_63.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_48_63_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_48_63_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_48_63_vec_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_64_79.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_64_79_mseg.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_64_79_vec.c 00:46:58.283 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_64_79_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_80_95.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_80_95_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_80_95_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_80_95_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_96_111.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 339 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_96_111_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_96_111_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn10k/tx_96_111_vec_mseg.c 00:46:58.284 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/ 00:46:58.284 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_0_15.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 302 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_0_15_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 300 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_0_15_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 310 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_0_15_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_112_127.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_112_127_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_112_127_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 313 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_112_127_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_16_31.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_16_31_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_16_31_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_16_31_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_32_47.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_32_47_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_32_47_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_32_47_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_48_63.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_48_63_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_48_63_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_48_63_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_64_79.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_64_79_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_64_79_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_64_79_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_80_95.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_80_95_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_80_95_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_80_95_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_96_111.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 304 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_96_111_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 302 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_96_111_vec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/tx/cn9k/tx_96_111_vec_mseg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 28360 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_ethdev.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 866 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_ethdev.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 36285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_ethdev_sec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 7897 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_flow.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 730 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_flow.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 86610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_rx.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 3491 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_rx_select.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 5694 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_rxtx.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 119409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_tx.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 2452 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn10k_tx_select.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 25420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_ethdev.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 4245 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_ethdev.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 20245 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_ethdev_sec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 2357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_flow.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 716 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_flow.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 40835 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_rx.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 1998 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_rx_select.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 82327 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_tx.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 2420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cn9k_tx_select.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 19798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_eswitch.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 6255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_eswitch.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 3513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_eswitch_devargs.c 
00:46:58.284 -rw-r--r-- vagrant/vagrant 12102 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_eswitch_flow.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 6104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_eswitch_rxtx.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 59213 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 23725 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 3983 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_cman.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 11108 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_devargs.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 5128 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_dp.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 22197 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_mcs.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 4068 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_mcs.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 46925 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_mtr.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 34555 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_ops.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 13162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_sec.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 7912 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_sec_telemetry.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 2578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ethdev_telemetry.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 37732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_flow.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 1916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_flow.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 3622 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_link.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 10184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_lookup.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 7180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_ptp.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 17305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_rep.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 5339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_rep.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 21703 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_rep_flow.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 19960 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_rep_msg.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 4281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_rep_msg.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 20601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_rep_ops.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 7815 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_stats.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 22205 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_tm.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/cnxk_tm.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 6687 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/meson.build 00:46:58.284 -rw-r--r-- vagrant/vagrant 11826 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/cnxk/rte_pmd_cnxk.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 220 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cnxk/version.map 00:46:58.284 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/ 00:46:58.284 -rw-r--r-- vagrant/vagrant 10010 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 10555 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_args.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 4049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_eal.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 16362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_ether.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 6171 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_flow.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 13184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_intr.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 39073 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_ops.c 00:46:58.284 -rw-r--r-- vagrant/vagrant 12989 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_private.h 00:46:58.284 -rw-r--r-- vagrant/vagrant 4239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/failsafe_rxtx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 637 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/failsafe/meson.build 00:46:58.285 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ 00:46:58.285 -rw-r--r-- vagrant/vagrant 2648 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 8373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_dev.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 9268 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_dev.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 6871 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_dev_pci.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 3599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_dev_vdev.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 35888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_ethdev.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 908 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_ethdev.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 81244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_if.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 53560 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_lif.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 7177 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_lif.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_logs.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 976 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_mac_api.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_mac_api.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 13332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_main.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 2707 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_rx_filter.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1354 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_rx_filter.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 24590 2024-06-07 
12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_rxtx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 5184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_rxtx.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 13144 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_rxtx_sg.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 11038 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/ionic_rxtx_simple.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ionic/meson.build 00:46:58.285 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/ 00:46:58.285 -rw-r--r-- vagrant/vagrant 535 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/meson.build 00:46:58.285 -rw-r--r-- vagrant/vagrant 1252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 16404 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_ethdev.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 2720 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_rx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 5485 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_rx.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 2402 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_rxmode.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1468 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_rxmode.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1972 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_stats.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 865 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_stats.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 2256 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_tx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 4183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfb/nfb_tx.h 00:46:58.285 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/ 00:46:58.285 -rw-r--r-- vagrant/vagrant 2677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/meson.build 00:46:58.285 -rw-r--r-- vagrant/vagrant 33477 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 11757 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1095 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_debug.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 3465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_dp.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 4660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_dp.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 8497 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_dp_rx.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 10270 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_dp_tx.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 3732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef10.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1714 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef100.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 28601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef100_rx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 29190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef100_tx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 21554 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef10_essb_rx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 22695 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef10_rx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 5040 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef10_rx_ev.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 31601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ef10_tx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 91505 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ethdev.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1609 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ethdev_state.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 24438 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ev.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 6529 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_ev.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 2888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_filter.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_filter.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 81365 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_flow.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 6524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_flow.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 11496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_flow_rss.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1938 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_flow_rss.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 10051 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_flow_tunnel.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 2860 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_flow_tunnel.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 8339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_intr.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 2527 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_kvargs.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 3078 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_kvargs.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 2649 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_log.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 164474 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_mae.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 13813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_mae.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 28847 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_mae_counter.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_mae_counter.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 5624 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_mae_ct.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1499 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_mae_ct.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1856 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_mcdi.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 8110 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_nic_dma.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 545 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_nic_dma.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1130 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_nic_dma_dp.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 18607 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_port.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 27013 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_repr.c 00:46:58.285 -rw-r--r-- 
vagrant/vagrant 912 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_repr.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 38668 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_repr_proxy.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 3632 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_repr_proxy.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1652 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_repr_proxy_api.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 50421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_rx.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 4667 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_rx.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 2104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_service.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_service.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 3916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_sriov.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 871 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_sriov.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1717 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_stats.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 22956 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_sw_stats.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 1273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_sw_stats.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 17187 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_switch.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 3928 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_switch.h 00:46:58.285 -rw-r--r-- vagrant/vagrant 1465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tbl_meta.c 00:46:58.285 -rw-r--r-- vagrant/vagrant 747 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tbl_meta.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 6082 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tbl_meta_cache.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tbl_meta_cache.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 4444 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tbls.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 9753 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tbls.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 4534 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tso.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 2045 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tso.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 1813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tweak.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 33657 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tx.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 3859 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/sfc/sfc_tx.h 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_packet/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 226 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_packet/meson.build 00:46:58.286 -rw-r--r-- vagrant/vagrant 30074 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/af_packet/rte_eth_af_packet.c 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 28987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_actions.h 
00:46:58.286 -rw-r--r-- vagrant/vagrant 22244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_controlq.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 2961 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_controlq.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 11241 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_cpchnl.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 76768 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_ethdev.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 8569 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_ethdev.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 7417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_flow.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 2947 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_flow.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 17046 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_flow_engine_fxp.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 52856 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_flow_parser.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 8231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_flow_parser.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 6417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_fxp_rule.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 1705 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_fxp_rule.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 633 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_logs.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 17150 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_representor.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 643 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_representor.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 3673 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_rules.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 9246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_rules.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 43551 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_rxtx.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 4062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_rxtx.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 3112 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_rxtx_vec_common.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 6413 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/cpfl_vchnl.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 1284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/cpfl/meson.build 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/ 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 9868 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_api.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 1434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_api.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 14823 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_common.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 952 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_common.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 64635 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_mbx.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 11196 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_mbx.h 
00:46:58.286 -rw-r--r-- vagrant/vagrant 3858 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_osdep.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 66402 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_pf.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 5145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_pf.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 25387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_tlv.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 6158 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_tlv.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 27762 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_type.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 19089 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_vf.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 1986 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/fm10k_vf.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 689 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/base/meson.build 00:46:58.286 -rw-r--r-- vagrant/vagrant 11138 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/fm10k.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 92407 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/fm10k_ethdev.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 1183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/fm10k_logs.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 19030 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/fm10k_rxtx.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 26306 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/fm10k_rxtx_vec.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/fm10k/meson.build 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 15090 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_ethdev.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 46849 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_ethdev.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 30050 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_flow.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 2685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_flow.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_logs.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 2168 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_rawdev_api.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 85518 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_representor.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 54614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/ipn3ke_tm.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 669 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/meson.build 00:46:58.286 -rw-r--r-- vagrant/vagrant 70 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/ipn3ke/version.map 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/ 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfd3/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 2355 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfd3/nfp_nfd3.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 13145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfd3/nfp_nfd3_dp.c 
00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfdk/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 6867 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfdk/nfp_nfdk.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 15303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfdk/nfp_nfdk_dp.c 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 23101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp6000_pcie.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp6000_pcie.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 9640 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_cpp.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 21895 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_cppcore.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 984 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_crc.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 503 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_crc.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 26520 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_elf.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 266 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_elf.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 6643 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_hwinfo.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 375 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_hwinfo.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 3025 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_mip.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 2557 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_mip.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 9157 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_mutex.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 735 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_mutex.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 6457 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_nffw.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 450 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_nffw.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 17422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_nsp.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 8305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_nsp.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 2278 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_nsp_cmds.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 17965 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_nsp_eth.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 6061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_resource.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_resource.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 14846 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_rtsym.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 1507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_rtsym.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 7443 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_sync.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 742 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_sync.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 25004 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_target.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 1092 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp_target.h 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp6000/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 804 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp6000/nfp6000.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfpcore/nfp6000/nfp_xpb.h 00:46:58.286 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/ 00:46:58.286 -rw-r--r-- vagrant/vagrant 5701 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_flow.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 23846 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_representor.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 799 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_representor.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 4805 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_service.c 00:46:58.286 -rw-r--r-- vagrant/vagrant 456 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_service.h 00:46:58.286 -rw-r--r-- vagrant/vagrant 44769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_conntrack.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 956 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_conntrack.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 22194 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 3707 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 14547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_cmsg.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 45280 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_cmsg.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 14183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_ctrl.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 489 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_ctrl.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 137888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/flower/nfp_flower_flow.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 1371 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/meson.build 00:46:58.287 -rw-r--r-- vagrant/vagrant 10010 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_cpp_bridge.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_cpp_bridge.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 57982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_ethdev.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 12052 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_ethdev_vf.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 34896 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_ipsec.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 6793 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_ipsec.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 499 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_logs.c 00:46:58.287 -rw-r--r-- 
vagrant/vagrant 1211 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_logs.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 29547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_mtr.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 3140 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_mtr.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 1367 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_cmsg.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 8339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_cmsg.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 64713 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_common.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 10282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_common.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 2558 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_ctrl.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 10310 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_ctrl.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 26259 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_flow.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_flow.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 9873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_meta.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 3826 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_net_meta.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 21763 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_rxtx.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 6404 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_rxtx.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 3028 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_service.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 447 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/nfp/nfp_service.h 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 6410 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/conn.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 989 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/conn.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 1572 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/firmware.cli 00:46:58.287 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/firmware.spec 00:46:58.287 -rw-r--r-- vagrant/vagrant 943 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/firmware_rx.io 00:46:58.287 -rw-r--r-- vagrant/vagrant 943 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/firmware_tx.io 00:46:58.287 -rw-r--r-- vagrant/vagrant 476 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/meson.build 00:46:58.287 -rw-r--r-- vagrant/vagrant 11510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 2003 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 55746 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic_cli.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 5327 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic_internals.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 2061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic_mempool.c 00:46:58.287 -rw-r--r-- 
vagrant/vagrant 5153 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic_pipeline.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 2069 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic_swq.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 11830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/rte_eth_softnic_thread.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 82 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/net/softnic/version.map 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 2845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/bus_vmbus_driver.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/meson.build 00:46:58.287 -rw-r--r-- vagrant/vagrant 4576 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/private.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 7340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/rte_bus_vmbus.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 8175 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/rte_vmbus_reg.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 582 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/version.map 00:46:58.287 -rw-r--r-- vagrant/vagrant 6323 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/vmbus_bufring.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 10648 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/vmbus_channel.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 7833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/vmbus_common.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 6252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/vmbus_common_uio.c 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/linux/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 8130 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/linux/vmbus_bus.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 10942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vmbus/linux/vmbus_uio.c 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/ 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/linux/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 2904 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/linux/auxiliary.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 10035 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/auxiliary_common.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 1459 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/auxiliary_params.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 5332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/bus_auxiliary_driver.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/meson.build 00:46:58.287 -rw-r--r-- vagrant/vagrant 2548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/private.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 88 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/auxiliary/version.map 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 5279 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/bus_cdx_driver.h 00:46:58.287 -rw-r--r-- 
vagrant/vagrant 15873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/cdx.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 689 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/cdx_logs.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 17839 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/cdx_vfio.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/meson.build 00:46:58.287 -rw-r--r-- vagrant/vagrant 1464 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/private.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 226 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/cdx/version.map 00:46:58.287 -rw-r--r-- vagrant/vagrant 304 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/meson.build 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/ 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/ 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/fman/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 20366 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/fman/fman.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 16308 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/fman/fman_hw.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 3690 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/fman/netcfg_layer.c 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 8719 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/bman.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 14106 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/bman.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 7549 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/bman_driver.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 2881 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/bman_priv.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 1660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/dpaa_alloc.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 2440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/dpaa_sys.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 779 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/dpaa_sys.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 10854 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/process.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 80277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/qman.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 23628 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/qman.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 8926 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/qman_driver.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 9646 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/base/qbman/qman_priv.h 00:46:58.287 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/ 00:46:58.287 -rw-r--r-- vagrant/vagrant 879 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/dpaa_bits.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 3498 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/dpaa_rbtree.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 13815 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/fman.h 00:46:58.287 
-rw-r--r-- vagrant/vagrant 10154 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/fsl_bman.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 5547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/fsl_fman.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 9344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/fsl_fman_crc64.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 71957 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/fsl_qman.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 2296 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/fsl_usd.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 1686 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/netcfg.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 3739 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/include/process.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 6168 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/bus_dpaa_driver.h 00:46:58.287 -rw-r--r-- vagrant/vagrant 19538 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/dpaa_bus.c 00:46:58.287 -rw-r--r-- vagrant/vagrant 784 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/rte_dpaa_logs.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/dpaa/version.map 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 5787 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/bus_fslmc_driver.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 16069 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/fslmc_bus.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 704 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/fslmc_logs.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 26567 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/fslmc_vfio.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 1733 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/fslmc_vfio.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 699 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 605 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/private.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2451 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/version.map 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 9368 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/dpbp.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 15969 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/dpci.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 8796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/dpcon.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 12891 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/dpdmai.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 13266 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/dpio.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 2365 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/dpmng.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 3546 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/dprc.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 2212 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpbp.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2388 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpbp_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 6587 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpci.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 4144 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpci_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2118 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpcon.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 1577 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpcon_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 5933 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpdmai.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2862 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpdmai_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 3630 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpio.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpio_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 1389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpmng.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 729 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpmng_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dpopr.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 1132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dprc.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 1060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_dprc_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 4585 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_mc_cmd.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/fsl_mc_sys.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2563 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/mc/mc_sys.c 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/portal/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 2878 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/portal/dpaa2_hw_dpbp.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 4513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/portal/dpaa2_hw_dpci.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 16356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/portal/dpaa2_hw_dpio.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 1888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/portal/dpaa2_hw_dpio.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 2602 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/portal/dpaa2_hw_dprc.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 14769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/portal/dpaa2_hw_pvt.h 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 15039 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/qbman_debug.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 78963 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/qbman_portal.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 6017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/qbman_portal.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 17373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/qbman_sys.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 1597 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/qbman_sys_decl.h 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/include/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 2797 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/include/compat.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 7149 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/include/fsl_qbman_base.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 7937 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/include/fsl_qbman_debug.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 42825 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/fslmc/qbman/include/fsl_qbman_portal.h 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ifpga/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 4093 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ifpga/bus_ifpga_driver.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 11597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ifpga/ifpga_bus.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 1061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ifpga/ifpga_common.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ifpga/ifpga_logs.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 298 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ifpga/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 123 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/ifpga/version.map 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/ 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/bsd/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 13802 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/bsd/pci.c 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/linux/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 19045 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/linux/pci.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 2810 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/linux/pci_init.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 16980 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/linux/pci_uio.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 34066 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/linux/pci_vfio.c 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/windows/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 13279 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/windows/pci.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 5097 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/windows/pci_netuio.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 1718 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/windows/pci_netuio.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 6475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/bus_pci_driver.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 23370 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/pci_common.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 5854 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/pci_common_uio.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 2736 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/pci_params.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 7830 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/bus/pci/private.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 8089 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/rte_bus_pci.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 631 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/pci/version.map 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/platform/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 5169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/platform/bus_platform_driver.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/platform/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 13658 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/platform/platform.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 1627 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/platform/platform_params.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 1297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/platform/private.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 99 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/platform/version.map 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/uacce/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 7159 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/uacce/bus_uacce_driver.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 257 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/uacce/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 15396 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/uacce/uacce.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/uacce/version.map 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 4151 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/bus_vdev_driver.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 1654 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/rte_bus_vdev.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 16034 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/vdev.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/vdev_logs.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 1367 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/vdev_params.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/vdev_private.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/bus/vdev/version.map 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/meson.build 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 9535 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/cnxk_bphy.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 10127 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/cnxk_bphy_cgx.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 210 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/cnxk_bphy_cgx.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 3305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/cnxk_bphy_cgx_test.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 
2124 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/cnxk_bphy_irq.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/cnxk_bphy_irq.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 308 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/meson.build 00:46:58.288 -rw-r--r-- vagrant/vagrant 21939 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_bphy/rte_pmd_bphy.h 00:46:58.288 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_gpio/ 00:46:58.288 -rw-r--r-- vagrant/vagrant 18009 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_gpio/cnxk_gpio.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 790 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_gpio/cnxk_gpio.h 00:46:58.288 -rw-r--r-- vagrant/vagrant 3884 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_gpio/cnxk_gpio_irq.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 11045 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_gpio/cnxk_gpio_selftest.c 00:46:58.288 -rw-r--r-- vagrant/vagrant 297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_gpio/meson.build 00:46:58.289 -rw-r--r-- vagrant/vagrant 9887 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/cnxk_gpio/rte_pmd_cnxk_gpio.h 00:46:58.289 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/dpaa2_cmdif/ 00:46:58.289 -rw-r--r-- vagrant/vagrant 7004 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/dpaa2_cmdif/dpaa2_cmdif.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 1347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/dpaa2_cmdif/dpaa2_cmdif_logs.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 185 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/dpaa2_cmdif/meson.build 00:46:58.289 -rw-r--r-- vagrant/vagrant 674 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/dpaa2_cmdif/rte_pmd_dpaa2_cmdif.h 00:46:58.289 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/ 00:46:58.289 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ 00:46:58.289 -rw-r--r-- vagrant/vagrant 1444 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/README 00:46:58.289 -rw-r--r-- vagrant/vagrant 11339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_api.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 894 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_api.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 1604 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_compat.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 42398 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_defines.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 26432 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_enumerate.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_enumerate.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 10345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_feature_dev.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 6455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_feature_dev.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 40060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_fme.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 7345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_fme_dperf.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 10141 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_fme_error.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 20539 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_fme_iperf.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 8679 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_fme_pr.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 9897 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_fme_rsu.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 3299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_hw.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 9948 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_port.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 3942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_port_error.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 14221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_sec_mgr.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 2701 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/ifpga_sec_mgr.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 1084 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/meson.build 00:46:58.289 -rw-r--r-- vagrant/vagrant 1456 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_at24_eeprom.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_at24_eeprom.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 2926 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_debug.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_debug.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 6551 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_eth_group.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 2189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_eth_group.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 24923 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_hw_api.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 12405 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_hw_api.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 11299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_i2c.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 4424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_i2c.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 2974 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_ifpga_hw_api.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 12131 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_ifpga_hw_api.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 34605 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_intel_max10.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 16607 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_intel_max10.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 3669 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_osdep.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 6697 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_spi.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 5611 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_spi.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 11679 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/opae_spi_transaction.c 00:46:58.289 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/osdep_raw/ 
00:46:58.289 -rw-r--r-- vagrant/vagrant 1446 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/osdep_raw/osdep_generic.h 00:46:58.289 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/osdep_rte/ 00:46:58.289 -rw-r--r-- vagrant/vagrant 1987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/base/osdep_rte/osdep_generic.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 8708 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_core.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 2610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_core.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 7937 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_he_hssi.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 2307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_he_hssi.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 9900 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_he_lpbk.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 2515 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_he_lpbk.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 3864 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_he_mem.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 1102 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_he_mem.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 49554 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_n3000.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 7451 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/afu_pmd_n3000.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 46343 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/ifpga_rawdev.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 2636 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/ifpga_rawdev.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 678 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/meson.build 00:46:58.289 -rw-r--r-- vagrant/vagrant 2847 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/rte_pmd_afu.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 8892 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/rte_pmd_ifpga.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 6136 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/rte_pmd_ifpga.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 349 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ifpga/version.map 00:46:58.289 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ntb/ 00:46:58.289 -rw-r--r-- vagrant/vagrant 228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ntb/meson.build 00:46:58.289 -rw-r--r-- vagrant/vagrant 38574 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ntb/ntb.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 5756 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ntb/ntb.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 15031 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ntb/ntb_hw_intel.c 00:46:58.289 -rw-r--r-- vagrant/vagrant 4338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ntb/ntb_hw_intel.h 00:46:58.289 -rw-r--r-- vagrant/vagrant 1095 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/ntb/rte_pmd_ntb.h 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/skeleton/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/skeleton/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 17566 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/skeleton/skeleton_rawdev.c 
00:46:58.290 -rw-r--r-- vagrant/vagrant 3440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/skeleton/skeleton_rawdev.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 12816 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/raw/skeleton/skeleton_rawdev_test.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 13416 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/version.map 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 1281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 38079 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 18991 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 1523 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_defs.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 19851 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_devx.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 2979 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_devx.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_log.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 6108 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_mp.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 3777 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_mp.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 60196 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_mr.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 8302 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_mr.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 6489 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_pci.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 1217 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_private.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 14865 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_utils.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 10454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_common_utils.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 114162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_devx_cmds.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 26625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_devx_cmds.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 7689 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_malloc.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 2262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_malloc.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 138069 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/mlx5_prm.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 4405 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/version.map 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 10897 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 4684 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_common_auxiliary.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 27376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_common_os.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 6707 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_common_os.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 3749 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_common_verbs.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 39440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_glue.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 15097 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_glue.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 51702 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_nl.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 2844 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/linux/mlx5_nl.h 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 10624 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/mlx5_common_os.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 5977 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/mlx5_common_os.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 9155 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/mlx5_glue.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 2853 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/mlx5_glue.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 7988 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/mlx5_win_defs.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 1003 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mlx5/windows/mlx5_win_ext.h 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mvep/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 466 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mvep/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 794 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mvep/mvep_common.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 518 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mvep/rte_mvep_common.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 70 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/mvep/version.map 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 4979 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_common.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 4720 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_common.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 15774 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_common_ctrl.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 197 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_common_log.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_common_log.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 6168 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_common_pci.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 1339 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_common_pci.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 1989 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_dev.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 1049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_dev.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 1077 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/nfp_platform.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 210 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nfp/version.map 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 404 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/meson.build 00:46:58.290 -rw-r--r-- vagrant/vagrant 1923 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_csr.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 3089 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_device.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 532 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_device.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 9193 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_hal.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 5320 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_hal.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_logs.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 381 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_logs.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 3420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_qp.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 3167 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/nitrox_qp.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 91 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/nitrox/version.map 00:46:58.290 -rw-r--r-- vagrant/vagrant 223 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/meson.build 00:46:58.290 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/ 00:46:58.290 -rw-r--r-- vagrant/vagrant 32813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_security.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 1998 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_security.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 4969 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_security_ar.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 1124 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_telemetry.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 1043 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_telemetry_bphy.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 23266 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_telemetry_nix.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 7357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_telemetry_npa.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 1208 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_telemetry_sso.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 1732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_utils.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 234 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/cnxk_utils.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 2420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/meson.build 00:46:58.290 
-rw-r--r-- vagrant/vagrant 8885 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ae.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 2426 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ae.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 89211 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ae_fpm_tables.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ae_fpm_tables.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 6750 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_aes.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_aes.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 1838 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_api.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bitfield.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 745 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bits.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 699 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bphy.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 485 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bphy.h 00:46:58.290 -rw-r--r-- vagrant/vagrant 14072 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bphy_cgx.c 00:46:58.290 -rw-r--r-- vagrant/vagrant 5951 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bphy_cgx.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 3696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bphy_cgx_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 8148 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bphy_irq.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_bphy_irq.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 2482 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_constants.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 27550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_cpt.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 6917 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_cpt.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 8688 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_cpt_debug.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1129 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_cpt_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 753 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_cpt_sg.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 44818 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_dev.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 3482 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_dev_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 3169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_dpi.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 579 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_dpi.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 927 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_dpi_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 2891 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_errata.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 9231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_eswitch.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1964 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_eswitch.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 1734 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_features.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 14622 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_hash.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 559 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_hash.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 6741 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_idev.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1087 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_idev.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 1682 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_idev_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 527 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ie.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 6018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ie_on.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 1086 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ie_ot.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 13278 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ie_ot.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 4851 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ie_ot_tls.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 7243 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_io.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 3409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_io_generic.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 6911 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_irq.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 11384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mbox.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 87888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mbox.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 7020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mbox_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 19917 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mcs.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 19986 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mcs.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 1694 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mcs_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 13314 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mcs_sec_cfg.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 5039 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_mcs_stats.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 17114 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ml.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 7076 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ml.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 449 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ml_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 8656 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_model.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 6731 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_model.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 12181 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 32839 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 35364 2024-06-07 
12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_bpf.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 49430 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_debug.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 19524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_fc.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 45184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_inl.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 7425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_inl.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 26643 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_inl_dev.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 10598 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_inl_dev_irq.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1770 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_inl_dp.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 2423 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_inl_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 12887 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_irq.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 8190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_mac.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 4570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_mcast.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 2461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_npc.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 12970 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_ops.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 14474 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 2976 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_ptp.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 38280 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_queue.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 6506 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_rss.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 13212 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_stats.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 49228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_tm.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 8156 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_tm_mark.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 34564 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_tm_ops.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 31565 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_tm_utils.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 5888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_vlan.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 11530 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_nix_xstats.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 30431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npa.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 22572 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npa.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 5732 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npa_debug.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 777 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npa_dp.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 6743 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npa_irq.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1512 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npa_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 2172 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npa_type.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 48078 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 15982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 7930 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc_aging.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 29463 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc_mcam.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 22062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc_mcam_dump.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 31507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc_parse.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 16933 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 24080 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_npc_utils.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 2461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_platform.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 13695 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_platform.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 767 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 14172 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ree.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 3981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ree.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_ree_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 17225 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_se.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 10027 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_se.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 23999 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_sso.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 3467 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_sso.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 2293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_sso_debug.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_sso_dp.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 3960 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_sso_irq.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_sso_priv.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 9755 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_tim.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 1710 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_tim.h 00:46:58.291 -rw-r--r-- vagrant/vagrant 2776 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_tim_irq.c 00:46:58.291 -rw-r--r-- vagrant/vagrant 745 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_tim_priv.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 251 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_util_priv.h 00:46:58.292 -rw-r--r-- 
vagrant/vagrant 6364 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_utils.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 320 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/roc_utils.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 13648 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/version.map 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 7774 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/cpt.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 3136 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/dpi.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 3677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/ml.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 77274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/nix.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 12200 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/npa.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 13680 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/npc.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 3307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/ree.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 9440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/rvu.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 6461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/sdp.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 11697 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/sso.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 3071 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/ssow.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cnxk/hw/tim.h 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/octeontx/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/octeontx/meson.build 00:46:58.292 -rw-r--r-- vagrant/vagrant 7625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/octeontx/octeontx_mbox.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 1597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/octeontx/octeontx_mbox.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/octeontx/version.map 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 2791 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_common.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 75827 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_fpm_tables.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 19081 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_hw_types.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 10237 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_mcode_defines.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1253 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_pmd_logs.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_pmd_ops_helper.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 1259 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_pmd_ops_helper.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 88017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_ucode.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 29189 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/cpt_ucode_asym.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/meson.build 00:46:58.292 -rw-r--r-- vagrant/vagrant 182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/cpt/version.map 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/ 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 6761 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/qat_dev_gen1.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 1490 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/qat_dev_gen2.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 2676 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/qat_dev_gen3.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 10201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/qat_dev_gen4.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 1761 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/qat_dev_gen5.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 8679 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/qat_dev_gen_lce.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 2888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/dev/qat_dev_gens.h 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 5671 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/adf_pf2vf_msg.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 5815 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/adf_transport_access_macros.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1958 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/adf_transport_access_macros_gen4.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1811 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/adf_transport_access_macros_gen4vf.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1808 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/adf_transport_access_macros_gen_lce.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1883 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/adf_transport_access_macros_gen_lcevf.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 12061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_fw.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 16591 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_fw_comp.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 15409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_fw_la.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 75000 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_fw_mmp_ids.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 13583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_fw_pke.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 19786 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_hw.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 6469 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_hw_gen4_comp.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 11168 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/icp_qat_hw_gen4_comp_defs.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 10015 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/common/qat/qat_adf/qat_pke.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 4001 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/meson.build 00:46:58.292 -rw-r--r-- vagrant/vagrant 3639 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_common.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 2264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_common.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 12398 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_device.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 4455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_device.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 488 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_logs.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 997 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_logs.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 2297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_pf2vf.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_pf2vf.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 27101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_qp.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 6142 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/qat/qat_qp.h 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 295 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 9402 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/compat.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 1982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/dpaa_list.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 14079 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/dpaa_of.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 4604 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/dpaa_of.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 12973 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/dpaax_iova_table.c 00:46:58.292 -rw-r--r-- vagrant/vagrant 3362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/dpaax_iova_table.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/dpaax_logs.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 412 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/meson.build 00:46:58.292 -rw-r--r-- vagrant/vagrant 445 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/version.map 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/ 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 8385 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/fifo_load_store_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 5030 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/header_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 3932 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/jump_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 4255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/key_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 8490 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/load_cmd.h 00:46:58.292 
-rw-r--r-- vagrant/vagrant 9005 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/math_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 10001 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/move_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 5222 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/nfifo_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 15957 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/operation_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 21644 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/protocol_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 20521 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/sec_run_time_asm.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 4590 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/seq_in_out_ptr_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 889 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/signature_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 5412 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta/store_cmd.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 2697 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/compat.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 84235 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/desc.h 00:46:58.292 -rw-r--r-- vagrant/vagrant 41182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/rta.h 00:46:58.292 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/desc/ 00:46:58.292 -rw-r--r-- vagrant/vagrant 27088 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/desc/algo.h 00:46:58.293 -rw-r--r-- vagrant/vagrant 3056 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/desc/common.h 00:46:58.293 -rw-r--r-- vagrant/vagrant 53563 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/desc/ipsec.h 00:46:58.293 -rw-r--r-- vagrant/vagrant 90546 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/desc/pdcp.h 00:46:58.293 -rw-r--r-- vagrant/vagrant 32838 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/dpaax/caamflib/desc/sdap.h 00:46:58.293 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/ 00:46:58.293 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ 00:46:58.293 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/README 00:46:58.293 -rw-r--r-- vagrant/vagrant 31011 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_ev.c 00:46:58.293 -rw-r--r-- vagrant/vagrant 13088 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_evb.c 00:46:58.293 -rw-r--r-- vagrant/vagrant 58851 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_filter.c 00:46:58.293 -rw-r--r-- vagrant/vagrant 6844 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_firmware_ids.h 00:46:58.293 -rw-r--r-- vagrant/vagrant 22311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_image.c 00:46:58.293 -rw-r--r-- vagrant/vagrant 36696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_impl.h 00:46:58.293 -rw-r--r-- vagrant/vagrant 2884 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_intr.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 31086 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_mac.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 8814 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_mcdi.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 75792 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_nic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 56101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_nvram.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 20742 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_phy.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 11360 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_proxy.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 32031 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_rx.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 2512 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_signed_image_layout.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 34293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_tlv_layout.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 14720 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_tx.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 8861 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/ef10_vpd.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 137736 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 2620 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_annote.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 24501 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_bootcfg.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 11161 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_check.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 4061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_crc32.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 37845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_ev.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 13201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_evb.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 40357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_filter.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 6750 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_hash.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 54813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_impl.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 13869 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_intr.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 36814 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_lic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 23670 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_mac.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 107087 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_mae.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 88495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_mcdi.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 17355 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_mcdi.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 22846 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_mon.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 41838 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_nic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 25131 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_nvram.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 7610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_pci.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 12893 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_phy.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 598 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_phy_ids.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 5226 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_port.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 7427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_proxy.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 119769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 3638 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs_counters_pkt_format.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 20960 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs_ef10.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 31502 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs_ef100.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 1624518 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs_mcdi.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 149329 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs_mcdi_aoe.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 7462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs_mcdi_strs.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 55493 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_regs_pci.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 44711 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_rx.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 6992 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_sram.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 12983 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_table.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 17332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_tunnel.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 27639 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_tx.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 78789 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_types.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 6265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_virtio.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 19631 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/efx_vpd.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 1040 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/hunt_impl.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 5873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/hunt_nic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 14953 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/mcdi_mon.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 1131 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/mcdi_mon.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 699 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/medford2_impl.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 4573 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/medford2_nic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 689 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/medford_impl.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 4484 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/medford_nic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 2141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/meson.build
00:46:58.293 -rw-r--r-- vagrant/vagrant 17248 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_ev.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 10178 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_impl.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 1595 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_intr.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 16224 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_nic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 3204 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_pci.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 15830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_rx.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 8422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_tunnel.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 3509 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_tx.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 9373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/rhead_virtio.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 7782 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_flash.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 10381 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_impl.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 14487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_mac.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 5516 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_mcdi.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 20628 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_nic.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 17189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_nvram.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 21306 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_phy.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 4018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_sram.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 13319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/base/siena_vpd.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 20488 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/efsys.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 933 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/meson.build
00:46:58.293 -rw-r--r-- vagrant/vagrant 2635 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/sfc_efx.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 848 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/sfc_efx.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 764 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/sfc_efx_debug.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 635 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/sfc_efx_log.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 9057 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/sfc_efx_mcdi.c
00:46:58.293 -rw-r--r-- vagrant/vagrant 1805 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/sfc_efx_mcdi.h
00:46:58.293 -rw-r--r-- vagrant/vagrant 6792 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/sfc_efx/version.map
00:46:58.293 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/
00:46:58.294 -rw-r--r-- vagrant/vagrant 527 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/README
00:46:58.294 -rw-r--r-- vagrant/vagrant 26946 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_adminq.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 2625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_adminq.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 22599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_adminq_cmd.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 1164 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_alloc.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 34133 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_common.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 447 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_devids.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 1786 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_impl.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 5055 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_osdep.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 3569 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_prototype.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 6414 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_register.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 2492 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_status.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 30653 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/iavf_type.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 278 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/meson.build
00:46:58.294 -rw-r--r-- vagrant/vagrant 174 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/version.map
00:46:58.294 -rw-r--r-- vagrant/vagrant 84086 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/virtchnl.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 14024 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/iavf/virtchnl_inline_ipsec.h
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/
00:46:58.294 -rw-r--r-- vagrant/vagrant 572 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/README
00:46:58.294 -rw-r--r-- vagrant/vagrant 382 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_alloc.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 10323 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_common.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 20182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_controlq.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 5550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_controlq.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 5907 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_controlq_api.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 4520 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_controlq_setup.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_devids.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 5476 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_lan_pf_regs.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 9430 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_lan_txrx.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 5130 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_lan_vf_regs.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 8621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_osdep.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 1376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_prototype.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/idpf_type.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 363 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/meson.build
00:46:58.294 -rw-r--r-- vagrant/vagrant 1640 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/siov_regs.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 62068 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/virtchnl2.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 20587 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/base/virtchnl2_lan_desc.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 22633 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_device.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 6348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_device.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 1115 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_logs.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 42286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_rxtx.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 10338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_rxtx.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 51424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_rxtx_avx512.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 33498 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_virtchnl.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 3185 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/idpf_common_virtchnl.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 1472 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/meson.build
00:46:58.294 -rw-r--r-- vagrant/vagrant 1857 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/idpf/version.map
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ionic/
00:46:58.294 -rw-r--r-- vagrant/vagrant 983 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ionic/ionic_common.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 5512 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ionic/ionic_common_uio.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ionic/ionic_osdep.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 5189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ionic/ionic_regs.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ionic/meson.build
00:46:58.294 -rw-r--r-- vagrant/vagrant 108 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/common/ionic/version.map
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/cn9k/
00:46:58.294 -rw-r--r-- vagrant/vagrant 22760 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/cn9k/cn9k_regexdev.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 1063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/cn9k/cn9k_regexdev.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/cn9k/meson.build
00:46:58.294 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/meson.build
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/
00:46:58.294 -rw-r--r-- vagrant/vagrant 796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/meson.build
00:46:58.294 -rw-r--r-- vagrant/vagrant 4857 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_regex.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 3579 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_regex.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 7002 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_regex_control.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 1132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_regex_devx.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 23621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_regex_fastpath.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_regex_utils.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 8221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_rxp.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 4059 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/regex/mlx5/mlx5_rxp.h
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/octeontx/
00:46:58.294 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/octeontx/meson.build
00:46:58.294 -rw-r--r-- vagrant/vagrant 4164 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/octeontx/otx_zip.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 10500 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/octeontx/otx_zip.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 17180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/octeontx/otx_zip_pmd.c
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/octeontx/include/
00:46:58.294 -rw-r--r-- vagrant/vagrant 22807 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/octeontx/include/zip_regs.h
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/
00:46:58.294 -rw-r--r-- vagrant/vagrant 39485 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/qat_comp.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 3743 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/qat_comp.h
00:46:58.294 -rw-r--r-- vagrant/vagrant 23060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/qat_comp_pmd.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 3452 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/qat_comp_pmd.h
00:46:58.294 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/dev/
00:46:58.294 -rw-r--r-- vagrant/vagrant 5290 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/dev/qat_comp_pmd_gen1.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 881 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/dev/qat_comp_pmd_gen2.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 881 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/dev/qat_comp_pmd_gen3.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 6418 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/dev/qat_comp_pmd_gen4.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 2399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/dev/qat_comp_pmd_gen5.c
00:46:58.294 -rw-r--r-- vagrant/vagrant 1172 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/qat/dev/qat_comp_pmd_gens.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/zlib/
00:46:58.295 -rw-r--r-- vagrant/vagrant 310 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/zlib/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 10492 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/zlib/zlib_pmd.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 7224 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/zlib/zlib_pmd_ops.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1848 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/zlib/zlib_pmd_private.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/meson.build
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/isal/
00:46:58.295 -rw-r--r-- vagrant/vagrant 21736 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/isal/isal_compress_pmd.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 10009 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/isal/isal_compress_pmd_ops.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1655 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/isal/isal_compress_pmd_private.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/isal/meson.build
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/mlx5/
00:46:58.295 -rw-r--r-- vagrant/vagrant 698 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/mlx5/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 28510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/mlx5/mlx5_compress.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 572 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/mlx5/mlx5_compress_utils.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/nitrox/
00:46:58.295 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/nitrox/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 15813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/nitrox/nitrox_comp.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/nitrox/nitrox_comp.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 31316 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/nitrox/nitrox_comp_reqmgr.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1403 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/compress/nitrox/nitrox_comp_reqmgr.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/
00:46:58.295 -rw-r--r-- vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/meson.build
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/ifc/
00:46:58.295 -rw-r--r-- vagrant/vagrant 46569 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/ifc/ifcvf_vdpa.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/ifc/meson.build
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/ifc/base/
00:46:58.295 -rw-r--r-- vagrant/vagrant 12547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/ifc/base/ifcvf.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 4788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/ifc/base/ifcvf.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 1597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/ifc/base/ifcvf_osdep.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/
00:46:58.295 -rw-r--r-- vagrant/vagrant 985 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 27876 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 14528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 8503 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa_cthread.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 18166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa_event.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 4655 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa_lm.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 12146 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa_mem.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 8289 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa_steer.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 540 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa_utils.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 27143 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/mlx5/mlx5_vdpa_virtq.c
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/nfp/
00:46:58.295 -rw-r--r-- vagrant/vagrant 327 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/nfp/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 20162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/nfp/nfp_vdpa.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 4817 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/nfp/nfp_vdpa_core.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1181 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/nfp/nfp_vdpa_core.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 248 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/nfp/nfp_vdpa_log.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 560 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/nfp/nfp_vdpa_log.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/
00:46:58.295 -rw-r--r-- vagrant/vagrant 626 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 9019 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 3413 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 509 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa_debug.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 3472 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa_filter.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 10630 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa_hw.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1681 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa_log.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 1644 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa_mcdi.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 22717 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa_ops.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1447 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/vdpa/sfc/sfc_vdpa_ops.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/
00:46:58.295 -rw-r--r-- vagrant/vagrant 129167 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/dpaa2_sec_dpseci.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/dpaa2_sec_event.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 1360 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/dpaa2_sec_logs.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 22075 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/dpaa2_sec_priv.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 27560 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/dpaa2_sec_raw_dp.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 379 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 88 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/version.map
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/mc/
00:46:58.295 -rw-r--r-- vagrant/vagrant 22563 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/mc/dpseci.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 13003 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/mc/fsl_dpseci.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 5845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa2_sec/mc/fsl_dpseci_cmd.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/
00:46:58.295 -rw-r--r-- vagrant/vagrant 591 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 20353 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/rte_cryptodev_scheduler.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 7706 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/rte_cryptodev_scheduler.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 1621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/rte_cryptodev_scheduler_operations.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 5916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/scheduler_failover.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 11350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/scheduler_multicore.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 11864 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/scheduler_pkt_size_distr.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 15162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/scheduler_pmd.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 20025 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/scheduler_pmd_ops.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 7034 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/scheduler_pmd_private.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 5359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/scheduler_roundrobin.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/scheduler/version.map
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/
00:46:58.295 -rw-r--r-- vagrant/vagrant 102307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/dpaa_sec.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 22916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/dpaa_sec.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 414 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/dpaa_sec_event.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 1327 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/dpaa_sec_log.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 25478 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/dpaa_sec_raw_dp.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 86 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/dpaa_sec/version.map
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/uadk/
00:46:58.295 -rw-r--r-- vagrant/vagrant 680 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/uadk/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 25617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/uadk/uadk_crypto_pmd.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 1905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/uadk/uadk_crypto_pmd_private.h
00:46:58.295 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/
00:46:58.295 -rw-r--r-- vagrant/vagrant 12007 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/ipsec_mb_ops.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 5505 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/ipsec_mb_private.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 13165 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/ipsec_mb_private.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 1421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/meson.build
00:46:58.295 -rw-r--r-- vagrant/vagrant 22811 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_aesni_gcm.c
00:46:58.295 -rw-r--r-- vagrant/vagrant 4322 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_aesni_gcm_priv.h
00:46:58.295 -rw-r--r-- vagrant/vagrant 76593 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_aesni_mb.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 20561 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_aesni_mb_priv.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 10970 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_chacha_poly.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_chacha_poly_priv.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 13615 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_kasumi.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1904 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_kasumi_priv.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 17070 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_snow3g.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1935 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_snow3g_priv.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 11346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_zuc.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ipsec_mb/pmd_zuc_priv.h
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/
00:46:58.296 -rw-r--r-- vagrant/vagrant 381 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/meson.build
00:46:58.296 -rw-r--r-- vagrant/vagrant 466 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_crypto_algs.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 1200 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_crypto_capabilities.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 38868 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_cryptodev.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_cryptodev.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 2966 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_logs.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 11621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_pci.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 8509 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_pci.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 3833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_ring.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 14805 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtio_rxtx.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 857 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtqueue.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 4617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/virtio/virtqueue.h
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/
00:46:58.296 -rw-r--r-- vagrant/vagrant 819 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/meson.build
00:46:58.296 -rw-r--r-- vagrant/vagrant 13887 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/mlx5_crypto.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 5140 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/mlx5_crypto.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 4453 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/mlx5_crypto_dek.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 31718 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/mlx5_crypto_gcm.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 564 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/mlx5_crypto_utils.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 19461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mlx5/mlx5_crypto_xts.c
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mvsam/
00:46:58.296 -rw-r--r-- vagrant/vagrant 536 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mvsam/meson.build
00:46:58.296 -rw-r--r-- vagrant/vagrant 2617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mvsam/mrvl_pmd_private.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 574 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mvsam/rte_mrvl_compat.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 36067 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mvsam/rte_mrvl_pmd.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 22535 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/mvsam/rte_mrvl_pmd_ops.c
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/armv8/
00:46:58.296 -rw-r--r-- vagrant/vagrant 5406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/armv8/armv8_pmd_private.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 486 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/armv8/meson.build
00:46:58.296 -rw-r--r-- vagrant/vagrant 23913 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/armv8/rte_armv8_pmd.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 7277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/armv8/rte_armv8_pmd_ops.c
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/
00:46:58.296 -rw-r--r-- vagrant/vagrant 387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/meson.build
00:46:58.296 -rw-r--r-- vagrant/vagrant 19742 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/nitrox_sym.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 304 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/nitrox_sym.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 3261 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/nitrox_sym_capabilities.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 313 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/nitrox_sym_capabilities.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 1716 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/nitrox_sym_ctx.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 21825 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/nitrox_sym_reqmgr.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 734 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/nitrox/nitrox_sym_reqmgr.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 469 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/meson.build
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/hw/
00:46:58.296 -rw-r--r-- vagrant/vagrant 21543 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/hw/bcmfs4_rm.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 19576 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/hw/bcmfs5_rm.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 2125 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/hw/bcmfs_rm_common.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/hw/bcmfs_rm_common.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 738 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_dev_msg.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 7059 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_device.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1979 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_device.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 793 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_hw_defs.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 586 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_logs.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 930 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_logs.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 8934 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_qp.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 3879 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_qp.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 7666 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 13368 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_capabilities.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_capabilities.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 1118 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_defs.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 32422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_engine.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 3349 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_engine.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 11504 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_pmd.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 961 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_pmd.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 1464 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_req.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 6522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_session.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 2284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_sym_session.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 2129 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_vfio.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/bcmfs_vfio.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 566 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/bcmfs/meson.build
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/null/
00:46:58.296 -rw-r--r-- vagrant/vagrant 286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/null/meson.build
00:46:58.296 -rw-r--r-- vagrant/vagrant 6312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/null/null_crypto_pmd.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 7390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/null/null_crypto_pmd_ops.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 1440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/null/null_crypto_pmd_private.h
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/
00:46:58.296 -rw-r--r-- vagrant/vagrant 68140 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 5297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_capabilities.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_capabilities.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 6769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_config.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 8264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_desc.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 9689 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_hw.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 15070 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_hw_specific.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 1242 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_log.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 8428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_pvt.h
00:46:58.296 -rw-r--r-- vagrant/vagrant 15080 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/caam_jr_uio.c
00:46:58.296 -rw-r--r-- vagrant/vagrant 489 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/caam_jr/meson.build
00:46:58.296 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/
00:46:58.296 -rw-r--r-- vagrant/vagrant 863 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/meson.build
00:46:58.296 -rw-r--r-- vagrant/vagrant 2914 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 403 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 12286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_capabilities.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 380 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_capabilities.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 17104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_hw_access.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 9283 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_hw_access.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 4486 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_mbox.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 2148 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_mbox.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 27696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_ops.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 574 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/otx_cryptodev_ops.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 119 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/octeontx/version.map
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/
00:46:58.297 -rw-r--r-- vagrant/vagrant 84039 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/ccp_crypto.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 8936 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/ccp_crypto.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 16219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/ccp_dev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 11881 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/ccp_dev.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 30985 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/ccp_pmd_ops.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 3203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/ccp_pmd_private.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 536 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/meson.build
00:46:58.297 -rw-r--r-- vagrant/vagrant 8308 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/ccp/rte_ccp_pmd.c
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/openssl/
00:46:58.297 -rw-r--r-- vagrant/vagrant 3994 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/openssl/compat.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/openssl/meson.build
00:46:58.297 -rw-r--r-- vagrant/vagrant 5455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/openssl/openssl_pmd_private.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 82382 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/openssl/rte_openssl_pmd.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 34542 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/openssl/rte_openssl_pmd_ops.c
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/
00:46:58.297 -rw-r--r-- vagrant/vagrant 3825 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_cryptodev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 306 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_cryptodev.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 361 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_cryptodev_event_dp.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 55798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_cryptodev_ops.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 1060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_cryptodev_ops.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 3814 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_cryptodev_sec.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 1529 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_cryptodev_sec.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 10028 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_ipsec.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 1209 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_ipsec.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 8801 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_ipsec_la_ops.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 27214 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_tls.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 1037 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_tls.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 10706 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn10k_tls_ops.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 3790 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn9k_cryptodev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn9k_cryptodev.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 19006 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn9k_cryptodev_ops.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 2018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn9k_cryptodev_ops.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 8266 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn9k_ipsec.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 1336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn9k_ipsec.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 7183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cn9k_ipsec_la_ops.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 38109 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_ae.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 1716 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 1502 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 47384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev_capabilities.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 683 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev_capabilities.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 2039 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev_devargs.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 24139 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev_ops.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 5752 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev_ops.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 960 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev_sec.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_cryptodev_sec.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 5420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_ipsec.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 98982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_se.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 7182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/cnxk_sg.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 1012 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/meson.build
00:46:58.297 -rw-r--r-- vagrant/vagrant 1324 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/rte_pmd_cnxk_crypto.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/cnxk/version.map
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/
00:46:58.297 -rw-r--r-- vagrant/vagrant 3070 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_asym_pmd_gen1.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 11899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_crypto_pmd_gen2.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 27294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_crypto_pmd_gen3.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 14180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_crypto_pmd_gen4.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 9341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_crypto_pmd_gen5.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 9556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_crypto_pmd_gen_lce.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 32826 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_crypto_pmd_gens.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 38981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/dev/qat_sym_pmd_gen1.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 48153 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_asym.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 3158 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_asym.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 5311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_crypto.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 2872 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_crypto.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 8365 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_ec.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 12540 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_sym.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 14100 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_sym.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 98745 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_sym_session.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 6362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/crypto/qat/qat_sym_session.h
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/
00:46:58.297 -rwxr-xr-x vagrant/vagrant 5361 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/dpdk_idxd_cfg.py
00:46:58.297 -rw-r--r-- vagrant/vagrant 9659 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/idxd_bus.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 19662 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/idxd_common.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 3829 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/idxd_hw_defs.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 3646 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/idxd_internal.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 12521 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/idxd_pci.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 591 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/idxd/meson.build
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/ioat/
00:46:58.297 -rw-r--r-- vagrant/vagrant 22496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/ioat/ioat_dmadev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 6829 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/ioat/ioat_hw_defs.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 1416 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/ioat/ioat_internal.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 210 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/ioat/meson.build
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/skeleton/
00:46:58.297 -rw-r--r-- vagrant/vagrant 205 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/skeleton/meson.build
00:46:58.297 -rw-r--r-- vagrant/vagrant 17519 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/skeleton/skeleton_dmadev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 2257 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/skeleton/skeleton_dmadev.h
00:46:58.297 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/cnxk/
00:46:58.297 -rw-r--r-- vagrant/vagrant 728 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/cnxk/cnxk_dma_event_dp.h
00:46:58.297 -rw-r--r-- vagrant/vagrant 16554 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/cnxk/cnxk_dmadev.c
00:46:58.297 -rw-r--r-- vagrant/vagrant 4251 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/cnxk/cnxk_dmadev.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 19562 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/cnxk/cnxk_dmadev_fp.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/cnxk/meson.build
00:46:58.298 -rw-r--r-- vagrant/vagrant 150 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/cnxk/version.map
00:46:58.298 -rw-r--r-- vagrant/vagrant 234 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/meson.build
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa/
00:46:58.298 -rw-r--r-- vagrant/vagrant 27885 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa/dpaa_qdma.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 5121 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa/dpaa_qdma.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 1220 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa/dpaa_qdma_logs.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 287 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa/meson.build
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa2/
00:46:58.298 -rw-r--r-- vagrant/vagrant 45166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa2/dpaa2_qdma.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 6719 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa2/dpaa2_qdma.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 1324 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa2/dpaa2_qdma_logs.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa2/meson.build
00:46:58.298 -rw-r--r-- vagrant/vagrant 4660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa2/rte_pmd_dpaa2_qdma.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/dpaa2/version.map
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/hisilicon/
00:46:58.298 -rw-r--r-- vagrant/vagrant 27016 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/hisilicon/hisi_dmadev.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 8944 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/hisilicon/hisi_dmadev.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/dma/hisilicon/meson.build
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/skeleton/
00:46:58.298 -rw-r--r-- vagrant/vagrant 179 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/skeleton/meson.build
00:46:58.298 -rw-r--r-- vagrant/vagrant 11678 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/skeleton/skeleton_eventdev.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 1147 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/skeleton/skeleton_eventdev.h
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/
00:46:58.298 -rw-r--r-- vagrant/vagrant 2227 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/event_ring.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 4277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/iq_chunk.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 299 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/meson.build
00:46:58.298 -rw-r--r-- vagrant/vagrant 29267 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/sw_evdev.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 10101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/sw_evdev.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 542 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/sw_evdev_log.h
00:46:58.298 -rw-r--r-- vagrant/vagrant 16141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/sw_evdev_scheduler.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 91028 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/sw_evdev_selftest.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 5213 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/sw_evdev_worker.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 17630 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/sw/sw_evdev_xstats.c
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/
00:46:58.298 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/
00:46:58.298 -rw-r--r-- vagrant/vagrant 417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 546 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15_tmo.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15_tmo_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 432 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15_tmo_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 634 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_0_15_tmo_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 549 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 555 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127_tmo.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127_tmo_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127_tmo_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 555 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_112_127_tmo_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31_tmo.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31_tmo_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31_tmo_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_16_31_tmo_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47_tmo.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 576 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47_tmo_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47_tmo_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_32_47_tmo_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63_tmo.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63_tmo_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63_tmo_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_48_63_tmo_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79_tmo.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 551 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79_tmo_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79_tmo_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_64_79_tmo_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95_tmo.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 551 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95_tmo_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95_tmo_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 553 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_80_95_tmo_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 418 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 426 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111_seg.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 554 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111_seg_burst.c
00:46:58.298 -rw-r--r-- vagrant/vagrant 426 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111_tmo.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111_tmo_burst.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111_tmo_seg.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 554 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn10k/deq_96_111_tmo_seg_burst.c
00:46:58.299 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/
00:46:58.299 -rw-r--r-- vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 383 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_burst.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual_burst.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual_seg.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual_seg_burst.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual_tmo.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual_tmo_burst.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual_tmo_seg.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_dual_tmo_seg_burst.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_seg.c
00:46:58.299 -rw-r--r-- vagrant/vagrant 391 2024-06-07 12:49
spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 270 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 391 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_0_15_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 257 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 386 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 391 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 395 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_dual_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_112_127_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_dual_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_16_31_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_dual_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_32_47_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual_tmo.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual_tmo_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual_tmo_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_dual_tmo_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_seg.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_seg_burst.c 00:46:58.299 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_tmo.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_tmo_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_tmo_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_48_63_tmo_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual_tmo.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual_tmo_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual_tmo_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_dual_tmo_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_tmo.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_tmo_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_tmo_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_64_79_tmo_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual_tmo.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual_tmo_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual_tmo_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_dual_tmo_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_tmo.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_tmo_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_tmo_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_80_95_tmo_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 256 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 385 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 266 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 390 2024-06-07 12:49
spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 398 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual_tmo.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual_tmo_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual_tmo_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 398 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_dual_tmo_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_seg_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_tmo.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_tmo_burst.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_tmo_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/deq/cn9k/deq_96_111_tmo_seg_burst.c 00:46:58.300 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/ 00:46:58.300 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/ 00:46:58.300 -rw-r--r-- vagrant/vagrant 281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_0_15.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_0_15_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_112_127.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_112_127_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_16_31.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_16_31_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_32_47.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_32_47_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_48_63.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_48_63_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_64_79.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_64_79_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_80_95.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_80_95_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 283 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_96_111.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn10k/tx_96_111_seg.c 00:46:58.300 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/ 00:46:58.300 -rw-r--r-- vagrant/vagrant 218 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_0_15.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 283 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_0_15_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_0_15_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_0_15_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_112_127.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_112_127_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_112_127_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_112_127_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_16_31.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_16_31_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 292 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_16_31_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_16_31_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_32_47.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_32_47_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 292 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_32_47_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_32_47_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_48_63.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_48_63_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 292 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_48_63_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_48_63_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_64_79.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_64_79_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 292 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_64_79_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_64_79_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_80_95.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_80_95_dual.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 292 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_80_95_dual_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_80_95_seg.c 00:46:58.300 -rw-r--r-- vagrant/vagrant 220 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_96_111.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 285 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_96_111_dual.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_96_111_dual_seg.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 283 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/tx/cn9k/tx_96_111_seg.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 35458 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn10k_eventdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 757 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn10k_eventdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 8674 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn10k_tx_worker.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 11781 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn10k_worker.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 20534 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn10k_worker.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 34359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn9k_eventdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 3508 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn9k_worker.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 32188 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cn9k_worker.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 18528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_eventdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 9665 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_eventdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 21740 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_eventdev_adptr.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 1182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_eventdev_dp.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 42021 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_eventdev_selftest.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 6611 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_eventdev_stats.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 15978 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_tim_evdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 9585 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_tim_evdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 5878 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_tim_worker.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 16414 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_tim_worker.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 880 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_worker.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 2132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/cnxk_worker.h 00:46:58.301 
-rw-r--r-- vagrant/vagrant 11117 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/meson.build 00:46:58.301 -rw-r--r-- vagrant/vagrant 987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/rte_pmd_cnxk_eventdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 137 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/cnxk/version.map 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 134956 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 8620 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_avx512.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 2775 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_iface.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 2957 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_iface.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 739 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_inline_fns.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 748 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_log.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 20840 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_priv.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 37137 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_selftest.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 7041 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_sse.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 23014 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_user.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 33695 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/dlb2_xstats.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 1995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/meson.build 00:46:58.301 -rw-r--r-- vagrant/vagrant 931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/rte_pmd_dlb2.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/rte_pmd_dlb2.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 105 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/version.map 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 15487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/dlb2_main.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 3306 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/dlb2_main.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 21435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/dlb2_pf.c 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 9421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_hw_types.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 6028 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_osdep.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 9601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_osdep_bitmap.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 3381 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_osdep_list.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 641 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_osdep_types.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 179753 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_regs.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 182822 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_resource.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 67059 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dlb2/pf/base/dlb2_resource.h 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 26244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa/dpaa_eventdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 2873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa/dpaa_eventdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa/meson.build 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa2/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 30462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa2/dpaa2_eventdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 2688 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa2/dpaa2_eventdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 1048 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa2/dpaa2_eventdev_logs.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 20654 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa2/dpaa2_eventdev_selftest.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 2773 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa2/dpaa2_hw_dpcon.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 367 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dpaa2/meson.build 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dsw/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 11907 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dsw/dsw_evdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 10098 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dsw/dsw_evdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 40544 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dsw/dsw_event.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 1194 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dsw/dsw_sort.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 9574 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dsw/dsw_xstats.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/dsw/meson.build 00:46:58.301 -rw-r--r-- vagrant/vagrant 443 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/meson.build 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/meson.build 00:46:58.301 -rw-r--r-- vagrant/vagrant 24535 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/ssovf_evdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 6471 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/ssovf_evdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 40373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/ssovf_evdev_selftest.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 6479 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/ssovf_probe.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 13435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/ssovf_worker.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 7165 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/ssovf_worker.h 00:46:58.301 -rw-r--r-- 
vagrant/vagrant 11756 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/timvf_evdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 6831 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/timvf_evdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 3343 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/timvf_probe.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 4696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/timvf_worker.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 11203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/octeontx/timvf_worker.h 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/meson.build 00:46:58.301 -rw-r--r-- vagrant/vagrant 17117 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_evdev.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 7875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_evdev.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 20462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_evdev_init.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 3705 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_evdev_xstats.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_log.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 33191 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_ring.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 19417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_ring.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 22803 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/event/opdl/opdl_test.c 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/ 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/cuda/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/cuda/common.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 36903 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/cuda/cuda.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 2336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/cuda/devices.h 00:46:58.301 -rw-r--r-- vagrant/vagrant 3394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/cuda/gdrcopy.c 00:46:58.301 -rw-r--r-- vagrant/vagrant 635 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/cuda/meson.build 00:46:58.301 -rw-r--r-- vagrant/vagrant 140 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/gpu/meson.build 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/ 00:46:58.301 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/bucket/ 00:46:58.301 -rw-r--r-- vagrant/vagrant 400 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/bucket/meson.build 00:46:58.301 -rw-r--r-- vagrant/vagrant 17099 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/bucket/rte_mempool_bucket.c 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 7190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/cn10k_hwpool_ops.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 9973 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/cn10k_mempool_ops.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 2326 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/cn9k_mempool_ops.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 5376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/cnxk_mempool.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 2282 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/cnxk_mempool.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 5073 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/cnxk_mempool_ops.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 1244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/cnxk_mempool_telemetry.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 558 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 1093 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/rte_pmd_cnxk_mempool.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/cnxk/version.map 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 9339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa/dpaa_mempool.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 1946 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa/dpaa_mempool.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 197 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 76 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa/version.map 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa2/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 12141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa2/dpaa2_hw_mempool.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 1834 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa2/dpaa2_hw_mempool.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 1293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa2/dpaa2_hw_mempool_logs.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 242 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa2/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 1226 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa2/rte_dpaa2_mempool.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 195 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/dpaa2/version.map 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/octeontx/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/octeontx/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 18232 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/octeontx/octeontx_fpavf.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 3595 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/octeontx/octeontx_fpavf.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 643 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/octeontx/octeontx_pool_logs.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 4883 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/octeontx/rte_mempool_octeontx.c 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/ring/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 146 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/ring/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 5148 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/ring/rte_mempool_ring.c 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/drivers/mempool/stack/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 171 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/stack/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 1895 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/stack/rte_mempool_stack.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 261 2024-06-07 12:49 spdk-test_gen_spec/dpdk/drivers/mempool/meson.build 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/ 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/.devcontainer/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 1315 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/.devcontainer/devcontainer.json 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 545 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/__init__.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 5230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/exception.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 6934 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/logger.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 29884 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/runner.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 8833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/settings.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 21422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/test_result.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 14727 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/test_suite.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 7515 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/utils.py 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/config/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 19987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/config/__init__.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 11427 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/config/conf_yaml_schema.json 00:46:58.302 -rw-r--r-- vagrant/vagrant 2175 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/config/types.py 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 2015 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/__init__.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 4873 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/interactive_remote_session.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 6434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/interactive_shell.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 845 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/python_shell.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 8968 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/remote_session.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 4065 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/ssh_session.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 8265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/remote_session/testpmd_shell.py 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 1128 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/__init__.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 14798 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/dts/framework/testbed_model/cpu.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 8333 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/linux_session.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 12016 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/node.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 14774 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/os_session.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 2816 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/port.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 12049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/posix_session.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 18905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/sut_node.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 2954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/tg_node.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 666 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/virtual_device.py 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/traffic_generator/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 1837 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/traffic_generator/__init__.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 5841 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 12322 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/traffic_generator/scapy.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 2893 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/framework/testbed_model/traffic_generator/traffic_generator.py 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/tests/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 2294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/tests/TestSuite_hello_world.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 1636 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/tests/TestSuite_os_udp.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 5899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/tests/TestSuite_pmd_buffer_scatter.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 5710 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/tests/TestSuite_smoke_tests.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 1173 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/Dockerfile 00:46:58.302 -rw-r--r-- vagrant/vagrant 3405 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/README.md 00:46:58.302 -rw-r--r-- vagrant/vagrant 2941 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/conf.yaml 00:46:58.302 -rwxr-xr-x vagrant/vagrant 991 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/main.py 00:46:58.302 -rw-r--r-- vagrant/vagrant 63706 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/poetry.lock 00:46:58.302 -rw-r--r-- vagrant/vagrant 1407 2024-06-07 12:49 spdk-test_gen_spec/dpdk/dts/pyproject.toml 00:46:58.302 -rw-r--r-- vagrant/vagrant 506 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.editorconfig 00:46:58.302 -rw-r--r-- vagrant/vagrant 29 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.git 00:46:58.302 -rw-r--r-- vagrant/vagrant 48 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.gitattributes 00:46:58.302 -rw-r--r-- vagrant/vagrant 1373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.gitignore 00:46:58.302 -rw-r--r-- vagrant/vagrant 72346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.mailmap 00:46:58.302 -rw-r--r-- vagrant/vagrant 5 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/ABI_VERSION 00:46:58.302 -rw-r--r-- vagrant/vagrant 50702 2024-06-07 12:49 spdk-test_gen_spec/dpdk/MAINTAINERS 00:46:58.302 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/Makefile 00:46:58.302 -rw-r--r-- vagrant/vagrant 510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/README 00:46:58.302 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/dpdk/VERSION 00:46:58.302 -rw-r--r-- vagrant/vagrant 5364 2024-06-07 12:49 spdk-test_gen_spec/dpdk/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 4641 2024-06-07 12:49 spdk-test_gen_spec/dpdk/meson_options.txt 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/ 00:46:58.302 -rw-r--r-- vagrant/vagrant 1534 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/Makefile 00:46:58.302 -rw-r--r-- vagrant/vagrant 6342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/commands.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 199 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/commands.h 00:46:58.302 -rw-r--r-- vagrant/vagrant 1081 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/main.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/meson.build 00:46:58.302 -rw-r--r-- vagrant/vagrant 2394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/parse_obj_list.c 00:46:58.302 -rw-r--r-- vagrant/vagrant 1637 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/cmdline/parse_obj_list.h 00:46:58.302 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/ 00:46:58.303 -rw-r--r-- vagrant/vagrant 1752 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/Makefile 00:46:58.303 -rw-r--r-- vagrant/vagrant 8081 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/action.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 1929 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/action.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 155423 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/cli.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/cli.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/common.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 6176 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/conn.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 850 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/conn.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 3387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/cryptodev.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/cryptodev.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 5255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/link.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 1100 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/link.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 4838 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/main.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 1523 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/mempool.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/mempool.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 669 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/examples/ip_pipeline/meson.build 00:46:58.303 -rw-r--r-- vagrant/vagrant 8705 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/parser.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 1472 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/parser.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 22979 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/pipeline.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 8209 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/pipeline.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 1180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/swq.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 561 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/swq.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 1641 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/tap.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 421 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/tap.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 68479 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/thread.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/thread.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 5024 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/tmgr.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 1507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/tmgr.h 00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/ 00:46:58.303 -rw-r--r-- vagrant/vagrant 2507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/firewall.cli 00:46:58.303 -rw-r--r-- vagrant/vagrant 2559 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/flow.cli 00:46:58.303 -rw-r--r-- vagrant/vagrant 2415 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/flow_crypto.cli 00:46:58.303 -rw-r--r-- vagrant/vagrant 2015 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/l2fwd.cli 00:46:58.303 -rw-r--r-- vagrant/vagrant 2587 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/route.cli 00:46:58.303 -rw-r--r-- vagrant/vagrant 2728 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/route_ecmp.cli 00:46:58.303 -rw-r--r-- vagrant/vagrant 5788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/rss.cli 00:46:58.303 -rw-r--r-- vagrant/vagrant 2685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/tap.cli 00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd/ 00:46:58.303 -rw-r--r-- vagrant/vagrant 1579 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd/Makefile 00:46:58.303 -rw-r--r-- vagrant/vagrant 23541 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd/main.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd/meson.build 00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_meter/ 00:46:58.303 -rw-r--r-- vagrant/vagrant 1522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_meter/Makefile 00:46:58.303 -rw-r--r-- vagrant/vagrant 11768 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_meter/main.c 00:46:58.303 -rw-r--r-- vagrant/vagrant 1918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_meter/main.h 00:46:58.303 -rw-r--r-- vagrant/vagrant 367 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_meter/meson.build 
00:46:58.303 -rw-r--r-- vagrant/vagrant 643 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_meter/rte_policer.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 915 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_meter/rte_policer.h
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_crypto/
00:46:58.303 -rw-r--r-- vagrant/vagrant 1510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_crypto/Makefile
00:46:58.303 -rw-r--r-- vagrant/vagrant 13959 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_crypto/main.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_crypto/meson.build
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/
00:46:58.303 -rw-r--r-- vagrant/vagrant 3189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/pkt_group.h
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/altivec/
00:46:58.303 -rw-r--r-- vagrant/vagrant 1422 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/altivec/port_group.h
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/neon/
00:46:58.303 -rw-r--r-- vagrant/vagrant 1208 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/neon/port_group.h
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/sse/
00:46:58.303 -rw-r--r-- vagrant/vagrant 1098 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/common/sse/port_group.h
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_reassembly/
00:46:58.303 -rw-r--r-- vagrant/vagrant 1514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_reassembly/Makefile
00:46:58.303 -rw-r--r-- vagrant/vagrant 29690 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_reassembly/main.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 353 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_reassembly/meson.build
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-graph/
00:46:58.303 -rw-r--r-- vagrant/vagrant 1514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-graph/Makefile
00:46:58.303 -rw-r--r-- vagrant/vagrant 38465 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-graph/main.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 386 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-graph/meson.build
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/
00:46:58.303 -rw-r--r-- vagrant/vagrant 1564 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/Makefile
00:46:58.303 -rw-r--r-- vagrant/vagrant 4915 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/app_thread.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 10949 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/args.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 12926 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/cfg_file.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/cfg_file.h
00:46:58.303 -rw-r--r-- vagrant/vagrant 23368 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/cmdline.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 11400 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/init.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 5200 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/main.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 4061 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/main.h
00:46:58.303 -rw-r--r-- vagrant/vagrant 477 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/meson.build
00:46:58.303 -rw-r--r-- vagrant/vagrant 2805 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/profile.cfg
00:46:58.303 -rw-r--r-- vagrant/vagrant 3501 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/profile_ov.cfg
00:46:58.303 -rw-r--r-- vagrant/vagrant 4054 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/profile_pie.cfg
00:46:58.303 -rw-r--r-- vagrant/vagrant 4352 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/profile_red.cfg
00:46:58.303 -rw-r--r-- vagrant/vagrant 9865 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/qos_sched/stats.c
00:46:58.303 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/
00:46:58.303 -rw-r--r-- vagrant/vagrant 2161 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/Makefile
00:46:58.303 -rw-r--r-- vagrant/vagrant 27798 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_manager.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 8171 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_manager.h
00:46:58.303 -rw-r--r-- vagrant/vagrant 28201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_monitor.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 1740 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_monitor.h
00:46:58.303 -rw-r--r-- vagrant/vagrant 10666 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/main.c
00:46:58.303 -rw-r--r-- vagrant/vagrant 1122 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/meson.build
00:46:58.303 -rw-r--r-- vagrant/vagrant 1051 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/oob_monitor.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 468 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/oob_monitor_nop.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 6766 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/oob_monitor_x86.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 2412 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/parse.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 334 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/parse.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 5487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/power_manager.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 5260 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/power_manager.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 15987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/vm_power_cli.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/vm_power_cli.h
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/
00:46:58.304 -rw-r--r-- vagrant/vagrant 1546 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/Makefile
00:46:58.304 -rw-r--r-- vagrant/vagrant 4511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/main.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 494 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/meson.build
00:46:58.304 -rw-r--r-- vagrant/vagrant 1752 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/parse.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 268 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/parse.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 14081 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 432 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/distributor/
00:46:58.304 -rw-r--r-- vagrant/vagrant 1514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/distributor/Makefile
00:46:58.304 -rw-r--r-- vagrant/vagrant 26719 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/distributor/main.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/distributor/meson.build
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/
00:46:58.304 -rw-r--r-- vagrant/vagrant 933 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/bypass_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 6186 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 1343 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 940 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/data_rxtx.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 4361 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/linux_test.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2656 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/load_env.sh
00:46:58.304 -rwxr-xr-x vagrant/vagrant 3484 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/pkttest.py
00:46:58.304 -rwxr-xr-x vagrant/vagrant 1281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/pkttest.sh
00:46:58.304 -rwxr-xr-x vagrant/vagrant 5372 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/run_test.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2256 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2111 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2135 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2159 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 1758 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 2049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh
00:46:58.304 -rwxr-xr-x vagrant/vagrant 6258 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py
00:46:58.304 -rw-r--r-- vagrant/vagrant 4475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 4793 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 4291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 4552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 4387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 4776 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 3690 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh
00:46:58.304 -rw-r--r-- vagrant/vagrant 4387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh
00:46:58.304 -rwxr-xr-x vagrant/vagrant 18610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py
00:46:58.304 -rw-r--r-- vagrant/vagrant 1809 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/Makefile
00:46:58.304 -rw-r--r-- vagrant/vagrant 8542 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ep0.cfg
00:46:58.304 -rw-r--r-- vagrant/vagrant 8068 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ep1.cfg
00:46:58.304 -rw-r--r-- vagrant/vagrant 13358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/esp.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 571 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/esp.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 53063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/event_helper.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 8528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/event_helper.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 11340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/flow.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/flow.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 4525 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipip.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 85746 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec-secgw.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 5452 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec-secgw.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 29093 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 11016 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 5231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 7748 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_neon.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 8433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_process.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 46170 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_worker.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 16057 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_worker.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 657 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/meson.build
00:46:58.304 -rw-r--r-- vagrant/vagrant 12121 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/parser.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 2068 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/parser.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 5441 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/rt.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 47186 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sa.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 3597 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sad.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 4599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sad.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 15401 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sp4.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 19119 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sp6.c
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-power/
00:46:58.304 -rw-r--r-- vagrant/vagrant 1522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-power/Makefile
00:46:58.304 -rw-r--r-- vagrant/vagrant 76355 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-power/main.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-power/main.h
00:46:58.304 -rw-r--r-- vagrant/vagrant 415 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-power/meson.build
00:46:58.304 -rw-r--r-- vagrant/vagrant 5184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-power/perf_core.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd-power/perf_core.h
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/rxtx_callbacks/
00:46:58.304 -rw-r--r-- vagrant/vagrant 1513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/rxtx_callbacks/Makefile
00:46:58.304 -rw-r--r-- vagrant/vagrant 8955 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/rxtx_callbacks/main.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 363 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/rxtx_callbacks/meson.build
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq/
00:46:58.304 -rw-r--r-- vagrant/vagrant 1507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq/Makefile
00:46:58.304 -rw-r--r-- vagrant/vagrant 17527 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq/main.c
00:46:58.304 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq/meson.build
00:46:58.304 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/dma/
00:46:58.304 -rw-r--r-- vagrant/vagrant 1502 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/dma/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 30716 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/dma/dmafwd.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 366 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/dma/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipv4_multicast/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1515 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipv4_multicast/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 22177 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipv4_multicast/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ipv4_multicast/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1683 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/em_default_v4.cfg
00:46:58.305 -rw-r--r-- vagrant/vagrant 1526 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/em_default_v6.cfg
00:46:58.305 -rw-r--r-- vagrant/vagrant 6671 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/em_route_parse.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 8184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 27671 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_acl.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 1922 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_acl.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 2827 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_acl_scalar.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 6414 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_altivec.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 3201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_common.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 28973 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_em.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 5681 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_em.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 9440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_em_hlm.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 1275 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 1336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 4045 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_em_sequential.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 9653 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_event.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 4032 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_event.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 10037 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_event_generic.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 9626 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_event_internal_port.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 20696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_fib.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 18987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_lpm.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 2739 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_lpm.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 4409 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 4735 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_lpm_neon.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 3979 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_lpm_sse.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 5826 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_neon.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 2567 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_route.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 5695 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/l3fwd_sse.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 334 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/lpm_default_v4.cfg
00:46:58.305 -rw-r--r-- vagrant/vagrant 792 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/lpm_default_v6.cfg
00:46:58.305 -rw-r--r-- vagrant/vagrant 6105 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/lpm_route_parse.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 46298 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 618 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l3fwd/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_node/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1526 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_node/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 378 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_node/meson.build
00:46:58.305 -rw-r--r-- vagrant/vagrant 9375 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_node/node.c
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1542 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 3601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/args.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 218 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/args.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 9018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/init.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 1060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/init.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 8829 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 395 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/efd_server/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/shared/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1904 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/shared/common.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/server_node_efd/Makefile
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq_dcb/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq_dcb/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 18639 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq_dcb/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vmdq_dcb/meson.build
00:46:58.305 -rw-r--r-- vagrant/vagrant 3645 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/
00:46:58.305 -rw-r--r-- vagrant/vagrant 252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 591 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/ethtool-app/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1631 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/ethtool-app/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 25131 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/ethtool-app/ethapp.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 309 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/ethtool-app/ethapp.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 7625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/ethtool-app/main.c
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/lib/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1228 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/lib/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 9750 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/lib/rte_ethtool.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 12959 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ethtool/lib/rte_ethtool.h
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-cat/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1557 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-cat/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 21645 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-cat/cat.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 724 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-cat/cat.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 5278 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-cat/l2fwd-cat.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-cat/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/link_status_interrupt/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1520 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/link_status_interrupt/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 19818 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/link_status_interrupt/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/link_status_interrupt/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/service_cores/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1507 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/service_cores/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 6525 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/service_cores/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/service_cores/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/eventdev_pipeline/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/eventdev_pipeline/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 12270 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/eventdev_pipeline/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 414 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/eventdev_pipeline/meson.build
00:46:58.305 -rw-r--r-- vagrant/vagrant 4189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/eventdev_pipeline/pipeline_common.h
00:46:58.305 -rw-r--r-- vagrant/vagrant 16093 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 25252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-crypto/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1683 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-crypto/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 78014 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-crypto/main.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-crypto/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/
00:46:58.305 -rw-r--r-- vagrant/vagrant 234 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/Makefile
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_client/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1533 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile
00:46:58.305 -rw-r--r-- vagrant/vagrant 6280 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_client/client.c
00:46:58.305 -rw-r--r-- vagrant/vagrant 380 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_client/meson.build
00:46:58.305 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/
00:46:58.305 -rw-r--r-- vagrant/vagrant 1545 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 2975 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/args.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 199 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/args.h
00:46:58.306 -rw-r--r-- vagrant/vagrant 7489 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/init.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 1012 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/init.h
00:46:58.306 -rw-r--r-- vagrant/vagrant 7770 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 414 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/mp_server/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/shared/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1706 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/client_server_mp/shared/common.h
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/hotplug_mp/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1690 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/hotplug_mp/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 2104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/hotplug_mp/commands.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 289 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/hotplug_mp/commands.list
00:46:58.306 -rw-r--r-- vagrant/vagrant 750 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/hotplug_mp/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/hotplug_mp/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/simple_mp/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/simple_mp/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 217 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/simple_mp/commands.list
00:46:58.306 -rw-r--r-- vagrant/vagrant 3478 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/simple_mp/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 586 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/simple_mp/meson.build
00:46:58.306 -rw-r--r-- vagrant/vagrant 1169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/simple_mp/mp_commands.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 315 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/simple_mp/mp_commands.h
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/symmetric_mp/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/symmetric_mp/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 13247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/symmetric_mp/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/symmetric_mp/meson.build
00:46:58.306 -rw-r--r-- vagrant/vagrant 265 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/multi_process/Makefile
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/skeleton/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/skeleton/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 5962 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/skeleton/basicfwd.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 330 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/skeleton/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/
00:46:58.306 -rw-r--r-- vagrant/vagrant 2049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 43386 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_dev_self_test.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 543 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_dev_self_test.h
00:46:58.306 -rw-r--r-- vagrant/vagrant 17290 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 9321 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation.h
00:46:58.306 -rw-r--r-- vagrant/vagrant 10813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_aes.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 10391 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_ccm.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 4916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_cmac.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 11670 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_ecdsa.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 10369 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_gcm.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 5180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_hmac.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 15372 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_rsa.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 9446 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_sha.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 15874 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_tdes.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 6948 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/fips_validation_xts.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 69075 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 946 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/fips_validation/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1658 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 4562 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_common.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 3614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_common.h
00:46:58.306 -rw-r--r-- vagrant/vagrant 16323 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_event.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 1876 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_event.h
00:46:58.306 -rw-r--r-- vagrant/vagrant 11269 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_event_generic.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 9830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 4895 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_poll.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/l2fwd_poll.h
00:46:58.306 -rw-r--r-- vagrant/vagrant 20751 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 503 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-event/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ntb/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1727 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ntb/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 519 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ntb/commands.list
00:46:58.306 -rw-r--r-- vagrant/vagrant 677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ntb/meson.build
00:46:58.306 -rw-r--r-- vagrant/vagrant 34600 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ntb/ntb_fwd.c
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/timer/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1504 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/timer/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 3271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/timer/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/timer/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bbdev_app/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1499 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bbdev_app/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 32132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bbdev_app/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bbdev_app/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/flow_filtering/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1456 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/flow_filtering/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 3388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/flow_filtering/flow_blocks.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 7513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/flow_filtering/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/flow_filtering/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-jobstats/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1513 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-jobstats/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 28095 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-jobstats/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-jobstats/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/packet_ordering/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/packet_ordering/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 20488 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/packet_ordering/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/packet_ordering/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vdpa/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1666 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vdpa/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 395 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vdpa/commands.list
00:46:58.306 -rw-r--r-- vagrant/vagrant 12602 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vdpa/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vdpa/meson.build
00:46:58.306 -rw-r--r-- vagrant/vagrant 2091 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vdpa/vdpa_blk_compact.h
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bond/
00:46:58.306 -rw-r--r-- vagrant/vagrant 1704 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bond/Makefile
00:46:58.306 -rw-r--r-- vagrant/vagrant 350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bond/commands.list
00:46:58.306 -rw-r--r-- vagrant/vagrant 21792 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bond/main.c
00:46:58.306 -rw-r--r-- vagrant/vagrant 568 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bond/meson.build
00:46:58.306 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/helloworld/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1509 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/helloworld/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 1253 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/helloworld/main.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/helloworld/meson.build
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/ka-agent/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 2942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/ka-agent/main.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 1546 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 20652 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/main.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 488 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/meson.build
00:46:58.307 -rw-r--r-- vagrant/vagrant 3039 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/shm.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 1792 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-keepalive/shm.h
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1593 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 92020 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/cli.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 343 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/cli.h
00:46:58.307 -rw-r--r-- vagrant/vagrant 6263 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/conn.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 898 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/conn.h
00:46:58.307 -rw-r--r-- vagrant/vagrant 3434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/main.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 505 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/meson.build
00:46:58.307 -rw-r--r-- vagrant/vagrant 5352 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/obj.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 826 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/obj.h
00:46:58.307 -rw-r--r-- vagrant/vagrant 10041 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/thread.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 626 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/thread.h
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/
00:46:58.307 -rw-r--r-- vagrant/vagrant 673 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/ethdev.io
00:46:58.307 -rw-r--r-- vagrant/vagrant 2197 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/fib.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 3830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/fib.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1196 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt
00:46:58.307 -rw-r--r-- vagrant/vagrant 2226 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/fib_nexthop_table.txt
00:46:58.307 -rw-r--r-- vagrant/vagrant 1522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/fib_routing_table.txt
00:46:58.307 -rw-r--r-- vagrant/vagrant 1083 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/hash_func.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 2291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/hash_func.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1891 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/ipsec.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 469 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/ipsec.io
00:46:58.307 -rw-r--r-- vagrant/vagrant 1985 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/ipsec.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 18867 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/ipsec_sa.txt
00:46:58.307 -rw-r--r-- vagrant/vagrant 1113 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/ipv6_addr_swap.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 1911 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/ipv6_addr_swap.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1059 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/l2fwd.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 485 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/l2fwd.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/l2fwd_macswp.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 776 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/l2fwd_macswp.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 848 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 806 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/l2fwd_pcap.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 1071 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/learner.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 3062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/learner.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/meter.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 1243 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/meter.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1464 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/mirroring.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 1861 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/mirroring.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 3681 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/packet.txt
00:46:58.307 -rw-r--r-- vagrant/vagrant 817 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/pcap.io
00:46:58.307 -rw-r--r-- vagrant/vagrant 1107 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/recirculation.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 1815 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/recirculation.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1083 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/registers.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 1411 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/registers.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1205 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/rss.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 1682 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/rss.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1385 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/selector.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 3013 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/selector.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/selector.txt
00:46:58.307 -rw-r--r-- vagrant/vagrant 1065 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/varbit.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 2124 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/varbit.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1289 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/vxlan.cli
00:46:58.307 -rw-r--r-- vagrant/vagrant 4689 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/vxlan.spec
00:46:58.307 -rw-r--r-- vagrant/vagrant 1036 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/vxlan_pcap.cli
00:46:58.307 -rwxr-xr-x vagrant/vagrant 2053 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/vxlan_table.py
00:46:58.307 -rw-r--r-- vagrant/vagrant 7510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/pipeline/examples/vxlan_table.txt
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1543 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 53391 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost/main.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 3615 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost/main.h
00:46:58.307 -rw-r--r-- vagrant/vagrant 442 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost/meson.build
00:46:58.307 -rw-r--r-- vagrant/vagrant 11560 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost/virtio_net.c
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bpf/
00:46:58.307 -rw-r--r-- vagrant/vagrant 394 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bpf/README
00:46:58.307 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bpf/dummy.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 1415 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bpf/t1.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 817 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bpf/t2.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 1149 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/bpf/t3.c
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_fragmentation/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1517 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_fragmentation/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 27296 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_fragmentation/main.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 354 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ip_fragmentation/meson.build
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-macsec/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1560 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-macsec/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 44746 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-macsec/main.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/l2fwd-macsec/meson.build
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ptpclient/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1508 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ptpclient/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 331 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ptpclient/meson.build
00:46:58.307 -rw-r--r-- vagrant/vagrant 21109 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/ptpclient/ptpclient.c
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/
00:46:58.307 -rw-r--r-- vagrant/vagrant 1559 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/Makefile
00:46:58.307 -rw-r--r-- vagrant/vagrant 2993 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/blk.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 2307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/blk_spec.h
00:46:58.307 -rw-r--r-- vagrant/vagrant 539 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/meson.build
00:46:58.307 -rw-r--r-- vagrant/vagrant 21711 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/vhost_blk.c
00:46:58.307 -rw-r--r-- vagrant/vagrant 2503 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/vhost_blk.h
00:46:58.307 -rw-r--r-- vagrant/vagrant 4166 2024-06-07 12:49 spdk-test_gen_spec/dpdk/examples/vhost_blk/vhost_blk_compat.c
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.ci/
00:46:58.307 -rwxr-xr-x vagrant/vagrant 5771 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.ci/linux-build.sh
00:46:58.307 -rwxr-xr-x vagrant/vagrant 413 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.ci/linux-setup.sh
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/
00:46:58.307 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/
00:46:58.307 -rw-r--r-- vagrant/vagrant 674 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/BSDmakefile.meson
00:46:58.307 -rw-r--r-- vagrant/vagrant 1466 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/meson.build
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/contigmem/
00:46:58.308 -rw-r--r-- vagrant/vagrant 165 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/contigmem/BSDmakefile
00:46:58.308 -rw-r--r-- vagrant/vagrant 8931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/contigmem/contigmem.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 110 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/contigmem/meson.build
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/nic_uio/
00:46:58.308 -rw-r--r-- vagrant/vagrant 170 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/nic_uio/BSDmakefile
00:46:58.308 -rw-r--r-- vagrant/vagrant 108 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/nic_uio/meson.build
00:46:58.308 -rw-r--r-- vagrant/vagrant 8893 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/freebsd/nic_uio/nic_uio.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 120 2024-06-07 12:49 spdk-test_gen_spec/dpdk/kernel/meson.build
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.github/
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.github/workflows/
00:46:58.308 -rw-r--r-- vagrant/vagrant 10833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/.github/workflows/build.yml
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bitratestats/
00:46:58.308 -rw-r--r-- vagrant/vagrant 175 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bitratestats/meson.build
00:46:58.308 -rw-r--r-- vagrant/vagrant 3606 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bitratestats/rte_bitrate.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 1541 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bitratestats/rte_bitrate.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 136 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bitratestats/version.map
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/efd/
00:46:58.308 -rw-r--r-- vagrant/vagrant 259 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/efd/meson.build
00:46:58.308 -rw-r--r-- vagrant/vagrant 40694 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/efd/rte_efd.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 9101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/efd/rte_efd.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/efd/rte_efd_arm64.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 1659 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/efd/rte_efd_x86.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 164 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/efd/version.map
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/
00:46:58.308 -rw-r--r-- vagrant/vagrant 6643 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/crypto.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 21375 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/esp_inb.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 21354 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/esp_outb.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 7607 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/iph.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 16109 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/ipsec_sad.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 6578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/ipsec_sqn.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 6261 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/ipsec_telemetry.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 515 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/meson.build
00:46:58.308 -rw-r--r-- vagrant/vagrant 4068 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/misc.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 1439 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/pad.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 5847 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/rte_ipsec.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 3706 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/rte_ipsec_group.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 5478 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/rte_ipsec_sa.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 4138 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/rte_ipsec_sad.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 21232 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/sa.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 5157 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/sa.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 1128 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/ses.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 453 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ipsec/version.map
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/metrics/
00:46:58.308 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/metrics/meson.build
00:46:58.308 -rw-r--r-- vagrant/vagrant 7387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/metrics/rte_metrics.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 6632 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/metrics/rte_metrics.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 14550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/metrics/rte_metrics_telemetry.c
00:46:58.308 -rw-r--r-- vagrant/vagrant 1433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/metrics/rte_metrics_telemetry.h
00:46:58.308 -rw-r--r-- vagrant/vagrant 485 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/metrics/version.map
00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/
spdk-test_gen_spec/dpdk/lib/power/ 00:46:58.308 -rw-r--r-- vagrant/vagrant 5241 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/guest_channel.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 1855 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/guest_channel.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 849 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/meson.build 00:46:58.308 -rw-r--r-- vagrant/vagrant 13932 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_acpi_cpufreq.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 5434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_acpi_cpufreq.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 17468 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_amd_pstate_cpufreq.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 5644 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_amd_pstate_cpufreq.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 4402 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_common.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 1065 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_common.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 16881 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_cppc_cpufreq.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 5495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_cppc_cpufreq.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 12271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_intel_uncore.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 6073 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_intel_uncore.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 3143 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_kvm_vm.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 4080 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_kvm_vm.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 22225 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_pstate_cpufreq.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 5464 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/power_pstate_cpufreq.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 8158 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/rte_power.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 6273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/rte_power.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 4050 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/rte_power_guest_channel.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 19696 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/rte_power_pmd_mgmt.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 4717 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/rte_power_pmd_mgmt.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 5310 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/rte_power_uncore.c 00:46:58.308 -rw-r--r-- vagrant/vagrant 7494 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/rte_power_uncore.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 1310 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/power/version.map 00:46:58.308 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/ 00:46:58.308 -rw-r--r-- vagrant/vagrant 1283 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/meson.build 00:46:58.308 -rw-r--r-- vagrant/vagrant 1964 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_lru.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 1531 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_lru_arm64.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 2945 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_lru_x86.h 00:46:58.308 -rw-r--r-- vagrant/vagrant 616 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/table/rte_swx_hash_func.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 2212 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_keycmp.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 926 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_keycmp.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 10589 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 16843 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_em.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 543 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_em.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 13901 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_learner.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 11583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_learner.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 12170 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_selector.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 6288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_selector.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 9816 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_wm.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_swx_table_wm.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 9504 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 18398 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_acl.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 1466 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_acl.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 4756 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_array.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 853 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_array.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 3703 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 7027 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_cuckoo.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 940 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_cuckoo.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 28495 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_ext.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 5121 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_func.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_func_arm64.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 32332 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_key16.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 33846 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_key32.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 30892 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_key8.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 26334 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_hash_lru.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 8312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_lpm.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 3273 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_lpm.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 8442 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_lpm_ipv6.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 3264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_lpm_ipv6.h 00:46:58.309 
-rw-r--r-- vagrant/vagrant 2027 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_stub.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 497 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/rte_table_stub.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/table_log.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/table_log.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 1181 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/table/version.map 00:46:58.309 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/ 00:46:58.309 -rw-r--r-- vagrant/vagrant 883 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 15101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_convert.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 3475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_def.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 4097 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_dump.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 13752 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_exec.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 1060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_impl.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 33693 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_jit_arm64.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 35911 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_jit_x86.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 2389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_load.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 7252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_load_elf.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 12737 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_pkt.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 863 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_stub.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 58374 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/bpf_validate.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 1079 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/meson.build 00:46:58.309 -rw-r--r-- vagrant/vagrant 6203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/rte_bpf.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 3475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/rte_bpf_ethdev.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bpf/version.map 00:46:58.309 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ 00:46:58.309 -rw-r--r-- vagrant/vagrant 22808 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_driver.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 66905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_driver.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 7559 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_linux_ethtool.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 887 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_linux_ethtool.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 4318 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_pci.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 12033 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_private.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 2641 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_private.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 1468 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_profile.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 902 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_profile.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 79353 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_trace.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 23958 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_trace_points.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 1387 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/ethdev_vdev.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 1319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/meson.build 00:46:58.309 -rw-r--r-- vagrant/vagrant 4655 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_class_eth.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 1105 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_cman.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 1644 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_dev_info.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 16437 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_eth_ctrl.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 173278 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 243224 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 2990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev_cman.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 3999 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev_core.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 44106 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev_telemetry.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 2221 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev_trace_fp.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 86389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_flow.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 202344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_flow.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 16309 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_flow_driver.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 10038 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_mtr.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 35958 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_mtr.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 7752 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_mtr_driver.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 13852 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_tm.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 75550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_tm.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 10552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/rte_tm_driver.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 14044 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_8079.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 10590 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_8472.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 30812 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_8636.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 20946 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_8636.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 11309 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_common.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 5536 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_common.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 3651 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_telemetry.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 823 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/sff_telemetry.h 00:46:58.309 -rw-r--r-- 
vagrant/vagrant 9423 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ethdev/version.map 00:46:58.309 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/jobstats/ 00:46:58.309 -rw-r--r-- vagrant/vagrant 244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/jobstats/meson.build 00:46:58.309 -rw-r--r-- vagrant/vagrant 5786 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/jobstats/rte_jobstats.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 7196 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/jobstats/rte_jobstats.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/jobstats/version.map 00:46:58.309 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/ 00:46:58.309 -rw-r--r-- vagrant/vagrant 1022 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/meson.build 00:46:58.309 -rw-r--r-- vagrant/vagrant 2063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/mldev_utils.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 14384 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/mldev_utils.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 30276 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/mldev_utils_neon.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 3497 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/mldev_utils_neon_bfloat16.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 14872 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/mldev_utils_scalar.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 2359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/mldev_utils_scalar.h 00:46:58.309 -rw-r--r-- vagrant/vagrant 4921 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/mldev_utils_scalar_bfloat16.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 21190 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/rte_mldev.c 00:46:58.309 -rw-r--r-- vagrant/vagrant 35457 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/rte_mldev.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 14018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/rte_mldev_core.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1449 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/rte_mldev_pmd.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 2906 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/rte_mldev_pmd.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1616 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mldev/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rawdev/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 283 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rawdev/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 15532 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rawdev/rte_rawdev.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 18161 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rawdev/rte_rawdev.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 16446 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rawdev/rte_rawdev_pmd.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 768 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rawdev/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 11729 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/rte_telemetry.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 17435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/telemetry.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 6521 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/telemetry_data.c 00:46:58.310 -rw-r--r-- 
vagrant/vagrant 1462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/telemetry_data.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 2487 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/telemetry_internal.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 8743 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/telemetry_json.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 6252 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/telemetry_legacy.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 679 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/telemetry/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cfgfile/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 145 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cfgfile/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 13260 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cfgfile/rte_cfgfile.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 9117 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cfgfile/rte_cfgfile.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 489 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cfgfile/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 4173 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/event_timer_adapter_pmd.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 46786 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/eventdev_pmd.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 3570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/eventdev_pmd_pci.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 2259 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/eventdev_pmd_vdev.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 3987 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/eventdev_private.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 25112 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/eventdev_trace.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 9961 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/eventdev_trace_points.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 1118 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 39254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_crypto_adapter.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 26963 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_crypto_adapter.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 36312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_dma_adapter.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 20762 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_dma_adapter.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 102183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_eth_rx_adapter.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 28014 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 34423 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_eth_tx_adapter.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 19018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_ring.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 10441 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_ring.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 39897 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_timer_adapter.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 
26194 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_event_timer_adapter.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 52677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_eventdev.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 114256 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_eventdev.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 2935 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_eventdev_core.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 3001 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/rte_eventdev_trace_fp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 5300 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eventdev/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/kvargs/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 157 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/kvargs/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 5948 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/kvargs/rte_kvargs.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 6036 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/kvargs/rte_kvargs.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/kvargs/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 4294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 823 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/net_crc.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 11417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/net_crc_avx512.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 6977 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/net_crc_neon.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 7877 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/net_crc_sse.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 1258 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_arp.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 1861 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_arp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1983 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_dtls.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 4913 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_ecpri.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 642 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_esp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 3271 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_ether.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 12663 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_ether.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1868 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_geneve.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1702 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_gre.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 5203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_gtp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 2669 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_higig.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1718 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_ib.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_icmp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 22163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_ip.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 7777 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_l2tpv2.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1724 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_macsec.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 
780 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_mpls.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 13659 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_net.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 6628 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_net.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 8002 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_net_crc.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 1317 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_net_crc.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 3684 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_pdcp_hdr.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_ppp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 672 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_sctp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1473 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_tcp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 1286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_tls.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 707 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_udp.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 2469 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/rte_vxlan.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 218 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/net/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rcu/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 159 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rcu/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 1686 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rcu/rcu_qsbr_pvt.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 12304 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rcu/rte_rcu_qsbr.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 26187 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rcu/rte_rcu_qsbr.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 314 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rcu/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/timer/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 165 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/timer/meson.build 00:46:58.310 -rw-r--r-- vagrant/vagrant 30383 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/timer/rte_timer.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 16425 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/timer/rte_timer.h 00:46:58.310 -rw-r--r-- vagrant/vagrant 437 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/timer/version.map 00:46:58.310 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/ 00:46:58.310 -rw-r--r-- vagrant/vagrant 3571 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline.c 00:46:58.310 -rw-r--r-- vagrant/vagrant 1079 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 8345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_cirbuf.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 5371 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_cirbuf.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 829 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_os_unix.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 3243 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_os_windows.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 11542 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 6556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 1398 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_etheraddr.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1188 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_etheraddr.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 3021 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_ipaddr.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 5078 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_ipaddr.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 6578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_num.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1599 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_num.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2364 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_portlist.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1369 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_portlist.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 4378 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_string.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 2239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_string.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2214 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_private.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 15362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_rdline.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 4998 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_rdline.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 958 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_socket.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 583 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_socket.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2034 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_vt100.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 2749 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_vt100.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 808 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/meson.build 00:46:58.311 -rw-r--r-- vagrant/vagrant 1482 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cmdline/version.map 00:46:58.311 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/ 00:46:58.311 -rw-r--r-- vagrant/vagrant 14018 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/dir24_8.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 6582 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/dir24_8.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 4890 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/dir24_8_avx512.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/dir24_8_avx512.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 177 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/fib_log.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2445 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/meson.build 00:46:58.311 -rw-r--r-- vagrant/vagrant 7272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/rte_fib.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 5119 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/rte_fib.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 7428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/rte_fib6.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 5072 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/rte_fib6.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 15788 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/trie.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 3642 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/trie.h 00:46:58.311 -rw-r--r-- 
vagrant/vagrant 11436 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/trie_avx512.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 547 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/trie_avx512.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 380 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/fib/version.map 00:46:58.311 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/latencystats/ 00:46:58.311 -rw-r--r-- vagrant/vagrant 185 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/latencystats/meson.build 00:46:58.311 -rw-r--r-- vagrant/vagrant 9875 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/latencystats/rte_latencystats.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 3874 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/latencystats/rte_latencystats.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/latencystats/version.map 00:46:58.311 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ 00:46:58.311 -rw-r--r-- vagrant/vagrant 4623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ethdev_ctrl.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 5420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ethdev_rx.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1593 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ethdev_rx_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 1772 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ethdev_tx.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1096 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ethdev_tx_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2071 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_local.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 6110 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_lookup.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 6610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_lookup_neon.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 6645 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_lookup_sse.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 4701 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_reassembly.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_reassembly_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 9083 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_rewrite.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1581 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip4_rewrite_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 9991 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip6_lookup.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 8461 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip6_rewrite.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1563 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/ip6_rewrite_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 5711 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/kernel_rx.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1322 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/kernel_rx_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2922 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/kernel_tx.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 411 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/kernel_tx_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 171 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/log.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 903 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/meson.build 00:46:58.311 -rw-r--r-- vagrant/vagrant 2116 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/node_private.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 421 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/node/null.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 5644 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/pkt_cls.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/pkt_cls_priv.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 520 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/pkt_drop.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1847 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/rte_node_eth_api.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2779 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/rte_node_ip4_api.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 1691 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/rte_node_ip6_api.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 1274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/rte_node_udp4_input_api.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 5457 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/udp4_input.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/node/version.map 00:46:58.311 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/regexdev/ 00:46:58.311 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/regexdev/meson.build 00:46:58.311 -rw-r--r-- vagrant/vagrant 14570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/regexdev/rte_regexdev.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 53932 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/regexdev/rte_regexdev.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 6459 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/regexdev/rte_regexdev_core.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 1154 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/regexdev/rte_regexdev_driver.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 837 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/regexdev/version.map 00:46:58.311 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/ 00:46:58.311 -rw-r--r-- vagrant/vagrant 8279 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/fd_man.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 1345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/fd_man.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 10588 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/iotlb.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 2189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/iotlb.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 1254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/meson.build 00:46:58.311 -rw-r--r-- vagrant/vagrant 4800 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/rte_vdpa.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 28734 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/rte_vhost.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 8483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/rte_vhost_async.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 3781 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/rte_vhost_crypto.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 29524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/socket.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 8350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vdpa.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 4064 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vdpa_driver.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 16833 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vduse.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 749 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vduse.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 2880 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/version.map 00:46:58.311 -rw-r--r-- 
vagrant/vagrant 51443 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vhost.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 29941 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vhost.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 44189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vhost_crypto.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 93457 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vhost_user.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 5617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/vhost_user.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 12481 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/virtio_crypto.h 00:46:58.311 -rw-r--r-- vagrant/vagrant 114621 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/virtio_net.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 7045 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/virtio_net_ctrl.c 00:46:58.311 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/vhost/virtio_net_ctrl.h 00:46:58.311 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/ 00:46:58.311 -rw-r--r-- vagrant/vagrant 447 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 5605 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/rte_comp.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 19242 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/rte_comp.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 16806 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/rte_compressdev.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 17370 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/rte_compressdev.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3410 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/rte_compressdev_internal.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3554 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/rte_compressdev_pmd.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 9783 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/rte_compressdev_pmd.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 1041 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/compressdev/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gpudev/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 23967 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gpudev/gpudev.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 3704 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gpudev/gpudev_driver.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 253 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gpudev/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 20132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gpudev/rte_gpudev.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gpudev/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 11751 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/log.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 224 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/log_freebsd.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 866 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/log_internal.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 1123 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/log_linux.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 382 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/log_windows.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 16491 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/rte_log.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 561 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/log/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pcapng/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 167 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pcapng/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 2550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pcapng/pcapng_proto.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 19734 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pcapng/rte_pcapng.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 5319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pcapng/rte_pcapng.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 194 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pcapng/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/reorder/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 162 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/reorder/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 15280 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/reorder/rte_reorder.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 6995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/reorder/rte_reorder.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/reorder/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 5996 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/cryptodev_pmd.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 19983 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/cryptodev_pmd.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 15511 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/cryptodev_trace.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 6443 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/cryptodev_trace_points.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 523 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 14171 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/rte_crypto.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 20297 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/rte_crypto_asym.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 31931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/rte_crypto_sym.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 70785 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/rte_cryptodev.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 64609 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/rte_cryptodev.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 1814 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/rte_cryptodev_core.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 864 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3179 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/cryptodev/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 17428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 2916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph_debug.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 3449 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph_ops.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 5131 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph_pcap.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 2484 
2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph_pcap_private.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 5956 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph_populate.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 10829 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph_private.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 12246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/graph_stats.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 682 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 7423 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/node.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 18246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/rte_graph.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 4786 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/rte_graph_model_mcore_dispatch.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 3281 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 1288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/rte_graph_model_rtc.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 700 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/rte_graph_worker.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 1121 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/rte_graph_worker.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 15496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/rte_graph_worker_common.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 1291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/graph/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 177 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/lpm_log.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 498 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 32157 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 11752 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 34528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm6.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 5233 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm6.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3501 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm_altivec.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3333 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm_neon.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm_scalar.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3400 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm_sse.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3227 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/rte_lpm_sve.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 407 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/lpm/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pci/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 137 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pci/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 3040 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pci/rte_pci.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 8685 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pci/rte_pci.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 97 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pci/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/ 00:46:58.312 
-rw-r--r-- vagrant/vagrant 247 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 199 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/rib_log.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 11947 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/rte_rib.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 5918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/rte_rib.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 13739 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/rte_rib6.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 7161 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/rte_rib6.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 571 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/rib/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dispatcher/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dispatcher/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 15302 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dispatcher/rte_dispatcher.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 12942 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dispatcher/rte_dispatcher.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 439 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dispatcher/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 5279 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_tcp.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 7565 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_tcp4.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 3859 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_tcp4.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 6514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_tcp6.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 4153 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_tcp6.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 3169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_tcp_internal.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 10640 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_udp4.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 7192 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_udp4.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 13924 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_vxlan_tcp4.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 4277 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_vxlan_tcp4.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 14954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_vxlan_udp4.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 4192 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/gro_vxlan_udp4.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 289 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/meson.build 00:46:58.312 -rw-r--r-- vagrant/vagrant 15186 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/rte_gro.c 00:46:58.312 -rw-r--r-- vagrant/vagrant 5926 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/rte_gro.h 00:46:58.312 -rw-r--r-- vagrant/vagrant 174 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gro/version.map 00:46:58.312 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/ 00:46:58.312 -rw-r--r-- vagrant/vagrant 182 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/mbuf_log.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 378 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/meson.build 00:46:58.313 -rw-r--r-- vagrant/vagrant 28135 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 56437 2024-06-07 
12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 26357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_core.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 14996 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_dyn.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 12312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_dyn.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 2226 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_pool_ops.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 2141 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_pool_ops.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 6155 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_ptype.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 21286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_ptype.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 1086 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mbuf/version.map 00:46:58.313 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/ 00:46:58.313 -rw-r--r-- vagrant/vagrant 479 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/meson.build 00:46:58.313 -rw-r--r-- vagrant/vagrant 4623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_cnt.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_cnt.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 6522 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_crypto.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_crypto.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 1829 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_ctrl_pdu.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 319 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_ctrl_pdu.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 6428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_entity.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 36006 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_process.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 918 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_process.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_reorder.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 1623 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/pdcp_reorder.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 8390 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/rte_pdcp.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 12605 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/rte_pdcp.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 3289 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/rte_pdcp_group.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdcp/version.map 00:46:58.313 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/ 00:46:58.313 -rw-r--r-- vagrant/vagrant 603 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/meson.build 00:46:58.313 -rw-r--r-- vagrant/vagrant 13670 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 25552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 5705 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_c11_pvt.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 5463 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_core.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 23744 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_elem.h 00:46:58.313 -rw-r--r-- 
vagrant/vagrant 11462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_elem_pvt.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 5025 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_generic_pvt.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 7963 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_hts.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 7161 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_hts_elem_pvt.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 11909 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_peek.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 4637 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_peek_elem_pvt.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 16677 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_peek_zc.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 10991 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_rts.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 7551 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/rte_ring_rts_elem_pvt.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/ring/version.map 00:46:58.313 -rw-r--r-- vagrant/vagrant 10216 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/meson.build 00:46:58.313 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/ 00:46:58.313 -rw-r--r-- vagrant/vagrant 7770 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 40049 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_bld.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 13601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_gen.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 199 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_log.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 6476 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_altivec.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 9288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_altivec.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 758 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_avx2.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 7171 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_avx2.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 4682 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_avx512.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 14368 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_avx512_common.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 8205 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_avx512x16.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 5920 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_avx512x8.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 496 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_neon.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 8208 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_neon.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 4617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_scalar.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 528 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_sse.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 9669 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_run_sse.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 2499 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/acl_vect.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 2476 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/meson.build 00:46:58.313 -rw-r--r-- vagrant/vagrant 12468 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/acl/rte_acl.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 11129 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/rte_acl.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 952 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/rte_acl_osdep.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 1876 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/tb_mem.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 938 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/tb_mem.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 295 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/acl/version.map 00:46:58.313 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/ 00:46:58.313 -rw-r--r-- vagrant/vagrant 5554 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/distributor_private.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 438 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/meson.build 00:46:58.313 -rw-r--r-- vagrant/vagrant 21776 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/rte_distributor.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 7854 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/rte_distributor.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 314 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/rte_distributor_match_generic.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 2264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/rte_distributor_match_sse.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 12163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/rte_distributor_single.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 7146 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/rte_distributor_single.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 286 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/distributor/version.map 00:46:58.313 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/ 00:46:58.313 -rw-r--r-- vagrant/vagrant 3104 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_common.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 5184 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_common.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 1931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_tcp4.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 1426 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_tcp4.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 2807 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_tunnel_tcp4.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 1470 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_tunnel_tcp4.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 2894 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_tunnel_udp4.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 1431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_tunnel_udp4.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 2169 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_udp4.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 1350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/gso_udp4.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/meson.build 00:46:58.313 -rw-r--r-- vagrant/vagrant 2781 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/rte_gso.c 00:46:58.313 -rw-r--r-- vagrant/vagrant 4439 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/rte_gso.h 00:46:58.313 -rw-r--r-- vagrant/vagrant 53 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/gso/version.map 00:46:58.313 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 294 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/member/member.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1819 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 8972 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 20234 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 9955 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_heap.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 15844 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_ht.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_ht.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 14741 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_sketch.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 2149 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_sketch.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2076 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_sketch_avx512.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 657 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_sketch_avx512.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 8718 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_vbf.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 1254 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_vbf.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1970 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_member_x86.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 3125 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/rte_xxh64_avx512.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 343 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/member/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdump/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdump/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 18320 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdump/rte_pdump.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 7510 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdump/rte_pdump.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 249 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pdump/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 413 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 8290 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_approx.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 1679 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_approx.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1428 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_pie.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 11293 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_pie.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2999 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_red.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 11717 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_red.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 81475 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_sched.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 18395 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_sched.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1626 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/rte_sched_common.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 187 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/sched/rte_sched_log.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 674 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/sched/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/argparse/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 165 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/argparse/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 17340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/argparse/rte_argparse.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 6126 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/argparse/rte_argparse.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 105 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/argparse/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 316 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 26385 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 36660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2702 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev_core.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 5253 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev_pmd.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 3403 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev_trace.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 4240 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev_trace_fp.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1382 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev_trace_points.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 710 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/dmadev/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 614 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 2373 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_cmp_arm64.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_cmp_x86.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2476 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_crc_arm64.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 859 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_crc_generic.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 26731 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_crc_sw.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2910 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_crc_x86.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 70736 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_cuckoo_hash.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 6660 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_cuckoo_hash.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 5116 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_fbk_hash.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 10230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_fbk_hash.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 23008 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_hash.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1889 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_hash_crc.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 3439 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_hash_crc.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 9382 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/hash/rte_jhash.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 19914 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_thash.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 11670 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_thash.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 930 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_thash_gfni.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 1859 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_thash_gfni.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 6288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/rte_thash_x86_gfni.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1138 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/hash/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 4954 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/mempool_trace.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2180 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/mempool_trace_points.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 449 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 41435 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/rte_mempool.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 64508 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/rte_mempool.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 4794 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/rte_mempool_ops.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 4076 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/rte_mempool_ops_default.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 2982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/rte_mempool_trace_fp.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 1404 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/mempool/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 683 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 37178 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_pipeline.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 27913 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_pipeline.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 10830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_port_in_action.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 9545 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_port_in_action.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 78692 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_ctl.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 41857 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_ctl.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 2779 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_extern.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 42921 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_ipsec.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 9593 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_ipsec.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 337679 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_pipeline.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 36300 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_pipeline.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 122246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_pipeline_internal.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 100743 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_pipeline_spec.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 5272 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_swx_pipeline_spec.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 84926 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_table_action.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 33377 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/rte_table_action.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 5203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/pipeline/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/security/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 217 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/security/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 15063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/security/rte_security.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 46342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/security/rte_security.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 10534 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/security/rte_security_driver.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 763 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/security/version.map 00:46:58.314 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bbdev/ 00:46:58.314 -rw-r--r-- vagrant/vagrant 308 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bbdev/meson.build 00:46:58.314 -rw-r--r-- vagrant/vagrant 29866 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bbdev/rte_bbdev.c 00:46:58.314 -rw-r--r-- vagrant/vagrant 34850 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bbdev/rte_bbdev.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 42830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bbdev/rte_bbdev_op.h 00:46:58.314 -rw-r--r-- vagrant/vagrant 5411 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bbdev/rte_bbdev_pmd.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1264 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/bbdev/version.map 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/ 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/ 00:46:58.315 -rw-r--r-- vagrant/vagrant 516 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 720 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_altivec.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 5491 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_atomic.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2937 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_byteorder.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1110 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_cpuflags.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 803 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_cycles.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 261 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_io.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 5519 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_memcpy.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 515 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_pause.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_power_intrinsics.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 935 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_prefetch.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 635 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_rwlock.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1673 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_spinlock.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 772 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/include/rte_vect.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 245 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 3415 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/rte_cpuflags.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 852 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/rte_cycles.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/rte_hypervisor.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 893 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/ppc/rte_power_intrinsics.c 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/ 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/ 00:46:58.315 -rw-r--r-- vagrant/vagrant 523 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 1119 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_atomic.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_byteorder.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1761 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_cpuflags.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2311 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_cycles.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_io.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_memcpy.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 642 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_pause.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 385 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_power_intrinsics.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 916 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_prefetch.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_rwlock.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_spinlock.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1099 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/include/rte_vect.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 2909 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/rte_cpuflags.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 1773 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/rte_cycles.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 244 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/rte_hypervisor.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 1009 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/riscv/rte_power_intrinsics.c 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/ 00:46:58.315 -rw-r--r-- vagrant/vagrant 2905 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/eal_debug.c 
00:46:58.315 -rw-r--r-- vagrant/vagrant 1239 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/eal_file.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 2479 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/eal_filesystem.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 2888 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/eal_firmware.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 2998 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/eal_unix_memory.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 1174 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/eal_unix_thread.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 596 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/eal_unix_timer.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 8203 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/unix/rte_thread.c 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/ 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/ 00:46:58.315 -rw-r--r-- vagrant/vagrant 14149 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/dirent.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 4721 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/fnmatch.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/getopt.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 188 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 2242 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/pthread.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2328 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/regex.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1625 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/rte_os.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2502 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/rte_os_shim.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 973 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/rte_virt2phys.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1237 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/rte_windows.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2056 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/sched.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 331 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/unistd.h 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/sys/ 00:46:58.315 -rw-r--r-- vagrant/vagrant 27723 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/include/sys/queue.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 12879 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 5262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_alarm.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 2007 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_debug.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 600 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_dev.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 2389 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_file.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 2702 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_hugepages.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 5428 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/eal/windows/eal_interrupts.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 5655 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_lcore.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 10691 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_memalloc.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 17406 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_memory.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 2158 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_mp.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 1442 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_thread.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 1996 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_timer.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 3370 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/eal_windows.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 11570 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/getopt.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 784 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 12728 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/windows/rte_thread.c 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/ 00:46:58.315 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 4284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/rte_cpuflags.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 966 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/rte_cycles.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/rte_hypervisor.c 00:46:58.315 -rw-r--r-- vagrant/vagrant 1980 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/rte_power_intrinsics.c 00:46:58.315 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/ 00:46:58.315 -rw-r--r-- vagrant/vagrant 830 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/meson.build 00:46:58.315 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_atomic.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 863 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_atomic_32.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 6757 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_atomic_64.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 1483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_byteorder.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_cpuflags.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 960 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_cpuflags_32.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 843 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_cpuflags_64.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_cycles.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2077 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_cycles_32.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 2458 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_cycles_64.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_io.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 3431 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_io_64.h 00:46:58.315 -rw-r--r-- 
vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_memcpy.h 00:46:58.315 -rw-r--r-- vagrant/vagrant 7558 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_memcpy_32.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 9745 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_memcpy_64.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 317 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_pause.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 334 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_pause_32.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 7951 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_pause_64.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_power_intrinsics.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_prefetch.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 917 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_prefetch_32.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 899 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_prefetch_64.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 652 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_rwlock.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 1167 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_spinlock.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 4101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/arm/include/rte_vect.h 00:46:58.316 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/ 00:46:58.316 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/ 00:46:58.316 -rw-r--r-- vagrant/vagrant 704 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/meson.build 00:46:58.316 -rw-r--r-- vagrant/vagrant 6725 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_atomic.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 4417 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_atomic_32.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 4120 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_atomic_64.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 2033 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_byteorder.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 734 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_byteorder_32.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 665 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_byteorder_64.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 6399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_cpuflags.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 1262 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_cycles.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 1419 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_io.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 28176 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_memcpy.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_pause.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_power_intrinsics.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 1607 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_prefetch.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 1022 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_rtm.h 
00:46:58.316 -rw-r--r-- vagrant/vagrant 1017 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_rwlock.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 3837 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_spinlock.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 2552 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/include/rte_vect.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 256 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/meson.build 00:46:58.316 -rw-r--r-- vagrant/vagrant 7434 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/rte_cpuflags.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/rte_cpuid.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 2759 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/rte_cycles.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1012 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/rte_hypervisor.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 9379 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/rte_power_intrinsics.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 350 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/x86/rte_spinlock.c 00:46:58.316 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/ 00:46:58.316 -rw-r--r-- vagrant/vagrant 6013 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_bus.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1325 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_class.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 2066 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_config.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 869 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_cpuflags.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1074 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_debug.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 18364 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_dev.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 9474 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_devargs.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 16125 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_dynmem.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1381 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_errno.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 36382 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_fbarray.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1686 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_hexdump.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 416 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_hypervisor.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 10849 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_interrupts.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 2587 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_launch.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 17610 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_lcore.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 3780 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_mcfg.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 8861 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_memalloc.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 42650 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_memory.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 11362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_memzone.c 
00:46:58.316 -rw-r--r-- vagrant/vagrant 56183 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_options.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 30500 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_proc.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1828 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_string_fns.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 3594 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_tailqs.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 10562 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_thread.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 2062 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_timer.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 12234 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_trace.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 8579 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_trace_ctf.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 3652 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_trace_points.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 8267 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_trace_utils.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 3201 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_common_uuid.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 2771 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_filesystem.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 799 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_firmware.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 1075 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_hugepages.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 4251 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_internal_cfg.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 998 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_interrupts.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 2393 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_memalloc.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 3060 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_memcfg.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 3509 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_options.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 19294 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_private.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 2148 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_thread.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 2778 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/eal_trace.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 12284 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/hotplug_mp.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1452 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/hotplug_mp.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 19462 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/malloc_elem.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 10859 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/malloc_elem.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 39356 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/malloc_heap.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 2143 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/malloc_heap.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 21809 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/malloc_mp.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 1760 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/malloc_mp.h 00:46:58.316 
-rw-r--r-- vagrant/vagrant 1524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/meson.build 00:46:58.316 -rw-r--r-- vagrant/vagrant 4030 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/rte_keepalive.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 15358 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/rte_malloc.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 5315 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/rte_random.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 2410 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/rte_reciprocal.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 26045 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/rte_service.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 968 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/common/rte_version.c 00:46:58.316 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/ 00:46:58.316 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/include/ 00:46:58.316 -rw-r--r-- vagrant/vagrant 163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/include/meson.build 00:46:58.316 -rw-r--r-- vagrant/vagrant 1933 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/include/rte_os.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 541 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/include/rte_os_shim.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 26527 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 7227 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_alarm.c 00:46:58.316 -rw-r--r-- vagrant/vagrant 455 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_alarm_private.h 00:46:58.316 -rw-r--r-- vagrant/vagrant 376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_cpuflags.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 617 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_dev.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 4416 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_hugepage_info.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 16898 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_interrupts.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 953 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_lcore.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 1650 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_memalloc.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 11765 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_memory.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 981 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_thread.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 1268 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/eal_timer.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 743 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/freebsd/meson.build 00:46:58.317 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/ 00:46:58.317 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/ 00:46:58.317 -rw-r--r-- vagrant/vagrant 27300 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_atomic.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 6982 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_byteorder.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2726 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_cpuflags.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 3620 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_cycles.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 9514 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_io.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2852 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_memcpy.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 4423 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_pause.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 5288 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_power_intrinsics.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 4129 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_prefetch.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 8132 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_rwlock.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 8103 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_spinlock.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 6458 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/generic/rte_vect.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 8432 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/bus_driver.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1031 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/dev_driver.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 6777 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/eal_trace_internal.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/meson.build 00:46:58.317 -rw-r--r-- vagrant/vagrant 2305 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_alarm.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 15891 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_bitmap.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 15450 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_bitops.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1041 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_branch_prediction.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2749 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_bus.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2937 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_class.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 20895 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_common.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_compat.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1772 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_debug.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 12848 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_dev.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 6556 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_devargs.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 15176 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_eal.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 3255 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_eal_memconfig.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2631 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_eal_paging.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2486 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_eal_trace.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 3291 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_epoll.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1631 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_errno.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 14728 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/eal/include/rte_fbarray.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 3586 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_function_versioning.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1118 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_hexdump.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 639 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_hypervisor.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 19715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_interrupts.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 3773 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_keepalive.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 4216 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_launch.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 11064 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_lcore.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2121 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_lock_annotations.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 18865 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_malloc.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 5605 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_mcslock.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 22662 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_memory.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 13458 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_memzone.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 399 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 442 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_pci_dev_features.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1290 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_per_lcore.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 4433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_pflock.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2204 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_random.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2101 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_reciprocal.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 6441 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_seqcount.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 6530 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_seqlock.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 15857 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_service.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 4990 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_service_component.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 6032 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_stdatomic.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 3467 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_string_fns.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 3813 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_tailq.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1516 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_test.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 12506 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_thread.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 4896 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_ticketlock.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2404 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_time.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2929 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/lib/eal/include/rte_trace.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 12855 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_trace_point.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_trace_point_register.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2133 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_uuid.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 1370 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_version.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 9192 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/include/rte_vfio.h 00:46:58.317 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/ 00:46:58.317 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/include/ 00:46:58.317 -rw-r--r-- vagrant/vagrant 163 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/include/meson.build 00:46:58.317 -rw-r--r-- vagrant/vagrant 1343 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/include/rte_os.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/include/rte_os_shim.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 36525 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 6440 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_alarm.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 1578 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_cpuflags.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 9796 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_dev.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 17189 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_hugepage_info.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 39377 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_interrupts.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 1794 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_lcore.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 46802 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_memalloc.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 55136 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_memory.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 995 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_thread.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 5655 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_timer.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 57077 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_vfio.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 3769 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_vfio.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 2695 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/eal_vfio_mp_sync.c 00:46:58.317 -rw-r--r-- vagrant/vagrant 629 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/linux/meson.build 00:46:58.317 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/loongarch/ 00:46:58.317 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/loongarch/include/ 00:46:58.317 -rw-r--r-- vagrant/vagrant 498 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/loongarch/include/meson.build 00:46:58.317 -rw-r--r-- vagrant/vagrant 866 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/loongarch/include/rte_atomic.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 893 2024-06-07 12:49 spdk-test_gen_spec/dpdk/lib/eal/loongarch/include/rte_byteorder.h 00:46:58.317 -rw-r--r-- vagrant/vagrant 659 2024-06-07 12:49 
00:46:58.317 [recursive directory listing of the extracted DPDK source tree, condensed: per-file entries (mode, owner vagrant/vagrant, size, mtime 2024-06-07 12:49) under spdk-test_gen_spec/dpdk/ for:]
00:46:58.317 lib/eal/loongarch/ (arch include headers, rte_cpuflags/rte_cycles/rte_hypervisor/rte_power_intrinsics sources, meson.build), lib/eal/meson.build, lib/eal/version.map
00:46:58.318 lib/ip_frag/ (IPv4/IPv6 fragmentation and reassembly), lib/meter/, lib/port/ (ethdev/eventdev/fd/frag/ras/ring/sched/source_sink/sym_crypto ports plus SWX ports), lib/stack/ (standard and lock-free stack variants)
00:46:58.319 app/dumpcap/, app/test-eventdev/, app/test/ (unit-test sources incl. cryptodev, security, ring, hash, lpm, mbuf and table suites, plus suites/ and test_cfgfiles/), app/graph/ (CLI, conn, ethdev, l2fwd/l3fwd examples)
00:46:58.321 app/test-fib/, app/pdump/, app/test-flow-perf/, app/proc-info/, app/test-gpudev/, app/test-acl/ (with input/ rule and trace files), app/test-mldev/, app/test-bbdev/ (incl. test_vectors/ data files and default-vector symlinks), app/test-pipeline/, app/test-cmdline/
00:46:58.323 app/test-pmd/ (cmdline, config, forwarding engines), app/test-compress-perf/, app/test-regex/, app/test-crypto-perf/ (configs/ and data/ test vectors), app/test-sad/, app/test-dma-perf/, app/test-security-perf/, app/meson.build
00:46:58.323 license/ (README, bsd-2-clause.txt, bsd-3-clause.txt, exceptions.txt, gpl-2.0.txt, isc.txt, lgpl-2.1.txt, mit.txt)
00:46:58.324 buildtools/ (pkg-config/, subproject/, call-sphinx-build.py, check-symbols.sh, coff.py, dpdk-cmdline-gen.py, gen-pmdinfo-cfile.py, get-cpu-count.py; listing continues)
spdk-test_gen_spec/dpdk/buildtools/get-numa-count.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 1546 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/get-test-suites.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 632 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/has-hugepages.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 579 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/list-dir-globs.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 1480 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/map-list-symbol.sh 00:46:58.324 -rw-r--r-- vagrant/vagrant 1392 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/map_to_win.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 2063 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/meson.build 00:46:58.324 -rwxr-xr-x vagrant/vagrant 687 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/options-ibverbs-static.sh 00:46:58.324 -rwxr-xr-x vagrant/vagrant 7872 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/pmdinfogen.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 1524 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/symlink-drivers-solibs.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 533 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/symlink-drivers-solibs.sh 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/chkincs/ 00:46:58.324 -rwxr-xr-x vagrant/vagrant 283 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/chkincs/gen_c_file_for_header.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 113 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/chkincs/main.c 00:46:58.324 -rw-r--r-- vagrant/vagrant 113 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/chkincs/main.cpp 00:46:58.324 -rw-r--r-- vagrant/vagrant 1931 2024-06-07 12:49 spdk-test_gen_spec/dpdk/buildtools/chkincs/meson.build 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/ 00:46:58.324 -rwxr-xr-x vagrant/vagrant 2011 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/cpu_layout.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 29868 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/dpdk-devbind.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 9066 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/dpdk-hugepages.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 9309 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/dpdk-pmdinfo.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 13988 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/dpdk-rss-flows.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 5601 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/dpdk-telemetry-client.py 00:46:58.324 -rwxr-xr-x vagrant/vagrant 6001 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/dpdk-telemetry.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/dpdk/usertools/meson.build 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/ 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/ 00:46:58.324 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_armv8_linux_clang_ubuntu1804 -> arm64_armv8_linux_clang_ubuntu 00:46:58.324 -rw-r--r-- vagrant/vagrant 362 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm32_armv8_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_altra_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_ampereone_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 367 2024-06-07 12:49 
spdk-test_gen_spec/dpdk/config/arm/arm64_armada_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 491 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_armv8_linux_clang_ubuntu 00:46:58.324 -rw-r--r-- vagrant/vagrant 433 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_armv8_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 349 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_bluefield3_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_bluefield_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_capri_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 364 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_cdx_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_centriq2400_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_cn10k_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 357 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_cn9k_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 365 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_dpaa_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_elba_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_emag_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_ft2000plus_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_grace_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_graviton2_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_graviton3_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_graviton4_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_hip10_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_kunpeng920_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 347 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_kunpeng930_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_n1sdp_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_n2_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 359 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_odyssey_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 345 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_stingray_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_thunderx2_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_thunderxt83_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 348 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_thunderxt88_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 344 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_tys2500_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 339 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/arm64_v2_linux_gcc 00:46:58.324 
-rwxr-xr-x vagrant/vagrant 582 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/armv8_machine.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 26850 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/arm/meson.build 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/loongarch/ 00:46:58.324 -rw-r--r-- vagrant/vagrant 385 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/loongarch/loongarch_loongarch64_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 1023 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/loongarch/meson.build 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/ppc/ 00:46:58.324 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/ppc/ppc64le-power8-linux-gcc-ubuntu1804 -> ppc64le-power8-linux-gcc-ubuntu 00:46:58.324 -rw-r--r-- vagrant/vagrant 5215 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/ppc/meson.build 00:46:58.324 -rw-r--r-- vagrant/vagrant 246 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/ppc/ppc64le-power8-linux-gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 258 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/ppc/ppc64le-power8-linux-gcc-ubuntu 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/riscv/ 00:46:58.324 -rw-r--r-- vagrant/vagrant 3715 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/riscv/meson.build 00:46:58.324 -rw-r--r-- vagrant/vagrant 376 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/riscv/riscv64_linux_gcc 00:46:58.324 -rw-r--r-- vagrant/vagrant 483 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/riscv/riscv64_sifive_u740_linux_gcc 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/x86/ 00:46:58.324 -rw-r--r-- vagrant/vagrant 834 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/x86/binutils-avx512-check.py 00:46:58.324 -rw-r--r-- vagrant/vagrant 361 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/x86/cross-mingw 00:46:58.324 -rw-r--r-- vagrant/vagrant 3268 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/x86/meson.build 00:46:58.324 -rw-r--r-- vagrant/vagrant 18944 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/meson.build 00:46:58.324 -rw-r--r-- vagrant/vagrant 3943 2024-06-07 12:49 spdk-test_gen_spec/dpdk/config/rte_config.h 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isalcryptobuild/ 00:46:58.324 -rw-r--r-- vagrant/vagrant 710 2024-06-07 12:49 spdk-test_gen_spec/isalcryptobuild/Makefile 00:46:58.324 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/isalcryptobuild/isa-l-crypto -> ../isa-l-crypto/include 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/ 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/ceph/ 00:46:58.324 -rw-r--r-- vagrant/vagrant 1908 2024-06-07 12:49 spdk-test_gen_spec/scripts/ceph/ceph.conf 00:46:58.324 -rwxr-xr-x vagrant/vagrant 4274 2024-06-07 12:49 spdk-test_gen_spec/scripts/ceph/start.sh 00:46:58.324 -rwxr-xr-x vagrant/vagrant 278 2024-06-07 12:49 spdk-test_gen_spec/scripts/ceph/stop.sh 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/common/ 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/common/setup/ 00:46:58.324 -rwxr-xr-x vagrant/vagrant 6765 2024-06-07 12:49 spdk-test_gen_spec/scripts/common/setup/interactive.sh 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/ 00:46:58.324 -rw-r--r-- vagrant/vagrant 1888 
2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/README.md 00:46:58.324 -rwxr-xr-x vagrant/vagrant 2898 2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/check_dpdk_pci_api.sh 00:46:58.324 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/22.11/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 588 2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/22.11/22.11-rte_bus_pci.h.patch 00:46:58.325 -rw-r--r-- vagrant/vagrant 1627 2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/22.11/23.07-rte_bus_pci.h.patch 00:46:58.325 -rw-r--r-- vagrant/vagrant 4611 2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/22.11/23.11-rte_bus_pci.h.patch 00:46:58.325 -rw-r--r-- vagrant/vagrant 4370 2024-06-07 12:49 spdk-test_gen_spec/scripts/env_dpdk/22.11/23.11-rte_dev.h.patch 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/ 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/pm/ 00:46:58.325 -rwxr-xr-x vagrant/vagrant 15050 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/pm/collect-bmc-pm 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1796 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/pm/collect-cpu-load 00:46:58.325 -rwxr-xr-x vagrant/vagrant 7329 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/pm/collect-cpu-temp 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2942 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/pm/collect-vmstat 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2365 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/pm/common 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/vhost/ 00:46:58.325 -rwxr-xr-x vagrant/vagrant 10878 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/vhost/conf-generator 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2204 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/vhost/run_vhost_test.sh 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvme/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 1033 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvme/README 00:46:58.325 -rw-r--r-- vagrant/vagrant 344 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvme/fio_test.conf 00:46:58.325 -rwxr-xr-x vagrant/vagrant 7373 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvme/run_fio_test.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 670 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvme/run_fio_test.sh 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvmf/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 17335 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvmf/README.md 00:46:58.325 -rw-r--r-- vagrant/vagrant 11268 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvmf/common.py 00:46:58.325 -rw-r--r-- vagrant/vagrant 840 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvmf/config.json 00:46:58.325 -rwxr-xr-x vagrant/vagrant 82892 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvmf/run_nvmf.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1194 2024-06-07 12:49 spdk-test_gen_spec/scripts/perf/nvmf/set_xps_rxqs 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1217 2024-06-07 12:49 spdk-test_gen_spec/scripts/ar-xnvme-fixer 00:46:58.325 -rwxr-xr-x vagrant/vagrant 12805 2024-06-07 12:49 spdk-test_gen_spec/scripts/arm_cross_compile.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 6869 2024-06-07 12:49 spdk-test_gen_spec/scripts/backport.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 623 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpftrace.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 11044 2024-06-07 12:49 
spdk-test_gen_spec/scripts/calc-iobuf.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 26200 2024-06-07 12:49 spdk-test_gen_spec/scripts/check_format.sh 00:46:58.325 -rw-r--r-- vagrant/vagrant 13575 2024-06-07 12:49 spdk-test_gen_spec/scripts/common.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3404 2024-06-07 12:49 spdk-test_gen_spec/scripts/core-collector.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3859 2024-06-07 12:49 spdk-test_gen_spec/scripts/detect_cc.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 15204 2024-06-07 12:49 spdk-test_gen_spec/scripts/dpdk_mem_info.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 658 2024-06-07 12:49 spdk-test_gen_spec/scripts/eofnl 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3797 2024-06-07 12:49 spdk-test_gen_spec/scripts/fio-wrapper 00:46:58.325 -rw-r--r-- vagrant/vagrant 11421 2024-06-07 12:49 spdk-test_gen_spec/scripts/gdb_macros.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1293 2024-06-07 12:49 spdk-test_gen_spec/scripts/gen_ftl.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3488 2024-06-07 12:49 spdk-test_gen_spec/scripts/gen_nvme.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3234 2024-06-07 12:49 spdk-test_gen_spec/scripts/gen_sma_goapi.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1472 2024-06-07 12:49 spdk-test_gen_spec/scripts/genconfig.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2040 2024-06-07 12:49 spdk-test_gen_spec/scripts/get-pmr 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1413 2024-06-07 12:49 spdk-test_gen_spec/scripts/histogram.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 16632 2024-06-07 12:49 spdk-test_gen_spec/scripts/iostat.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2029 2024-06-07 12:49 spdk-test_gen_spec/scripts/ledctl.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3732 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep.sh 00:46:58.325 -rw-r--r-- vagrant/vagrant 932 2024-06-07 12:49 spdk-test_gen_spec/scripts/posix.txt 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1835 2024-06-07 12:49 spdk-test_gen_spec/scripts/prep_benchmarks.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 5130 2024-06-07 12:49 spdk-test_gen_spec/scripts/qat_setup.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 216868 2024-06-07 12:49 spdk-test_gen_spec/scripts/rpc.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 4290 2024-06-07 12:49 spdk-test_gen_spec/scripts/rpc_http_proxy.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 6112 2024-06-07 12:49 spdk-test_gen_spec/scripts/rxe_cfg_small.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 29485 2024-06-07 12:49 spdk-test_gen_spec/scripts/setup.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2658 2024-06-07 12:49 spdk-test_gen_spec/scripts/sma-client.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 5056 2024-06-07 12:49 spdk-test_gen_spec/scripts/sma.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2318 2024-06-07 12:49 spdk-test_gen_spec/scripts/spdk-gpt.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3397 2024-06-07 12:49 spdk-test_gen_spec/scripts/spdkcli.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 5552 2024-06-07 12:49 spdk-test_gen_spec/scripts/spdx.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 5052 2024-06-07 12:49 spdk-test_gen_spec/scripts/sync_dev_uevents.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 234 2024-06-07 12:49 spdk-test_gen_spec/scripts/lspci 00:46:58.325 -rwxr-xr-x vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/scripts/pc.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/scripts/pc_libs.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/scripts/pc_modules.sh 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/scripts/pkgdep/ 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2977 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/arch.sh 00:46:58.325 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/azurelinux.sh -> mariner.sh 00:46:58.325 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/centos.sh -> rhel.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 8479 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/common.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2415 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/debian.sh 00:46:58.325 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/fedora.sh -> rhel.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1199 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/freebsd.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2566 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/mariner.sh 00:46:58.325 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/openeuler.sh -> rhel.sh 00:46:58.325 -rw-r--r-- vagrant/vagrant 141 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/requirements.txt 00:46:58.325 -rwxr-xr-x vagrant/vagrant 8216 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/rhel.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1672 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/sles.sh 00:46:58.325 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/pkgdep/ubuntu.sh -> debian.sh 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 9013 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/README.md 00:46:58.325 -rw-r--r-- vagrant/vagrant 13568 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/Vagrantfile 00:46:58.325 -rw-r--r-- vagrant/vagrant 753 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/autorun-spdk.conf 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1544 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/create_nvme_img.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 11562 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/create_vbox.sh 00:46:58.325 -rw-r--r-- vagrant/vagrant 968 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/local.conf 00:46:58.325 -rwxr-xr-x vagrant/vagrant 5011 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/run-autorun.sh 00:46:58.325 -rwxr-xr-x vagrant/vagrant 3314 2024-06-07 12:49 spdk-test_gen_spec/scripts/vagrant/update.sh 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/bash-completion/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 6483 2024-06-07 12:49 spdk-test_gen_spec/scripts/bash-completion/spdk 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/ 00:46:58.325 -rwxr-xr-x vagrant/vagrant 2055 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/gen.py 00:46:58.325 -rwxr-xr-x vagrant/vagrant 1535 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/gen_enums.sh 00:46:58.325 -rw-r--r-- vagrant/vagrant 1494 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/intr-wakeups.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 798 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/nvme.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 6756 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/nvmf.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 131 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/nvmf_path.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 570 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/nvmf_timeout.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 83 2024-06-07 12:49 
spdk-test_gen_spec/scripts/bpf/readv.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 1764 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/sched.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 158 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/send_msg.bt 00:46:58.325 -rw-r--r-- vagrant/vagrant 150 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/syscalls.bt 00:46:58.325 -rwxr-xr-x vagrant/vagrant 23816 2024-06-07 12:49 spdk-test_gen_spec/scripts/bpf/trace.py 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/dpdkbuild/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 9011 2024-06-07 12:49 spdk-test_gen_spec/dpdkbuild/Makefile 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/ 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/event/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 458 2024-06-07 12:49 spdk-test_gen_spec/lib/event/Makefile 00:46:58.325 -rw-r--r-- vagrant/vagrant 45400 2024-06-07 12:49 spdk-test_gen_spec/lib/event/app.c 00:46:58.325 -rw-r--r-- vagrant/vagrant 21115 2024-06-07 12:49 spdk-test_gen_spec/lib/event/app_rpc.c 00:46:58.325 -rw-r--r-- vagrant/vagrant 718 2024-06-07 12:49 spdk-test_gen_spec/lib/event/event_internal.h 00:46:58.325 -rw-r--r-- vagrant/vagrant 8279 2024-06-07 12:49 spdk-test_gen_spec/lib/event/log_rpc.c 00:46:58.325 -rw-r--r-- vagrant/vagrant 39967 2024-06-07 12:49 spdk-test_gen_spec/lib/event/reactor.c 00:46:58.325 -rw-r--r-- vagrant/vagrant 983 2024-06-07 12:49 spdk-test_gen_spec/lib/event/scheduler_static.c 00:46:58.325 -rw-r--r-- vagrant/vagrant 854 2024-06-07 12:49 spdk-test_gen_spec/lib/event/spdk_event.map 00:46:58.325 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/lvol/ 00:46:58.325 -rw-r--r-- vagrant/vagrant 349 2024-06-07 12:49 spdk-test_gen_spec/lib/lvol/Makefile 00:46:58.326 -rw-r--r-- vagrant/vagrant 63107 2024-06-07 12:49 spdk-test_gen_spec/lib/lvol/lvol.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 828 2024-06-07 12:49 spdk-test_gen_spec/lib/lvol/spdk_lvol.map 00:46:58.326 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/ 00:46:58.326 -rw-r--r-- vagrant/vagrant 407 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/Makefile 00:46:58.326 -rw-r--r-- vagrant/vagrant 9689 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/dev.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 15308 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/lun.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 2174 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/port.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 3829 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/scsi.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 56857 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/scsi_bdev.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 5907 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/scsi_internal.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 29038 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/scsi_pr.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1172 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/scsi_rpc.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1355 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/spdk_scsi.map 00:46:58.326 -rw-r--r-- vagrant/vagrant 6170 2024-06-07 12:49 spdk-test_gen_spec/lib/scsi/task.c 00:46:58.326 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/vfu_tgt/ 00:46:58.326 -rw-r--r-- vagrant/vagrant 488 2024-06-07 12:49 spdk-test_gen_spec/lib/vfu_tgt/Makefile 00:46:58.326 -rw-r--r-- vagrant/vagrant 494 2024-06-07 12:49 spdk-test_gen_spec/lib/vfu_tgt/spdk_vfu_tgt.map 00:46:58.326 -rw-r--r-- vagrant/vagrant 
20121 2024-06-07 12:49 spdk-test_gen_spec/lib/vfu_tgt/tgt_endpoint.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 619 2024-06-07 12:49 spdk-test_gen_spec/lib/vfu_tgt/tgt_internal.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 1371 2024-06-07 12:49 spdk-test_gen_spec/lib/vfu_tgt/tgt_rpc.c 00:46:58.326 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ 00:46:58.326 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ 00:46:58.326 -rw-r--r-- vagrant/vagrant 1409 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_addr_utils.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 3876 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_bitmap.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 2751 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_bitmap.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 3955 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_conf.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_conf.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 720 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_defs.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 980 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_df.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 13421 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_layout_tracker_bdev.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 4095 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_layout_tracker_bdev.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 881 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_log.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 27753 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_md.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 10356 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_md.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 6104 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_mempool.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 5274 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_mempool.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 7775 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_property.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 5566 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/utils/ftl_property.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 14433 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_reloc.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1924 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_rq.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 4387 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_sb.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 875 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_sb.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 2636 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_sb_common.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 1758 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_sb_current.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 8972 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_trace.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1529 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_trace.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 3591 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_l2p_flat.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1135 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_l2p_flat.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 25931 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_layout.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 9293 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_layout.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 70846 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_nv_cache.c 00:46:58.326 -rw-r--r-- 
vagrant/vagrant 8371 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_nv_cache.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 1188 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_nv_cache_io.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 15930 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_p2l.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 5233 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_internal.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 5014 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_io.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 8027 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_io.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 7081 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_l2p.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 2038 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_l2p.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 41528 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_l2p_cache.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1290 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_l2p_cache.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 1999 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/Makefile 00:46:58.326 -rw-r--r-- vagrant/vagrant 16708 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_band.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 8466 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_core.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 5532 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_debug.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1161 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_debug.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 4250 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_init.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 8085 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_band.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 13906 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_band_ops.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 20678 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_core.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 441 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/spdk_ftl.map 00:46:58.326 -rw-r--r-- vagrant/vagrant 341 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_utils.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 5322 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_writer.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1725 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/ftl_writer.h 00:46:58.326 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ 00:46:58.326 -rw-r--r-- vagrant/vagrant 3104 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_bdev.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1752 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_dev.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1816 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_dev.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 3908 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_bdev.gcno 00:46:58.326 -rw-r--r-- vagrant/vagrant 3372 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_dev.gcno 00:46:58.326 -rw-r--r-- vagrant/vagrant 4252 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_bdev.d 00:46:58.326 -rw-r--r-- vagrant/vagrant 80416 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_bdev.o 00:46:58.326 -rw-r--r-- vagrant/vagrant 4065 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_dev.d 00:46:58.326 -rw-r--r-- vagrant/vagrant 77824 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_dev.o 00:46:58.326 -rw-r--r-- vagrant/vagrant 188 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_bdev.gcda 00:46:58.326 -rw-r--r-- vagrant/vagrant 280 
2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/base/ftl_base_dev.gcda 00:46:58.326 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ 00:46:58.326 -rw-r--r-- vagrant/vagrant 19516 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_md.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 11645 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_misc.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 3597 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_p2l.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 30453 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_recovery.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 5082 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_self_test.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 2764 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_shutdown.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 8413 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_startup.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 6047 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_steps.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 3502 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_upgrade.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1260 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_l2p.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 12782 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 10417 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 11978 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_band.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 7360 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_bdev.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 4810 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/mngt/ftl_mngt_ioch.c 00:46:58.326 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ 00:46:58.326 -rw-r--r-- vagrant/vagrant 3708 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_bdev_vss.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 1855 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_dev.c 00:46:58.326 -rw-r--r-- vagrant/vagrant 2585 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_dev.h 00:46:58.326 -rw-r--r-- vagrant/vagrant 3396 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_dev.gcno 00:46:58.326 -rw-r--r-- vagrant/vagrant 5840 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_bdev_vss.gcno 00:46:58.326 -rw-r--r-- vagrant/vagrant 4094 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_bdev_vss.d 00:46:58.326 -rw-r--r-- vagrant/vagrant 81320 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:46:58.326 -rw-r--r-- vagrant/vagrant 2697 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_dev.d 00:46:58.327 -rw-r--r-- vagrant/vagrant 54952 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_dev.o 00:46:58.327 -rw-r--r-- vagrant/vagrant 216 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_bdev_vss.gcda 00:46:58.327 -rw-r--r-- vagrant/vagrant 280 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/nvc/ftl_nvc_dev.gcda 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 4739 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_band_upgrade.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 4572 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_chunk_upgrade.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 9922 2024-06-07 12:49 
spdk-test_gen_spec/lib/ftl/upgrade/ftl_layout_upgrade.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 6097 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_layout_upgrade.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 3611 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_p2l_upgrade.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 3970 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_sb_prev.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 4987 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_sb_upgrade.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 565 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_sb_upgrade.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 4699 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_sb_v3.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 736 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_sb_v3.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 21561 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_sb_v5.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 901 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_sb_v5.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 3626 2024-06-07 12:49 spdk-test_gen_spec/lib/ftl/upgrade/ftl_trim_upgrade.c 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/mlx5/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 414 2024-06-07 12:49 spdk-test_gen_spec/lib/mlx5/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 9454 2024-06-07 12:49 spdk-test_gen_spec/lib/mlx5/mlx5_crypto.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 197 2024-06-07 12:49 spdk-test_gen_spec/lib/mlx5/spdk_mlx5.map 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/sock/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 435 2024-06-07 12:49 spdk-test_gen_spec/lib/sock/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 21926 2024-06-07 12:49 spdk-test_gen_spec/lib/sock/sock.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 7341 2024-06-07 12:49 spdk-test_gen_spec/lib/sock/sock_rpc.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 1124 2024-06-07 12:49 spdk-test_gen_spec/lib/sock/spdk_sock.map 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 445 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 54363 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/rte_vhost_user.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 681 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/spdk_vhost.map 00:46:58.327 -rw-r--r-- vagrant/vagrant 12223 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/vhost.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 59285 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/vhost_blk.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 21689 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/vhost_internal.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 17048 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/vhost_rpc.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 49204 2024-06-07 12:49 spdk-test_gen_spec/lib/vhost/vhost_scsi.c 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/accel/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 1338 2024-06-07 12:49 spdk-test_gen_spec/lib/accel/accel_internal.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 464 2024-06-07 12:49 spdk-test_gen_spec/lib/accel/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 92142 2024-06-07 12:49 spdk-test_gen_spec/lib/accel/accel.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 14616 2024-06-07 12:49 spdk-test_gen_spec/lib/accel/accel_rpc.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 20549 2024-06-07 12:49 
spdk-test_gen_spec/lib/accel/accel_sw.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 1577 2024-06-07 12:49 spdk-test_gen_spec/lib/accel/spdk_accel.map 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/idxd/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 423 2024-06-07 12:49 spdk-test_gen_spec/lib/idxd/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 47899 2024-06-07 12:49 spdk-test_gen_spec/lib/idxd/idxd.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 3662 2024-06-07 12:49 spdk-test_gen_spec/lib/idxd/idxd_internal.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 5562 2024-06-07 12:49 spdk-test_gen_spec/lib/idxd/idxd_kernel.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 16813 2024-06-07 12:49 spdk-test_gen_spec/lib/idxd/idxd_user.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 627 2024-06-07 12:49 spdk-test_gen_spec/lib/idxd/spdk_idxd.map 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/nbd/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 355 2024-06-07 12:49 spdk-test_gen_spec/lib/nbd/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 24586 2024-06-07 12:49 spdk-test_gen_spec/lib/nbd/nbd.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 614 2024-06-07 12:49 spdk-test_gen_spec/lib/nbd/nbd_internal.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 9767 2024-06-07 12:49 spdk-test_gen_spec/lib/nbd/nbd_rpc.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 161 2024-06-07 12:49 spdk-test_gen_spec/lib/nbd/spdk_nbd.map 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/thread/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 363 2024-06-07 12:49 spdk-test_gen_spec/lib/thread/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 18397 2024-06-07 12:49 spdk-test_gen_spec/lib/thread/iobuf.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 2578 2024-06-07 12:49 spdk-test_gen_spec/lib/thread/spdk_thread.map 00:46:58.327 -rw-r--r-- vagrant/vagrant 75339 2024-06-07 12:49 spdk-test_gen_spec/lib/thread/thread.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 1100 2024-06-07 12:49 spdk-test_gen_spec/lib/thread/thread_internal.h 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/virtio/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 430 2024-06-07 12:49 spdk-test_gen_spec/lib/virtio/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 777 2024-06-07 12:49 spdk-test_gen_spec/lib/virtio/spdk_virtio.map 00:46:58.327 -rw-r--r-- vagrant/vagrant 17570 2024-06-07 12:49 spdk-test_gen_spec/lib/virtio/virtio.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 19796 2024-06-07 12:49 spdk-test_gen_spec/lib/virtio/virtio_pci.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 13967 2024-06-07 12:49 spdk-test_gen_spec/lib/virtio/virtio_vfio_user.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 27035 2024-06-07 12:49 spdk-test_gen_spec/lib/virtio/virtio_vhost_user.c 00:46:58.327 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/ 00:46:58.327 -rw-r--r-- vagrant/vagrant 479 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/Makefile 00:46:58.327 -rw-r--r-- vagrant/vagrant 293821 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/bdev.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 976 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/bdev_internal.h 00:46:58.327 -rw-r--r-- vagrant/vagrant 33468 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/bdev_rpc.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 5654 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/bdev_zone.c 00:46:58.327 -rw-r--r-- vagrant/vagrant 18591 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/part.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 8783 2024-06-07 12:49 
spdk-test_gen_spec/lib/bdev/scsi_nvme.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 5279 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/spdk_bdev.map 00:46:58.328 -rw-r--r-- vagrant/vagrant 498 2024-06-07 12:49 spdk-test_gen_spec/lib/bdev/vtune.c 00:46:58.328 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/init/ 00:46:58.328 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/lib/init/Makefile 00:46:58.328 -rw-r--r-- vagrant/vagrant 19727 2024-06-07 12:49 spdk-test_gen_spec/lib/init/json_config.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 5416 2024-06-07 12:49 spdk-test_gen_spec/lib/init/rpc.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 401 2024-06-07 12:49 spdk-test_gen_spec/lib/init/spdk_init.map 00:46:58.328 -rw-r--r-- vagrant/vagrant 6738 2024-06-07 12:49 spdk-test_gen_spec/lib/init/subsystem.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 840 2024-06-07 12:49 spdk-test_gen_spec/lib/init/subsystem.h 00:46:58.328 -rw-r--r-- vagrant/vagrant 4163 2024-06-07 12:49 spdk-test_gen_spec/lib/init/subsystem_rpc.c 00:46:58.328 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/notify/ 00:46:58.328 -rw-r--r-- vagrant/vagrant 367 2024-06-07 12:49 spdk-test_gen_spec/lib/notify/Makefile 00:46:58.328 -rw-r--r-- vagrant/vagrant 2805 2024-06-07 12:49 spdk-test_gen_spec/lib/notify/notify.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 2722 2024-06-07 12:49 spdk-test_gen_spec/lib/notify/notify_rpc.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 156 2024-06-07 12:49 spdk-test_gen_spec/lib/notify/spdk_notify.map 00:46:58.328 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/trace/ 00:46:58.328 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/lib/trace/Makefile 00:46:58.328 -rw-r--r-- vagrant/vagrant 931 2024-06-07 12:49 spdk-test_gen_spec/lib/trace/spdk_trace.map 00:46:58.328 -rw-r--r-- vagrant/vagrant 11188 2024-06-07 12:49 spdk-test_gen_spec/lib/trace/trace.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 15562 2024-06-07 12:49 spdk-test_gen_spec/lib/trace/trace_flags.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 340 2024-06-07 12:49 spdk-test_gen_spec/lib/trace/trace_internal.h 00:46:58.328 -rw-r--r-- vagrant/vagrant 7632 2024-06-07 12:49 spdk-test_gen_spec/lib/trace/trace_rpc.c 00:46:58.328 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/vmd/ 00:46:58.328 -rw-r--r-- vagrant/vagrant 351 2024-06-07 12:49 spdk-test_gen_spec/lib/vmd/Makefile 00:46:58.328 -rw-r--r-- vagrant/vagrant 4351 2024-06-07 12:49 spdk-test_gen_spec/lib/vmd/led.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 226 2024-06-07 12:49 spdk-test_gen_spec/lib/vmd/spdk_vmd.map 00:46:58.328 -rw-r--r-- vagrant/vagrant 42404 2024-06-07 12:49 spdk-test_gen_spec/lib/vmd/vmd.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 3885 2024-06-07 12:49 spdk-test_gen_spec/lib/vmd/vmd_internal.h 00:46:58.328 -rw-r--r-- vagrant/vagrant 11876 2024-06-07 12:49 spdk-test_gen_spec/lib/vmd/vmd_spec.h 00:46:58.328 -rw-r--r-- vagrant/vagrant 1383 2024-06-07 12:49 spdk-test_gen_spec/lib/Makefile 00:46:58.328 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/lib/blob/ 00:46:58.328 -rw-r--r-- vagrant/vagrant 387 2024-06-07 12:49 spdk-test_gen_spec/lib/blob/Makefile 00:46:58.328 -rw-r--r-- vagrant/vagrant 8928 2024-06-07 12:49 spdk-test_gen_spec/lib/blob/blob_bs_dev.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 280704 2024-06-07 12:49 spdk-test_gen_spec/lib/blob/blobstore.c 00:46:58.328 -rw-r--r-- vagrant/vagrant 18719 2024-06-07 12:49 spdk-test_gen_spec/lib/blob/blobstore.h 00:46:58.328 -rw-r--r-- 
00:46:58.328 [recursive file listing of the generated spdk-test_gen_spec/ tree; per-file mode/size/date entries condensed, directory coverage retained]
00:46:58.328 lib/: blob, ioat, nvme, trace_parser, blobfs, iscsi, nvmf, ublk, conf, json, rdma, ut, dma, jsonrpc, reduce, ut_mock, env_dpdk (22.07, 22.11), keyring, rocksdb, util, env_ocf, log, rpc, vfio_user/host
00:46:58.330 shared_lib/
00:46:58.330 examples/: interrupt_tgt, ioat (perf, verify), nvme (hello_world, hotplug, nvme_manage, pmr_persistence, reconnect, abort, arbitration, cmb_copy), nvmf, sock, accel, thread, bdev (hello_world, bdevperf), util (zipf), blob (cli, hello_world), vmd (lsvmd, led), go (hello_gorpc), idxd (perf)
00:46:58.331 libvfio-user/: .github (workflows), docs, include (pci_caps), lib, samples, test (py)
00:46:58.332 test/: dd, iscsi_tgt (filesystem, reset, fio, resize, initiator, rpc_config, ip_migration, sock, login_redirection, trace_record, lvol, bdev_io_wait, multiconnection, calsoft, perf, digests, qos, ext4test, rbd), rpc, vfio_user (nvme, virtio), dma, json_config, rpc_client, vhost (integrity, lvol, migration, nvmf, common, other, fio, perf_bench, fiotest, readonly, fuzz, shared, hotplug, vhost_boot, initiator, windows), accel (dif), dpdk_memory_utility, keyring, rpc_plugins, vmd, app (bdev_svc, fuzz)
12:49 spdk-test_gen_spec/test/app/fuzz/vhost_fuzz/example.json 00:46:58.334 -rw-r--r-- vagrant/vagrant 31535 2024-06-07 12:49 spdk-test_gen_spec/test/app/fuzz/vhost_fuzz/vhost_fuzz.c 00:46:58.334 -rw-r--r-- vagrant/vagrant 320 2024-06-07 12:49 spdk-test_gen_spec/test/app/fuzz/vhost_fuzz/vhost_fuzz.h 00:46:58.334 -rw-r--r-- vagrant/vagrant 2881 2024-06-07 12:49 spdk-test_gen_spec/test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/app/histogram_perf/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 15 2024-06-07 12:49 spdk-test_gen_spec/test/app/histogram_perf/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 322 2024-06-07 12:49 spdk-test_gen_spec/test/app/histogram_perf/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 1575 2024-06-07 12:49 spdk-test_gen_spec/test/app/histogram_perf/histogram_perf.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/app/jsoncat/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/test/app/jsoncat/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 313 2024-06-07 12:49 spdk-test_gen_spec/test/app/jsoncat/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 3191 2024-06-07 12:49 spdk-test_gen_spec/test/app/jsoncat/jsoncat.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/app/match/ 00:46:58.335 -rwxr-xr-x vagrant/vagrant 7618 2024-06-07 12:49 spdk-test_gen_spec/test/app/match/match 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/app/stub/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 5 2024-06-07 12:49 spdk-test_gen_spec/test/app/stub/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 363 2024-06-07 12:49 spdk-test_gen_spec/test/app/stub/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 4553 2024-06-07 12:49 spdk-test_gen_spec/test/app/stub/stub.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/env/ 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/env/pci/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 7 2024-06-07 12:49 spdk-test_gen_spec/test/env/pci/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 272 2024-06-07 12:49 spdk-test_gen_spec/test/env/pci/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 5302 2024-06-07 12:49 spdk-test_gen_spec/test/env/pci/pci_ut.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/env/vtophys/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/test/env/vtophys/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 376 2024-06-07 12:49 spdk-test_gen_spec/test/env/vtophys/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 4019 2024-06-07 12:49 spdk-test_gen_spec/test/env/vtophys/vtophys.c 00:46:58.335 -rw-r--r-- vagrant/vagrant 447 2024-06-07 12:49 spdk-test_gen_spec/test/env/Makefile 00:46:58.335 -rwxr-xr-x vagrant/vagrant 1094 2024-06-07 12:49 spdk-test_gen_spec/test/env/env.sh 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/env/env_dpdk_post_init/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/test/env/env_dpdk_post_init/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 252 2024-06-07 12:49 spdk-test_gen_spec/test/env/env_dpdk_post_init/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 2198 2024-06-07 12:49 spdk-test_gen_spec/test/env/env_dpdk_post_init/env_dpdk_post_init.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/test/env/mem_callbacks/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 14 2024-06-07 12:49 spdk-test_gen_spec/test/env/mem_callbacks/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 321 2024-06-07 12:49 spdk-test_gen_spec/test/env/mem_callbacks/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 4787 2024-06-07 12:49 spdk-test_gen_spec/test/env/mem_callbacks/mem_callbacks.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/env/memory/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 10 2024-06-07 12:49 spdk-test_gen_spec/test/env/memory/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 290 2024-06-07 12:49 spdk-test_gen_spec/test/env/memory/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 15747 2024-06-07 12:49 spdk-test_gen_spec/test/env/memory/memory_ut.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/Makefile 00:46:58.335 -rwxr-xr-x vagrant/vagrant 24553 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/basic.sh 00:46:58.335 -rw-r--r-- vagrant/vagrant 1528 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/common.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 3635 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/external_copy.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 29483 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/external_snapshot.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 8121 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/hotremove.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 996 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/lvol.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 9847 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/rename.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 9048 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/resize.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 39065 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/snapshot_clone.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 6753 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/tasting.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 10330 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/thin_provisioning.sh 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/esnap/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 6 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/esnap/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 433 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/esnap/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 25050 2024-06-07 12:49 spdk-test_gen_spec/test/lvol/esnap/esnap.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 6510 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/cgroups.sh 00:46:58.335 -rw-r--r-- vagrant/vagrant 21426 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/common.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 5613 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/governor.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 1900 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/idle.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 2694 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/interrupt.sh 00:46:58.335 -rw-r--r-- vagrant/vagrant 1898 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/isolate_cores.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 8510 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/load_balancing.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 785 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/rdmsr.pl 00:46:58.335 -rwxr-xr-x vagrant/vagrant 
513 2024-06-07 12:49 spdk-test_gen_spec/test/scheduler/scheduler.sh 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 320 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/Makefile 00:46:58.335 -rwxr-xr-x vagrant/vagrant 31756 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdev_raid.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 26425 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/blockdev.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 8345 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/chaining.sh 00:46:58.335 -rw-r--r-- vagrant/vagrant 3425 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/nbd_common.sh 00:46:58.335 -rw-r--r-- vagrant/vagrant 33 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/nonarray.json 00:46:58.335 -rw-r--r-- vagrant/vagrant 39 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/nonenclosed.json 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevio/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 427 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevio/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 7 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevio/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 41590 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevio/bdevio.c 00:46:58.335 -rwxr-xr-x vagrant/vagrant 3185 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevio/tests.py 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevperf/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 690 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevperf/common.sh 00:46:58.335 -rw-r--r-- vagrant/vagrant 473 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevperf/conf.json 00:46:58.335 -rwxr-xr-x vagrant/vagrant 1364 2024-06-07 12:49 spdk-test_gen_spec/test/bdev/bdevperf/test_config.sh 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/event/ 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/event/app_repeat/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/test/event/app_repeat/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 727 2024-06-07 12:49 spdk-test_gen_spec/test/event/app_repeat/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 1568 2024-06-07 12:49 spdk-test_gen_spec/test/event/app_repeat/app_repeat.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/event/event_perf/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/test/event/event_perf/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 304 2024-06-07 12:49 spdk-test_gen_spec/test/event/event_perf/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 2999 2024-06-07 12:49 spdk-test_gen_spec/test/event/event_perf/event_perf.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 298 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 2270 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor/reactor.c 00:46:58.335 -rw-r--r-- vagrant/vagrant 402 2024-06-07 12:49 spdk-test_gen_spec/test/event/Makefile 00:46:58.335 -rwxr-xr-x vagrant/vagrant 4467 2024-06-07 12:49 spdk-test_gen_spec/test/event/cpu_locks.sh 00:46:58.335 -rwxr-xr-x vagrant/vagrant 1791 2024-06-07 12:49 spdk-test_gen_spec/test/event/event.sh 00:46:58.335 
drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor_perf/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 13 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor_perf/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 308 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor_perf/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 2224 2024-06-07 12:49 spdk-test_gen_spec/test/event/reactor_perf/reactor_perf.c 00:46:58.335 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/event/scheduler/ 00:46:58.335 -rw-r--r-- vagrant/vagrant 10 2024-06-07 12:49 spdk-test_gen_spec/test/event/scheduler/.gitignore 00:46:58.335 -rw-r--r-- vagrant/vagrant 398 2024-06-07 12:49 spdk-test_gen_spec/test/event/scheduler/Makefile 00:46:58.335 -rw-r--r-- vagrant/vagrant 11396 2024-06-07 12:49 spdk-test_gen_spec/test/event/scheduler/scheduler.c 00:46:58.335 -rwxr-xr-x vagrant/vagrant 1964 2024-06-07 12:49 spdk-test_gen_spec/test/event/scheduler/scheduler.sh 00:46:58.336 -rw-r--r-- vagrant/vagrant 1650 2024-06-07 12:49 spdk-test_gen_spec/test/event/scheduler/scheduler_plugin.py 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/make/ 00:46:58.336 -rwxr-xr-x vagrant/vagrant 11332 2024-06-07 12:49 spdk-test_gen_spec/test/make/check_so_deps.sh 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/setup/ 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/setup/dm_mount/ 00:46:58.336 -rwxr-xr-x vagrant/vagrant 1193 2024-06-07 12:49 spdk-test_gen_spec/test/setup/acl.sh 00:46:58.336 -rw-r--r-- vagrant/vagrant 1828 2024-06-07 12:49 spdk-test_gen_spec/test/setup/common.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 4720 2024-06-07 12:49 spdk-test_gen_spec/test/setup/devices.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 1467 2024-06-07 12:49 spdk-test_gen_spec/test/setup/driver.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 5332 2024-06-07 12:49 spdk-test_gen_spec/test/setup/hugepages.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 434 2024-06-07 12:49 spdk-test_gen_spec/test/setup/test-setup.sh 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/setup/nvme_mount/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 570 2024-06-07 12:49 spdk-test_gen_spec/test/Makefile 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/ 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/rocksdb/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/rocksdb/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 621 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/rocksdb/common_flags.txt 00:46:58.336 -rwxr-xr-x vagrant/vagrant 2903 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/rocksdb/postprocess.py 00:46:58.336 -rwxr-xr-x vagrant/vagrant 4624 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/rocksdb/rocksdb.sh 00:46:58.336 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/rocksdb/rocksdb_commit_id 00:46:58.336 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/Makefile 00:46:58.336 -rwxr-xr-x vagrant/vagrant 3539 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/blobfs.sh 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/fuse/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 5 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/fuse/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 362 2024-06-07 12:49 
spdk-test_gen_spec/test/blobfs/fuse/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 1692 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/fuse/fuse.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/mkfs/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 5 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/mkfs/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 368 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/mkfs/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 1655 2024-06-07 12:49 spdk-test_gen_spec/test/blobfs/mkfs/mkfs.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/ 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/passthru/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 649 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/passthru/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 23788 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/passthru/vbdev_passthru.c 00:46:58.336 -rw-r--r-- vagrant/vagrant 895 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/passthru/vbdev_passthru.h 00:46:58.336 -rw-r--r-- vagrant/vagrant 3449 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/passthru/vbdev_passthru_rpc.c 00:46:58.336 -rw-r--r-- vagrant/vagrant 2385 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 918 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/README.md 00:46:58.336 -rwxr-xr-x vagrant/vagrant 5255 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/test_make.sh 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 1336 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 4217 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/app_driver.c 00:46:58.336 -rw-r--r-- vagrant/vagrant 4023 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/app_module.c 00:46:58.336 -rw-r--r-- vagrant/vagrant 2018 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/driver.c 00:46:58.336 -rw-r--r-- vagrant/vagrant 217 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/driver.json 00:46:58.336 -rw-r--r-- vagrant/vagrant 4823 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/module.c 00:46:58.336 -rw-r--r-- vagrant/vagrant 568 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/accel/module.json 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/hello_world/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/hello_world/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 2343 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/hello_world/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 279 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/hello_world/bdev.json 00:46:58.336 -rw-r--r-- vagrant/vagrant 455 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/hello_world/bdev_external.json 00:46:58.336 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/hello_world/hello_bdev.c -> ./../../../examples/bdev/hello_world/hello_bdev.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/nvme/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 9 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/nvme/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 
723 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/nvme/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 3745 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/nvme/identify.c 00:46:58.336 -rwxr-xr-x vagrant/vagrant 719 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/nvme/identify.sh 00:46:58.336 -rw-r--r-- vagrant/vagrant 16498 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/nvme/nvme.c 00:46:58.336 -rw-r--r-- vagrant/vagrant 2374 2024-06-07 12:49 spdk-test_gen_spec/test/external_code/nvme/nvme.h 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/ 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/compliance/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 16 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/compliance/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 261 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/compliance/Makefile 00:46:58.336 -rwxr-xr-x vagrant/vagrant 1099 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/compliance/compliance.sh 00:46:58.336 -rw-r--r-- vagrant/vagrant 44691 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/compliance/nvme_compliance.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reserve/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reserve/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 216 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reserve/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 10647 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reserve/reserve.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/connect_stress/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 15 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/connect_stress/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 223 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/connect_stress/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 7390 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/connect_stress/connect_stress.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reset/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 6 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reset/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 214 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reset/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 15754 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/reset/reset.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 5 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/Makefile 00:46:58.336 -rwxr-xr-x vagrant/vagrant 297 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/common.sh 00:46:58.336 -rw-r--r-- vagrant/vagrant 5292 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/cuse.c 00:46:58.336 -rwxr-xr-x vagrant/vagrant 766 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/nvme_cuse.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 1268 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/nvme_cuse_rpc.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 3108 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/nvme_ns_manage_cuse.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/spdk_nvme_cli_cuse.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 2029 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/spdk_nvme_cli_plugin.sh 
00:46:58.336 -rwxr-xr-x vagrant/vagrant 2333 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cuse/spdk_smartctl_cuse.sh 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/sgl/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 4 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/sgl/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 212 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/sgl/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 12739 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/sgl/sgl.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/doorbell_aers/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 14 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/doorbell_aers/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/doorbell_aers/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 7313 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/doorbell_aers/doorbell_aers.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/simple_copy/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 12 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/simple_copy/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 226 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/simple_copy/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 11863 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/simple_copy/simple_copy.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/e2edp/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/e2edp/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 216 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/e2edp/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 17321 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/e2edp/nvme_dp.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/startup/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 8 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/startup/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 216 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/startup/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 4732 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/startup/startup.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/err_injection/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 14 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/err_injection/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/err_injection/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 5539 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/err_injection/err_injection.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/xnvme/ 00:46:58.336 -rwxr-xr-x vagrant/vagrant 1904 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/xnvme/xnvme.sh 00:46:58.336 -rw-r--r-- vagrant/vagrant 497 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/Makefile 00:46:58.336 -rwxr-xr-x vagrant/vagrant 2452 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/hw_hotplug.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 3477 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 376 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_bp.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 450 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_fdp.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 4309 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_opal.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 594 
2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_pmr.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 2877 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_reset_stuck_adm_cmd.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 986 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_rpc.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 2270 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_rpc_timeouts.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 657 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/nvme_scc.sh 00:46:58.336 -rwxr-xr-x vagrant/vagrant 4249 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/sw_hotplug.sh 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fdp/ 00:46:58.336 -rw-r--r-- vagrant/vagrant 4 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fdp/.gitignore 00:46:58.336 -rw-r--r-- vagrant/vagrant 218 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fdp/Makefile 00:46:58.336 -rw-r--r-- vagrant/vagrant 30169 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fdp/fdp.c 00:46:58.336 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/zns/ 00:46:58.336 -rwxr-xr-x vagrant/vagrant 1985 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/zns/zns.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/aer/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 4 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/aer/.gitignore 00:46:58.337 -rw-r--r-- vagrant/vagrant 212 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/aer/Makefile 00:46:58.337 -rw-r--r-- vagrant/vagrant 19108 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/aer/aer.c 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fused_ordering/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 15 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fused_ordering/.gitignore 00:46:58.337 -rw-r--r-- vagrant/vagrant 223 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fused_ordering/Makefile 00:46:58.337 -rw-r--r-- vagrant/vagrant 9143 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/fused_ordering/fused_ordering.c 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/boot_partition/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 15 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/boot_partition/.gitignore 00:46:58.337 -rw-r--r-- vagrant/vagrant 229 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/boot_partition/Makefile 00:46:58.337 -rw-r--r-- vagrant/vagrant 5881 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/boot_partition/boot_partition.c 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/overhead/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 9 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/overhead/.gitignore 00:46:58.337 -rw-r--r-- vagrant/vagrant 285 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/overhead/Makefile 00:46:58.337 -rw-r--r-- vagrant/vagrant 1004 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/overhead/README 00:46:58.337 -rw-r--r-- vagrant/vagrant 17096 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/overhead/overhead.c 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cmb/ 00:46:58.337 -rwxr-xr-x vagrant/vagrant 3183 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cmb/cmb.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 693 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/cmb/cmb_copy.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/perf/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 4208 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/perf/README.md 00:46:58.337 
-rwxr-xr-x vagrant/vagrant 16518 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/perf/common.sh 00:46:58.337 -rw-r--r-- vagrant/vagrant 72 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/perf/config.fio.tmp 00:46:58.337 -rwxr-xr-x vagrant/vagrant 17833 2024-06-07 12:49 spdk-test_gen_spec/test/nvme/perf/run_perf.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/sma/ 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/sma/plugins/ 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/sma/plugins/plugin1/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 1303 2024-06-07 12:49 spdk-test_gen_spec/test/sma/plugins/plugin1/__init__.py 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/sma/plugins/plugin2/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 1303 2024-06-07 12:49 spdk-test_gen_spec/test/sma/plugins/plugin2/__init__.py 00:46:58.337 -rw-r--r-- vagrant/vagrant 903 2024-06-07 12:49 spdk-test_gen_spec/test/sma/common.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 8253 2024-06-07 12:49 spdk-test_gen_spec/test/sma/crypto.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 14436 2024-06-07 12:49 spdk-test_gen_spec/test/sma/discovery.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 4391 2024-06-07 12:49 spdk-test_gen_spec/test/sma/nvmf_tcp.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 4408 2024-06-07 12:49 spdk-test_gen_spec/test/sma/plugins.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 4749 2024-06-07 12:49 spdk-test_gen_spec/test/sma/qos.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 568 2024-06-07 12:49 spdk-test_gen_spec/test/sma/sma.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 12003 2024-06-07 12:49 spdk-test_gen_spec/test/sma/vfiouser_qemu.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 8628 2024-06-07 12:49 spdk-test_gen_spec/test/sma/vhost_blk.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/ 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/blob_io_wait/ 00:46:58.337 -rwxr-xr-x vagrant/vagrant 1781 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/blob_io_wait/blob_io_wait.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/blobstore_grow/ 00:46:58.337 -rwxr-xr-x vagrant/vagrant 1857 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/blobstore_grow/blobstore_grow.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 1082 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/blobstore.sh 00:46:58.337 -rw-r--r-- vagrant/vagrant 50 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/btest.out.ignore 00:46:58.337 -rw-r--r-- vagrant/vagrant 2599 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/btest.out.match 00:46:58.337 -rw-r--r-- vagrant/vagrant 119 2024-06-07 12:49 spdk-test_gen_spec/test/blobstore/test.bs 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/ 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/ 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 194 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/drive-prep.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randr.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 242 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randrw.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 288 2024-06-07 12:49 
spdk-test_gen_spec/test/ftl/config/fio/randw-verify-depth128.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 349 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randw-verify-j2.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 287 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randw-verify-qd128-ext.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 302 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randw-verify-qd2048-ext.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 301 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randw-verify-qd256-nght.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randw-verify.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/randw.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 353 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/unmap.fio 00:46:58.337 -rw-r--r-- vagrant/vagrant 261 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/config/fio/write_after_write.fio 00:46:58.337 -rwxr-xr-x vagrant/vagrant 1415 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/bdevperf.sh 00:46:58.337 -rw-r--r-- vagrant/vagrant 5346 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/common.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 2833 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/dirty_shutdown.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 2006 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/fio.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 2563 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/ftl.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 2374 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/restore.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 2969 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/trim.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 3356 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/upgrade_shutdown.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 2122 2024-06-07 12:49 spdk-test_gen_spec/test/ftl/write_after_write.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/ 00:46:58.337 -rw-r--r-- vagrant/vagrant 253 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/README.md 00:46:58.337 -rw-r--r-- vagrant/vagrant 21531 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/common.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 6330 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/nvmf.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/fips/ 00:46:58.337 -rwxr-xr-x vagrant/vagrant 4174 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/fips/fips.sh 00:46:58.337 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/ 00:46:58.337 -rwxr-xr-x vagrant/vagrant 4753 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/multipath_status.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/perf.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 2323 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/target_disconnect.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 5218 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/timeout.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 1479 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/aer.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 3006 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/async_init.sh 00:46:58.337 -rwxr-xr-x vagrant/vagrant 4516 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/auth.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1190 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/bdevperf.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4382 
2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/digest.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 6913 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/discovery.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2955 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/discovery_remove_ifc.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3708 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/dma.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 5414 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/failover.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3525 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/fio.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1823 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/identify.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 750 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/identify_kernel_nvmf.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 8716 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/mdns_discovery.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4986 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/multicontroller.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4480 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/host/multipath.sh 00:46:58.338 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/ 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1423 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/abort.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2185 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/abort_qd_sizes.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 7308 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/auth.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1651 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/bdev_io_wait.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1004 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/bdevio.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1398 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/connect_disconnect.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1185 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/connect_stress.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2496 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/delete_subsystem.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 7689 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/device_removal.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3260 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/dif.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1798 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/discovery.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1618 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/fabrics_fuzz.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3344 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/filesystem.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3095 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/fio.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 954 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/fused_ordering.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3247 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/host_management.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2407 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/identify_passthru.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2460 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/initiator_timeout.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1287 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/vfio_user_fuzz.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2217 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/zcopy.sh 00:46:58.338 -rwxr-xr-x 
vagrant/vagrant 3548 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/invalid.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1424 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/multiconnection.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4725 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/multipath.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1350 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/multitarget.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3618 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/multitarget_rpc.py 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1995 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/nmic.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2444 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/ns_hotplug_stress.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3188 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/ns_masking.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2343 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/nvme_cli.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1910 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/nvmf_example.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 2180 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/nvmf_lvol.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4361 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/nvmf_lvs_grow.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3451 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/nvmf_vfio_user.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4290 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/perf_adq.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1594 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/queue_depth.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3720 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/referrals.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4961 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/rpc.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 4046 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/shutdown.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1868 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/srq_overwhelm.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 9411 2024-06-07 12:49 spdk-test_gen_spec/test/nvmf/target/tls.sh 00:46:58.338 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/ 00:46:58.338 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/ 00:46:58.338 -rw-r--r-- vagrant/vagrant 163 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_details_lvs.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 719 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_details_vhost.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 403 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_details_vhost_ctrl.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 137 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_details_vhost_target.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 7776 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_iscsi.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 4144 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_nvmf.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 118 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_pmem_info.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 2018 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_raid.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 239 2024-06-07 
12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_rbd.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 5363 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_vhost.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 2364 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_virtio_pci.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 861 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/match_files/spdkcli_virtio_user.test.match 00:46:58.338 -rw-r--r-- vagrant/vagrant 1303 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/common.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 3543 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/iscsi.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 1274 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/raid.sh 00:46:58.338 -rwxr-xr-x vagrant/vagrant 939 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/rbd.sh 00:46:58.339 -rwxr-xr-x vagrant/vagrant 1985 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/spdkcli_job.py 00:46:58.339 -rwxr-xr-x vagrant/vagrant 906 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/tcp.sh 00:46:58.339 -rwxr-xr-x vagrant/vagrant 6875 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/vhost.sh 00:46:58.339 -rwxr-xr-x vagrant/vagrant 3142 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/virtio.sh 00:46:58.339 -rwxr-xr-x vagrant/vagrant 4669 2024-06-07 12:49 spdk-test_gen_spec/test/spdkcli/nvmf.sh 00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/ 00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/ 00:46:58.339 -rw-r--r-- vagrant/vagrant 631 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/README.md 00:46:58.339 -rwxr-xr-x vagrant/vagrant 3662 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/autotest_setup.sh 00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/ 00:46:58.339 -rw-r--r-- vagrant/vagrant 1208 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/apt-get 00:46:58.339 -rw-r--r-- vagrant/vagrant 17632 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/git 00:46:58.339 -rw-r--r-- vagrant/vagrant 750 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/pacman 00:46:58.339 -rw-r--r-- vagrant/vagrant 288 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/pkg 00:46:58.339 -rw-r--r-- vagrant/vagrant 770 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/tdnf 00:46:58.339 -rw-r--r-- vagrant/vagrant 2253 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/yum 00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/os/ 00:46:58.339 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/os/centos -> rhel 00:46:58.339 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/os/debian -> ubuntu 00:46:58.339 lrwxrwxrwx vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/os/fedora -> rhel 00:46:58.339 -rw-r--r-- vagrant/vagrant 262 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/os/mariner 00:46:58.339 -rw-r--r-- vagrant/vagrant 895 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/os/rhel 00:46:58.339 -rw-r--r-- vagrant/vagrant 96 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/os/ubuntu 00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/ 00:46:58.339 drwxr-xr-x 
vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/ice/
00:46:58.339 -rw-r--r-- vagrant/vagrant 2008 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/ice/0001-devlink_fmsg.patch
00:46:58.339 -rw-r--r-- vagrant/vagrant 3798 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/ice/0001-ethtool-set-get-rxfh-params.patch
00:46:58.339 -rw-r--r-- vagrant/vagrant 1914 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/ice/0001-xdp_do_flush_map.patch
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/irdma/
00:46:58.339 -rw-r--r-- vagrant/vagrant 1039 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/irdma/0001-irdma-avoid-fortify-string-warning.patch
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/qat/
00:46:58.339 -rw-r--r-- vagrant/vagrant 13089 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/qat/0001-missing-prototypes.patch
00:46:58.339 -rw-r--r-- vagrant/vagrant 1608 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/qat/0001-no-vmlinux.patch
00:46:58.339 -rw-r--r-- vagrant/vagrant 974 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/qat/0001-phys_proc_id.patch
00:46:58.339 -rw-r--r-- vagrant/vagrant 23473 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/qat/0001-strlcpy-to-strscpy.patch
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/dpdk/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/dpdk/20.11/
00:46:58.339 -rw-r--r-- vagrant/vagrant 675 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/dpdk/20.11/dpdk_pci.patch
00:46:58.339 -rw-r--r-- vagrant/vagrant 1420 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/dpdk/20.11/dpdk_qat.patch
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/dpdk/21.11+/
00:46:58.339 -rw-r--r-- vagrant/vagrant 352 2024-06-07 12:49 spdk-test_gen_spec/test/common/config/pkgdep/patches/dpdk/21.11+/dpdk_qat.patch
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/
00:46:58.339 -rw-r--r-- vagrant/vagrant 12278 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/test_env.c
00:46:58.339 -rw-r--r-- vagrant/vagrant 3150 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/test_iobuf.c
00:46:58.339 -rw-r--r-- vagrant/vagrant 3081 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/test_rdma.c
00:46:58.339 -rw-r--r-- vagrant/vagrant 2927 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/test_sock.c
00:46:58.339 -rw-r--r-- vagrant/vagrant 3574 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/ut_multithread.c
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/nvme/
00:46:58.339 -rw-r--r-- vagrant/vagrant 5356 2024-06-07 12:49 spdk-test_gen_spec/test/common/lib/nvme/common_stubs.h
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/common/nvme/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 4554 2024-06-07 12:49 spdk-test_gen_spec/test/common/nvme/functions.sh
00:46:58.339 -rw-r--r-- vagrant/vagrant 1635 2024-06-07 12:49 spdk-test_gen_spec/test/common/build_config.sh
00:46:58.339 -rw-r--r-- vagrant/vagrant 2080 2024-06-07 12:49 spdk-test_gen_spec/test/common/skipped_build_files.txt
00:46:58.339 -rw-r--r-- vagrant/vagrant 1350 2024-06-07 12:49 spdk-test_gen_spec/test/common/skipped_tests.txt
00:46:58.339 -rw-r--r-- vagrant/vagrant 962 2024-06-07 12:49 spdk-test_gen_spec/test/common/applications.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 14470 2024-06-07 12:49 spdk-test_gen_spec/test/common/autobuild_common.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 42923 2024-06-07 12:49 spdk-test_gen_spec/test/common/autotest_common.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1860 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/common.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 139 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/llvm-gcov.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/nvmf/
00:46:58.339 -rw-r--r-- vagrant/vagrant 3136 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/nvmf/fuzz_json.conf
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2074 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/nvmf/run.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/vfio/
00:46:58.339 -rw-r--r-- vagrant/vagrant 3516 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2353 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm/vfio/run.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1879 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/autofuzz.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1998 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/autofuzz_iscsi.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1592 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/autofuzz_nvmf.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2577 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/autofuzz_vhost.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1939 2024-06-07 12:49 spdk-test_gen_spec/test/fuzz/llvm.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/integrity/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/integrity/bdevperf-iotypes.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2165 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/integrity/fio-modes.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1615 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/integrity/flush.sh
00:46:58.339 -rw-r--r-- vagrant/vagrant 1186 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/integrity/mallocs.conf
00:46:58.339 -rwxr-xr-x vagrant/vagrant 578 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/integrity/stats.sh
00:46:58.339 -rw-r--r-- vagrant/vagrant 390 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/integrity/test.fio
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/management/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2132 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/management/configuration-change.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2303 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/management/create-destruct.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1825 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/management/multicore.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1971 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/management/remove.sh
00:46:58.339 -rw-r--r-- vagrant/vagrant 601 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/common.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 753 2024-06-07 12:49 spdk-test_gen_spec/test/ocf/ocf.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/thread/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/thread/lock/
00:46:58.339 -rw-r--r-- vagrant/vagrant 10 2024-06-07 12:49 spdk-test_gen_spec/test/thread/lock/.gitignore
00:46:58.339 -rw-r--r-- vagrant/vagrant 647 2024-06-07 12:49 spdk-test_gen_spec/test/thread/lock/Makefile
00:46:58.339 -rw-r--r-- vagrant/vagrant 12203 2024-06-07 12:49 spdk-test_gen_spec/test/thread/lock/spdk_lock.c
00:46:58.339 -rw-r--r-- vagrant/vagrant 647 2024-06-07 12:49 spdk-test_gen_spec/test/thread/Makefile
00:46:58.339 -rwxr-xr-x vagrant/vagrant 742 2024-06-07 12:49 spdk-test_gen_spec/test/thread/thread.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/thread/poller_perf/
00:46:58.339 -rw-r--r-- vagrant/vagrant 12 2024-06-07 12:49 spdk-test_gen_spec/test/thread/poller_perf/.gitignore
00:46:58.339 -rw-r--r-- vagrant/vagrant 313 2024-06-07 12:49 spdk-test_gen_spec/test/thread/poller_perf/Makefile
00:46:58.339 -rw-r--r-- vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/test/thread/poller_perf/poller_perf.c
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/compress/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 3726 2024-06-07 12:49 spdk-test_gen_spec/test/compress/compress.sh
00:46:58.339 -rw-r--r-- vagrant/vagrant 220 2024-06-07 12:49 spdk-test_gen_spec/test/compress/dpdk.json
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/interrupt/
00:46:58.339 -rw-r--r-- vagrant/vagrant 1813 2024-06-07 12:49 spdk-test_gen_spec/test/interrupt/common.sh
00:46:58.339 -rw-r--r-- vagrant/vagrant 714 2024-06-07 12:49 spdk-test_gen_spec/test/interrupt/interrupt_common.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2684 2024-06-07 12:49 spdk-test_gen_spec/test/interrupt/reactor_set_interrupt.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1660 2024-06-07 12:49 spdk-test_gen_spec/test/interrupt/reap_unregistered_poller.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/openstack/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 2893 2024-06-07 12:49 spdk-test_gen_spec/test/openstack/install_devstack.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 3483 2024-06-07 12:49 spdk-test_gen_spec/test/openstack/run_openstack_tests.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ublk/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 4253 2024-06-07 12:49 spdk-test_gen_spec/test/ublk/ublk.sh
00:46:58.339 -rwxr-xr-x vagrant/vagrant 1369 2024-06-07 12:49 spdk-test_gen_spec/test/ublk/ublk_recovery.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/cpp_headers/
00:46:58.339 -rw-r--r-- vagrant/vagrant 6 2024-06-07 12:49 spdk-test_gen_spec/test/cpp_headers/.gitignore
00:46:58.339 -rw-r--r-- vagrant/vagrant 857 2024-06-07 12:49 spdk-test_gen_spec/test/cpp_headers/Makefile
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/ioat/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 358 2024-06-07 12:49 spdk-test_gen_spec/test/ioat/ioat.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/
00:46:58.339 -rwxr-xr-x vagrant/vagrant 5418 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/rpm.sh
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/srcrpm/
00:46:58.339 -rw-r--r-- vagrant/vagrant 8829 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/srcrpm/spdk-v24.09-1.src.rpm
00:46:58.339 -rw-r--r-- vagrant/vagrant 4558 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/gen-spdk.spec
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/build/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/buildroot/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/rpm/
00:46:58.339 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/rpm/x86_64/
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/source/
00:46:58.340 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/source/spdk-v24.09.tar.gz
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/rpm/test-rpm/spec/
00:46:58.340 -rwxr-xr-x vagrant/vagrant 307 2024-06-07 12:49 spdk-test_gen_spec/test/packaging/packaging.sh
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_layout_upgrade/
00:46:58.340 -rw-r--r-- vagrant/vagrant 22 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_layout_upgrade/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 291 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_layout_upgrade/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 13279 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_layout_upgrade/ftl_layout_upgrade_ut.c
00:46:58.340 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/Makefile
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mempool.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 15 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mempool.c/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mempool.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 2267 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mempool.c/ftl_mempool_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mngt/
00:46:58.340 -rw-r--r-- vagrant/vagrant 12 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mngt/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mngt/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 29410 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_mngt/ftl_mngt_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_p2l.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_p2l.c/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 278 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_p2l.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 21724 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_p2l.c/ftl_p2l_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_sb/
00:46:58.340 -rw-r--r-- vagrant/vagrant 10 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_sb/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 279 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_sb/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 25146 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_sb/ftl_sb_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/common/
00:46:58.340 -rw-r--r-- vagrant/vagrant 4598 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/common/utils.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_band.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 12 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_band.c/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_band.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 15998 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_band.c/ftl_band_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_bitmap.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 14 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_bitmap.c/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_bitmap.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 5330 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_bitmap.c/ftl_bitmap_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_io.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 10 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_io.c/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 271 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_io.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 11055 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_io.c/ftl_io_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_l2p/
00:46:58.340 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_l2p/.gitignore
00:46:58.340 -rw-r--r-- vagrant/vagrant 316 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_l2p/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 2218 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ftl/ftl_l2p/ftl_l2p_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/notify/
00:46:58.340 -rw-r--r-- vagrant/vagrant 328 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/notify/Makefile
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/notify/notify.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/notify/notify.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 1876 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/notify/notify.c/notify_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/vhost/
00:46:58.340 -rw-r--r-- vagrant/vagrant 327 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/vhost/Makefile
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/vhost/vhost.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 295 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/vhost/vhost.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 23547 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/vhost/vhost.c/vhost_ut.c
00:46:58.340 -rw-r--r-- vagrant/vagrant 623 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 3807 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json_mock.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/
00:46:58.340 -rw-r--r-- vagrant/vagrant 338 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/Makefile
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/idxd.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/idxd.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 18232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/idxd.c/idxd_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/idxd_user.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/idxd_user.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 6305 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/idxd/idxd_user.c/idxd_user_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_io_msg.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_io_msg.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 5477 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_io_msg.c/nvme_io_msg_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_rdma.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_rdma.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 51879 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_rdma.c/nvme_rdma_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 234 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 16112 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns.c/nvme_ns_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_tcp.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 235 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_tcp.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 72854 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_tcp.c/nvme_tcp_ut.c
00:46:58.340 -rw-r--r-- vagrant/vagrant 649 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/Makefile
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns_cmd.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns_cmd.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 72741 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns_cmd.c/nvme_ns_cmd_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_transport.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 241 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_transport.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 7903 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_transport.c/nvme_transport_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns_ocssd_cmd.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 244 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns_ocssd_cmd.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 18381 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ns_ocssd_cmd.c/nvme_ns_ocssd_cmd_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 55234 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme.c/nvme_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_opal.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_opal.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 4404 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_opal.c/nvme_opal_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 105784 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr.c/nvme_ctrlr_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_pcie.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_pcie.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 37136 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_pcie.c/nvme_pcie_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr_cmd.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 241 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr_cmd.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 27703 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr_cmd.c/nvme_ctrlr_cmd_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_pcie_common.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 243 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_pcie_common.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 21507 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_pcie_common.c/nvme_pcie_common_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr_ocssd_cmd.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 247 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr_ocssd_cmd.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 3495 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_ctrlr_ocssd_cmd.c/nvme_ctrlr_ocssd_cmd_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_poll_group.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 242 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_poll_group.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 16255 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_poll_group.c/nvme_poll_group_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_cuse.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_cuse.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 19352 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_cuse.c/nvme_cuse_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_qpair.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_qpair.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 24127 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_qpair.c/nvme_qpair_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_fabric.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_fabric.c/Makefile
00:46:58.340 -rw-r--r-- vagrant/vagrant 13736 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_fabric.c/nvme_fabric_ut.c
00:46:58.340 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_quirks.c/
00:46:58.340 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_quirks.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 1603 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvme/nvme_quirks.c/nvme_quirks_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/
00:46:58.341 -rw-r--r-- vagrant/vagrant 337 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/Makefile
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/rpc.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/rpc.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 6141 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/rpc.c/rpc_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/subsystem.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/subsystem.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 6063 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/init/subsystem.c/subsystem_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/rdma.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 252 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/rdma.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 57864 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/rdma.c/rdma_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/subsystem.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 257 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/subsystem.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 75680 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/subsystem.c/subsystem_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/tcp.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 251 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/tcp.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 57976 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/tcp.c/tcp_ut.c
00:46:58.341 -rw-r--r-- vagrant/vagrant 509 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/Makefile
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/auth.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 260 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/auth.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 40150 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/auth.c/auth_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/transport.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 257 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/transport.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 14560 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/transport.c/transport_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 118546 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr.c/ctrlr_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/vfio_user.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 257 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/vfio_user.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 9770 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/vfio_user.c/vfio_user_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr_bdev.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr_bdev.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 32550 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr_bdev.c/ctrlr_bdev_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr_discovery.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr_discovery.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 35679 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/ctrlr_discovery.c/ctrlr_discovery_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/fc.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 983 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/fc.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 13210 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/fc.c/fc_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/fc_ls.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 496 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/fc_ls.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 26119 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/fc_ls.c/fc_ls_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/nvmf.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 252 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/nvmf.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 10585 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/nvmf/nvmf.c/nvmf_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/accel.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/accel.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 136182 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/accel.c/accel_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/dpdk_compressdev.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 273 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/dpdk_compressdev.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 35271 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/dpdk_compressdev.c/accel_dpdk_compressdev_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/dpdk_cryptodev.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 286 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/dpdk_cryptodev.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 58541 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/dpdk_cryptodev.c/accel_dpdk_cryptodev_ut.c
00:46:58.341 -rw-r--r-- vagrant/vagrant 479 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/accel/Makefile
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ioat/
00:46:58.341 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ioat/Makefile
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ioat/ioat.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ioat/ioat.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 3028 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/ioat/ioat.c/ioat_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rdma/
00:46:58.341 -rw-r--r-- vagrant/vagrant 339 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rdma/Makefile
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rdma/common.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 10 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rdma/common.c/.gitignore
00:46:58.341 -rw-r--r-- vagrant/vagrant 286 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rdma/common.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 5656 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rdma/common.c/common_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/nvme/
00:46:58.341 -rw-r--r-- vagrant/vagrant 334 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/nvme/Makefile
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/nvme/bdev_nvme.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/nvme/bdev_nvme.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 214403 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/nvme/bdev_nvme.c/bdev_nvme_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/part.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/part.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 12144 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/part.c/part_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/bdev_raid_sb.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 242 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/bdev_raid_sb.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 10267 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/bdev_raid_sb.c/bdev_raid_sb_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/concat.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 171 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/concat.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 13358 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/concat.c/concat_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid0.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 235 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid0.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 26173 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid0.c/raid0_ut.c
00:46:58.341 -rw-r--r-- vagrant/vagrant 409 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 7872 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/common.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid1.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 235 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid1.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 17820 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid1.c/raid1_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid5f.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid5f.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 34022 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/raid5f.c/raid5f_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/bdev_raid.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 239 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/bdev_raid.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 53496 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/raid/bdev_raid.c/bdev_raid_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/scsi_nvme.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 272 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/scsi_nvme.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 10496 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/scsi_nvme.c/scsi_nvme_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/bdev.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/bdev.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 244840 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/bdev.c/bdev_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/vbdev_lvol.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/vbdev_lvol.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 56068 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/vbdev_lvol.c/vbdev_lvol_ut.c
00:46:58.341 -rw-r--r-- vagrant/vagrant 536 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/Makefile
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/bdev_zone.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/bdev_zone.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 13079 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/bdev_zone.c/bdev_zone_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/vbdev_zone_block.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 243 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/vbdev_zone_block.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 44316 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/vbdev_zone_block.c/vbdev_zone_block_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/compress.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 259 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/compress.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 11303 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/compress.c/compress_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/crypto.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 257 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/crypto.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 19186 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/crypto.c/crypto_ut.c
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/gpt/
00:46:58.341 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/gpt/gpt.c/
00:46:58.341 -rw-r--r-- vagrant/vagrant 233 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/gpt/gpt.c/Makefile
00:46:58.341 -rw-r--r-- vagrant/vagrant 9462 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/gpt/gpt.c/gpt_ut.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 328 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/gpt/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/mt/
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/mt/bdev.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 234 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/mt/bdev.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 82067 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/mt/bdev.c/bdev_ut.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 329 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/bdev/mt/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/conn.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/conn.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 26977 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/conn.c/conn_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/init_grp.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 256 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/init_grp.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 13648 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/init_grp.c/init_grp_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/iscsi.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/iscsi.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 81751 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/iscsi.c/iscsi_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/param.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/param.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 10531 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/param.c/param_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/portal_grp.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 259 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/portal_grp.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 6612 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/portal_grp.c/portal_grp_ut.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 377 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 5958 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/common.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/tgt_node.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 256 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/tgt_node.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 22672 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/iscsi/tgt_node.c/tgt_node_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/reduce/
00:46:58.342 -rw-r--r-- vagrant/vagrant 328 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/reduce/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/reduce/reduce.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 262 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/reduce/reduce.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 59456 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/reduce/reduce.c/reduce_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 317945 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob.c/blob_ut.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 10566 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob.c/esnap_dev.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 1945 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob.c/ext_dev.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 1002 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 12408 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/bs_dev_common.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 1127 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/bs_scheduler.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob_bdev.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 247 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob_bdev.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 16416 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blob/blob_bdev.c/blob_bdev_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/
00:46:58.342 -rw-r--r-- vagrant/vagrant 357 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_parse.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_parse.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 26418 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_parse.c/json_parse_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_util.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_util.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 26200 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_util.c/json_util_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_write.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_write.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 19508 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/json/json_write.c/json_write_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rpc/
00:46:58.342 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rpc/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rpc/rpc.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 7 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rpc/rpc.c/.gitignore
00:46:58.342 -rw-r--r-- vagrant/vagrant 330 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rpc/rpc.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 8568 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/rpc/rpc.c/rpc_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/tree.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/tree.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 3623 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/tree.c/tree_ut.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 371 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_async_ut/
00:46:58.342 -rw-r--r-- vagrant/vagrant 260 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_async_ut/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 16616 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_async_ut/blobfs_async_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_bdev.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_bdev.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 6285 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_bdev.c/blobfs_bdev_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_sync_ut/
00:46:58.342 -rw-r--r-- vagrant/vagrant 259 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_sync_ut/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 15885 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/blobfs/blobfs_sync_ut/blobfs_sync_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/jsonrpc/
00:46:58.342 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/jsonrpc/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/jsonrpc/jsonrpc_server.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 262 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/jsonrpc/jsonrpc_server.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 10654 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/jsonrpc/jsonrpc_server.c/jsonrpc_server_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 650 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi.c/scsi_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi_bdev.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi_bdev.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 30724 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi_bdev.c/scsi_bdev_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi_pr.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 277 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi_pr.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 20293 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/scsi_pr.c/scsi_pr_ut.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/dev.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/dev.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 20261 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/dev.c/dev_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/lun.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/lun.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 21210 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/scsi/lun.c/lun_ut.c
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/dma/
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/dma/dma.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 7 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/dma/dma.c/.gitignore
00:46:58.342 -rw-r--r-- vagrant/vagrant 283 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/dma/dma.c/Makefile
00:46:58.342 -rw-r--r-- vagrant/vagrant 9540 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/dma/dma.c/dma_ut.c
00:46:58.342 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/dma/Makefile
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/keyring/
00:46:58.342 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/keyring/keyring.c/
00:46:58.342 -rw-r--r-- vagrant/vagrant 270 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/keyring/keyring.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 5840 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/keyring/keyring.c/keyring_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 322 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/keyring/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/posix.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/posix.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 4565 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/posix.c/posix_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/sock.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/sock.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 31737 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/sock.c/sock_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/uring.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/uring.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 7728 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/uring.c/uring_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/sock/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/env_dpdk/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/env_dpdk/pci_event.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/env_dpdk/pci_event.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 4133 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/env_dpdk/pci_event.c/pci_event_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 331 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/env_dpdk/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/log/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/log/log.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/log/log.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 6962 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/log/log.c/log_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/log/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/iobuf.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/iobuf.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 22563 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/iobuf.c/iobuf_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/thread.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 233 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/thread.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 51654 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/thread.c/thread_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/thread/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/app.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 311 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/app.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 5749 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/app.c/app_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/reactor.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/reactor.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 29244 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/reactor.c/reactor_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 335 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/event/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/lvol/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/lvol/lvol.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/lvol/lvol.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 101902 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/lvol/lvol.c/lvol_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 326 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/lvol/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/cpuset.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 233 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/cpuset.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 7266 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/cpuset.c/cpuset_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/xor.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/xor.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 2588 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/xor.c/xor_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc16.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc16.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 1478 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc16.c/crc16_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc32_ieee.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 237 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc32_ieee.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 751 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc32_ieee.c/crc32_ieee_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc32c.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 233 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc32c.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 4473 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc32c.c/crc32c_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc64.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc64.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 1628 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/crc64.c/crc64_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/dif.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/dif.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 131329 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/dif.c/dif_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/iov.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/iov.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 8126 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/iov.c/iov_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/math.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 274 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/math.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 2067 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/math.c/math_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/base64.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 233 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/base64.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 10900 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/base64.c/base64_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/pipe.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 231 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/pipe.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 15289 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/pipe.c/pipe_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/bit_array.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/bit_array.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 9561 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/bit_array.c/bit_array_ut.c
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/string.c/
00:46:58.343 -rw-r--r-- vagrant/vagrant 233 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/string.c/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 14277 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/string.c/string_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 432 2024-06-07 12:49 spdk-test_gen_spec/test/unit/lib/util/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/include/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/include/spdk/
00:46:58.343 -rw-r--r-- vagrant/vagrant 336 2024-06-07 12:49 spdk-test_gen_spec/test/unit/include/spdk/Makefile
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/test/unit/include/spdk/histogram_data.h/
00:46:58.343 -rw-r--r-- vagrant/vagrant 272 2024-06-07 12:49 spdk-test_gen_spec/test/unit/include/spdk/histogram_data.h/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 3020 2024-06-07 12:49 spdk-test_gen_spec/test/unit/include/spdk/histogram_data.h/histogram_ut.c
00:46:58.343 -rw-r--r-- vagrant/vagrant 321 2024-06-07 12:49 spdk-test_gen_spec/test/unit/include/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 5 2024-06-07 12:49 spdk-test_gen_spec/test/unit/.gitignore
00:46:58.343 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/test/unit/Makefile
00:46:58.343 -rwxr-xr-x vagrant/vagrant 12883 2024-06-07 12:49 spdk-test_gen_spec/test/unit/unittest.sh
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/go/
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/
00:46:58.343 -rw-r--r-- vagrant/vagrant 1598 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/go.sum
00:46:58.343 -rw-r--r-- vagrant/vagrant 1465 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/LICENSE
00:46:58.343 -rw-r--r-- vagrant/vagrant 755 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 2926 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/README.md
00:46:58.343 -rw-r--r-- vagrant/vagrant 2963 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/clientIntegration.go
00:46:58.343 -rw-r--r-- vagrant/vagrant 280 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/go.mod
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/client/
00:46:58.343 -rw-r--r-- vagrant/vagrant 6126 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/client/client.go
00:46:58.343 -rw-r--r-- vagrant/vagrant 1428 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/client/client_test.go
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/mocks/
00:46:58.343 -rw-r--r-- vagrant/vagrant 2432 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/mocks/IClient.go
00:46:58.343 -rw-r--r-- vagrant/vagrant 47 2024-06-07 12:49 spdk-test_gen_spec/go/rpc/mocks/boilerplate.txt
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/licenses/
00:46:58.343 -rw-r--r-- vagrant/vagrant 1265 2024-06-07 12:49 spdk-test_gen_spec/licenses/bsd-2-clause.txt
00:46:58.343 -rw-r--r-- vagrant/vagrant 1428 2024-06-07 12:49 spdk-test_gen_spec/licenses/bsd-3-clause.txt
00:46:58.343 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/vfiouserbuild/
00:46:58.343 -rw-r--r-- vagrant/vagrant 1060 2024-06-07 12:49 spdk-test_gen_spec/vfiouserbuild/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 5440 2024-06-07 12:49 spdk-test_gen_spec/CODE_OF_CONDUCT.md
00:46:58.343 -rw-r--r-- vagrant/vagrant 5239 2024-06-07 12:49 spdk-test_gen_spec/CONFIG
00:46:58.343 -rw-r--r-- vagrant/vagrant 135 2024-06-07 12:49 spdk-test_gen_spec/CONTRIBUTING.md
00:46:58.343 -rw-r--r-- vagrant/vagrant 1086 2024-06-07 12:49 spdk-test_gen_spec/LICENSE
00:46:58.343 -rwxr-xr-x vagrant/vagrant 1558 2024-06-07 12:49 spdk-test_gen_spec/autobuild.sh
00:46:58.343 -rwxr-xr-x vagrant/vagrant 1217 2024-06-07 12:49 spdk-test_gen_spec/autopackage.sh
00:46:58.343 -rwxr-xr-x vagrant/vagrant 687 2024-06-07 12:49 spdk-test_gen_spec/autorun.sh
00:46:58.343 -rwxr-xr-x vagrant/vagrant 8621 2024-06-07 12:49 spdk-test_gen_spec/autorun_post.py
00:46:58.343 -rwxr-xr-x vagrant/vagrant 13602 2024-06-07 12:49 spdk-test_gen_spec/autotest.sh
00:46:58.343 -rwxr-xr-x vagrant/vagrant 46432 2024-06-07 12:49 spdk-test_gen_spec/configure
00:46:58.343 -rw-r--r-- vagrant/vagrant 3069 2024-06-07 12:49 spdk-test_gen_spec/deprecation.md
00:46:58.343 -rw-r--r-- vagrant/vagrant 48 2024-06-07 12:49 spdk-test_gen_spec/.run_test_name
00:46:58.343 -rw-r--r-- vagrant/vagrant 246 2024-06-07 12:49 spdk-test_gen_spec/mdl_rules.rb
00:46:58.343 -rw-r--r-- vagrant/vagrant 769 2024-06-07 12:49 spdk-test_gen_spec/.astylerc
00:46:58.343 -rw-r--r-- vagrant/vagrant 454 2024-06-07 12:49 spdk-test_gen_spec/.gitignore
00:46:58.343 -rw-r--r-- vagrant/vagrant 590 2024-06-07 12:49 spdk-test_gen_spec/.gitmodules
00:46:58.343 -rw-r--r-- vagrant/vagrant 198348 2024-06-07 12:49 spdk-test_gen_spec/CHANGELOG.md
00:46:58.343 -rw-r--r-- vagrant/vagrant 3252 2024-06-07 12:49 spdk-test_gen_spec/Makefile
00:46:58.343 -rw-r--r-- vagrant/vagrant 9559 2024-06-07 12:49 spdk-test_gen_spec/README.md
00:46:58.343 -rw-r--r-- vagrant/vagrant 212 2024-06-07 12:49 spdk-test_gen_spec/SECURITY.md
00:46:58.344 -rw-r--r-- vagrant/vagrant 6350 2024-06-07 12:49 spdk-test_gen_spec/.spdk-isal.log
00:46:58.344 -rw-r--r-- vagrant/vagrant 6356 2024-06-07 12:49 spdk-test_gen_spec/.spdk-isal-crypto.log
00:46:58.344 -rw-r--r-- vagrant/vagrant 49 2024-06-07 12:49 spdk-test_gen_spec/.coredump_path
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/
00:46:58.344 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/applypatch-msg.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/commit-msg.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/fsmonitor-watchman.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/post-update.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/pre-applypatch.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/pre-commit.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/pre-merge-commit.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/pre-push.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/pre-rebase.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/pre-receive.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/prepare-commit-msg.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/push-to-checkout.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/sendemail-validate.sample
00:46:58.344 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/hooks/update.sample
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/info/
00:46:58.344 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/info/exclude
00:46:58.344 -rw-r--r-- vagrant/vagrant 6820 2024-06-07 12:49 spdk-test_gen_spec/.git/info/refs
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/logs/
00:46:58.344 -rw-r--r-- vagrant/vagrant 208 2024-06-07 12:49 spdk-test_gen_spec/.git/logs/HEAD
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/logs/refs/
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/logs/refs/remotes/
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/logs/refs/remotes/origin/
00:46:58.344 -rw-r--r-- vagrant/vagrant 290 2024-06-07 12:49 spdk-test_gen_spec/.git/logs/refs/remotes/origin/master
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/
00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/
00:46:58.344 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/HEAD
00:46:58.344 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/description
00:46:58.344 -rw-r--r-- vagrant/vagrant 95538 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/index
00:46:58.344 -rw-r--r-- vagrant/vagrant 1477 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/packed-refs 00:46:58.344 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/config 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/branches/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/ 00:46:58.344 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/applypatch-msg.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/commit-msg.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/fsmonitor-watchman.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/post-update.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/pre-applypatch.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/pre-commit.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/pre-merge-commit.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/pre-push.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/pre-rebase.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/pre-receive.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/prepare-commit-msg.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/push-to-checkout.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/sendemail-validate.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/hooks/update.sample 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/info/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/info/exclude 00:46:58.344 -rw-r--r-- vagrant/vagrant 1556 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/info/refs 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 398 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/HEAD 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/refs/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/refs/remotes/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/refs/remotes/origin/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 190 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/refs/remotes/origin/HEAD 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/refs/heads/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 190 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/logs/refs/heads/master 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/objects/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/objects/info/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/objects/info/packs 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/objects/pack/ 00:46:58.344 -r--r--r-- vagrant/vagrant 698216 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/objects/pack/pack-18df85e9e035558129a2f8dd066db948b4c38a82.idx 00:46:58.344 -r--r--r-- vagrant/vagrant 11917535 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/objects/pack/pack-18df85e9e035558129a2f8dd066db948b4c38a82.pack 00:46:58.344 -r--r--r-- vagrant/vagrant 99644 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/objects/pack/pack-18df85e9e035558129a2f8dd066db948b4c38a82.rev 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/refs/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/refs/heads/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/refs/heads/master 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/refs/remotes/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/refs/remotes/origin/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 32 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/refs/remotes/origin/HEAD 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/intel-ipsec-mb/refs/tags/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/info/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/info/exclude 00:46:58.344 -rw-r--r-- vagrant/vagrant 1389 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/info/refs 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/HEAD 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/refs/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/refs/heads/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 185 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/refs/heads/master 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/refs/remotes/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/refs/remotes/origin/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 185 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/logs/refs/remotes/origin/HEAD 00:46:58.344 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 
spdk-test_gen_spec/.git/modules/isa-l-crypto/HEAD 00:46:58.344 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/description 00:46:58.344 -rw-r--r-- vagrant/vagrant 51640 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/index 00:46:58.344 -rw-r--r-- vagrant/vagrant 1170 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/packed-refs 00:46:58.344 -rw-r--r-- vagrant/vagrant 296 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/config 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/objects/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/objects/info/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/objects/info/packs 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/objects/pack/ 00:46:58.344 -r--r--r-- vagrant/vagrant 161232 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/objects/pack/pack-6950a142dee7e3dc0b131edbc69e289de7c44fc2.idx 00:46:58.344 -r--r--r-- vagrant/vagrant 2463307 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/objects/pack/pack-6950a142dee7e3dc0b131edbc69e289de7c44fc2.pack 00:46:58.344 -r--r--r-- vagrant/vagrant 22932 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/objects/pack/pack-6950a142dee7e3dc0b131edbc69e289de7c44fc2.rev 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/refs/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/refs/heads/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/refs/heads/master 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/refs/remotes/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/refs/remotes/origin/ 00:46:58.344 -rw-r--r-- vagrant/vagrant 32 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/refs/remotes/origin/HEAD 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/refs/tags/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/branches/ 00:46:58.344 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/ 00:46:58.344 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/applypatch-msg.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/commit-msg.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/fsmonitor-watchman.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/post-update.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/pre-applypatch.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/pre-commit.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/pre-merge-commit.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 1378 
2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/pre-push.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/pre-rebase.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/pre-receive.sample 00:46:58.344 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/prepare-commit-msg.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/push-to-checkout.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/sendemail-validate.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l-crypto/hooks/update.sample 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/HEAD 00:46:58.345 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/description 00:46:58.345 -rw-r--r-- vagrant/vagrant 42536 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/index 00:46:58.345 -rw-r--r-- vagrant/vagrant 1471 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/packed-refs 00:46:58.345 -rw-r--r-- vagrant/vagrant 285 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/config 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/branches/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/ 00:46:58.345 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/applypatch-msg.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/commit-msg.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/fsmonitor-watchman.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/post-update.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/pre-applypatch.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/pre-commit.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/pre-merge-commit.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/pre-push.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/pre-rebase.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/pre-receive.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/prepare-commit-msg.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/push-to-checkout.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/sendemail-validate.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/hooks/update.sample 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 
2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/info/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/info/exclude 00:46:58.345 -rw-r--r-- vagrant/vagrant 1750 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/info/refs 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/refs/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/refs/heads/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 181 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/refs/heads/master 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/refs/remotes/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/refs/remotes/origin/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 181 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/refs/remotes/origin/HEAD 00:46:58.345 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/logs/HEAD 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/objects/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/objects/info/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/objects/info/packs 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/objects/pack/ 00:46:58.345 -r--r--r-- vagrant/vagrant 142556 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/objects/pack/pack-eb55c77d536ae9e76d7c7a3ace88d6bbdcfa0585.idx 00:46:58.345 -r--r--r-- vagrant/vagrant 2426726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/objects/pack/pack-eb55c77d536ae9e76d7c7a3ace88d6bbdcfa0585.pack 00:46:58.345 -r--r--r-- vagrant/vagrant 20264 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/objects/pack/pack-eb55c77d536ae9e76d7c7a3ace88d6bbdcfa0585.rev 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/refs/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/refs/heads/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/refs/heads/master 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/refs/remotes/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/refs/remotes/origin/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 32 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/refs/remotes/origin/HEAD 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/isa-l/refs/tags/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/branches/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/ 00:46:58.345 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/applypatch-msg.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/commit-msg.sample 00:46:58.345 -rwxr-xr-x 
vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/fsmonitor-watchman.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/post-update.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/pre-applypatch.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/pre-commit.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/pre-merge-commit.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/pre-push.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/pre-rebase.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/pre-receive.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/prepare-commit-msg.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/push-to-checkout.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/sendemail-validate.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/hooks/update.sample 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/info/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/info/exclude 00:46:58.345 -rw-r--r-- vagrant/vagrant 4173 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/info/refs 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 399 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/HEAD 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/refs/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/refs/heads/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 191 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/refs/heads/master 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/refs/remotes/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/refs/remotes/origin/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 191 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/logs/refs/remotes/origin/HEAD 00:46:58.345 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/HEAD 00:46:58.345 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/description 00:46:58.345 -rw-r--r-- vagrant/vagrant 9003 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/index 00:46:58.345 -rw-r--r-- vagrant/vagrant 4077 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/packed-refs 00:46:58.345 -rw-r--r-- vagrant/vagrant 302 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/config 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/.git/modules/libvfio-user/objects/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/objects/info/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/objects/info/packs 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/objects/pack/ 00:46:58.345 -r--r--r-- vagrant/vagrant 170500 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/objects/pack/pack-a9fcd6df400c47fd66450027f57b6ab2495a44b1.idx 00:46:58.345 -r--r--r-- vagrant/vagrant 2095244 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/objects/pack/pack-a9fcd6df400c47fd66450027f57b6ab2495a44b1.pack 00:46:58.345 -r--r--r-- vagrant/vagrant 24256 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/objects/pack/pack-a9fcd6df400c47fd66450027f57b6ab2495a44b1.rev 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/refs/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/refs/heads/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/refs/heads/master 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/refs/remotes/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/refs/remotes/origin/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 32 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/refs/remotes/origin/HEAD 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/libvfio-user/refs/tags/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/objects/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/objects/info/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/objects/info/packs 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/objects/pack/ 00:46:58.345 -r--r--r-- vagrant/vagrant 382208 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/objects/pack/pack-ddb3b16f393ad9e3cd60c13a9625067b21a1c871.idx 00:46:58.345 -r--r--r-- vagrant/vagrant 3221084 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/objects/pack/pack-ddb3b16f393ad9e3cd60c13a9625067b21a1c871.pack 00:46:58.345 -r--r--r-- vagrant/vagrant 54500 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/objects/pack/pack-ddb3b16f393ad9e3cd60c13a9625067b21a1c871.rev 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/refs/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/refs/tags/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/refs/heads/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/refs/heads/master 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/refs/remotes/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/refs/remotes/origin/ 00:46:58.345 -rw-r--r-- vagrant/vagrant 32 2024-06-07 12:49 
spdk-test_gen_spec/.git/modules/ocf/refs/remotes/origin/HEAD 00:46:58.345 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/HEAD 00:46:58.345 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/description 00:46:58.345 -rw-r--r-- vagrant/vagrant 34777 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/index 00:46:58.345 -rw-r--r-- vagrant/vagrant 4368 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/packed-refs 00:46:58.345 -rw-r--r-- vagrant/vagrant 285 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/config 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/branches/ 00:46:58.345 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/ 00:46:58.345 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/applypatch-msg.sample 00:46:58.345 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/commit-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/fsmonitor-watchman.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/post-update.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/pre-applypatch.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/pre-commit.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/pre-merge-commit.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/pre-push.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/pre-rebase.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/pre-receive.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/prepare-commit-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/push-to-checkout.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/sendemail-validate.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/hooks/update.sample 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/info/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/info/exclude 00:46:58.346 -rw-r--r-- vagrant/vagrant 4447 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/info/refs 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/refs/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/refs/heads/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 183 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/refs/heads/master 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/refs/remotes/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/refs/remotes/origin/ 00:46:58.346 -rw-r--r-- 
vagrant/vagrant 183 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/refs/remotes/origin/HEAD 00:46:58.346 -rw-r--r-- vagrant/vagrant 391 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/ocf/logs/HEAD 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/branches/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/ 00:46:58.346 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/applypatch-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/commit-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/fsmonitor-watchman.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/post-update.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/pre-applypatch.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/pre-commit.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/pre-merge-commit.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/pre-push.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/pre-rebase.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/pre-receive.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/prepare-commit-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/push-to-checkout.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/sendemail-validate.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/hooks/update.sample 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/info/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/info/exclude 00:46:58.346 -rw-r--r-- vagrant/vagrant 2661 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/info/refs 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 388 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/HEAD 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/refs/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/refs/remotes/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/refs/remotes/origin/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 182 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/refs/remotes/origin/HEAD 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/refs/heads/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 182 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/logs/refs/heads/main 00:46:58.346 
-rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/HEAD 00:46:58.346 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/description 00:46:58.346 -rw-r--r-- vagrant/vagrant 93575 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/index 00:46:58.346 -rw-r--r-- vagrant/vagrant 2584 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/packed-refs 00:46:58.346 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/config 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/objects/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/objects/info/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/objects/info/packs 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/objects/pack/ 00:46:58.346 -r--r--r-- vagrant/vagrant 449828 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/objects/pack/pack-2285825a4d37aca51fdff4e30e9bd1affd1ca3cb.idx 00:46:58.346 -r--r--r-- vagrant/vagrant 11216897 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/objects/pack/pack-2285825a4d37aca51fdff4e30e9bd1affd1ca3cb.pack 00:46:58.346 -r--r--r-- vagrant/vagrant 64160 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/objects/pack/pack-2285825a4d37aca51fdff4e30e9bd1affd1ca3cb.rev 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/refs/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/refs/heads/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/refs/heads/main 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/refs/remotes/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/refs/remotes/origin/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 30 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/refs/remotes/origin/HEAD 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/xnvme/refs/tags/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/objects/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/objects/info/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/objects/info/packs 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/objects/pack/ 00:46:58.346 -r--r--r-- vagrant/vagrant 10508716 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/objects/pack/pack-95c707d76b33c8444dd90aadddff292963c09f63.idx 00:46:58.346 -r--r--r-- vagrant/vagrant 115310303 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/objects/pack/pack-95c707d76b33c8444dd90aadddff292963c09f63.pack 00:46:58.346 -r--r--r-- vagrant/vagrant 1501144 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/objects/pack/pack-95c707d76b33c8444dd90aadddff292963c09f63.rev 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/refs/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/refs/tags/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 
12:49 spdk-test_gen_spec/.git/modules/dpdk/refs/heads/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/refs/heads/spdk-21.05 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/refs/remotes/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/refs/remotes/origin/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 36 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/refs/remotes/origin/HEAD 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/branches/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/ 00:46:58.346 -rwxr-xr-x vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/applypatch-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 900 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/commit-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 4726 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/fsmonitor-watchman.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/post-update.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/pre-applypatch.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/pre-commit.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 420 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/pre-merge-commit.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1378 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/pre-push.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 4902 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/pre-rebase.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/pre-receive.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 1496 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/prepare-commit-msg.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 2787 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/push-to-checkout.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 2312 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/sendemail-validate.sample 00:46:58.346 -rwxr-xr-x vagrant/vagrant 3654 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/hooks/update.sample 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/info/ 00:46:58.346 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/info/exclude 00:46:58.346 -rw-r--r-- vagrant/vagrant 19669 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/info/refs 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/refs/ 00:46:58.346 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/refs/heads/ 00:46:58.347 -rw-r--r-- vagrant/vagrant 180 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/refs/heads/spdk-21.05 00:46:58.347 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/refs/remotes/ 00:46:58.347 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/refs/remotes/origin/ 00:46:58.347 -rw-r--r-- vagrant/vagrant 180 2024-06-07 
12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/refs/remotes/origin/HEAD 00:46:58.347 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/logs/HEAD 00:46:58.347 -rw-r--r-- vagrant/vagrant 291 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/config 00:46:58.347 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/HEAD 00:46:58.347 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/description 00:46:58.347 -rw-r--r-- vagrant/vagrant 623208 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/index 00:46:58.347 -rw-r--r-- vagrant/vagrant 16633 2024-06-07 12:49 spdk-test_gen_spec/.git/modules/dpdk/packed-refs 00:46:58.347 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/objects/ 00:46:58.347 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/objects/info/ 00:46:58.347 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/.git/objects/info/packs 00:46:58.347 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/objects/pack/ 00:46:58.347 -r--r--r-- vagrant/vagrant 5469696 2024-06-07 12:49 spdk-test_gen_spec/.git/objects/pack/pack-0b4c2dbbad586fc8eb6b2aee062558790c0d3680.idx 00:46:58.347 -r--r--r-- vagrant/vagrant 66887676 2024-06-07 12:49 spdk-test_gen_spec/.git/objects/pack/pack-0b4c2dbbad586fc8eb6b2aee062558790c0d3680.pack 00:46:58.922 -r--r--r-- vagrant/vagrant 781284 2024-06-07 12:49 spdk-test_gen_spec/.git/objects/pack/pack-0b4c2dbbad586fc8eb6b2aee062558790c0d3680.rev 00:46:58.922 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/ 00:46:58.922 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/heads/ 00:46:58.922 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/remotes/ 00:46:58.922 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/remotes/origin/ 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/remotes/origin/master 00:46:58.922 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/ 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/LTS 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v1.0.0 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v1.2.0 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v16.06 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v16.08 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v16.12 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v17.03 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v17.07 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v17.07.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v17.10 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v17.10.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.01 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.01.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 
spdk-test_gen_spec/.git/refs/tags/v18.04 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.04.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.07 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.07.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.10 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.10.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v18.10.2 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.01 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.01.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.04 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.04.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.07 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.07.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.10 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.10-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v19.10.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.01 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.01-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.01.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.01.2 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.04 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.04-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.04.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.07 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.07-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.10 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v20.10-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.01 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.01-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.01.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.04 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.04-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.07 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.07-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.10 00:46:58.922 -rw-r--r-- 
vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v21.10-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.01 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.01-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.01.1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.01.2 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.05 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.05-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.09 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.09-pre 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v22.09-rc1 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.01 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.01-pre 00:46:58.922 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.01-rc1 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.01.1 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.05 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.05-pre 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.05-rc1 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.09 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.09-pre 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v23.09-rc1 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v24.01 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v24.01-pre 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v24.01-rc1 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v24.05 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v24.05-pre 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v24.05-rc1 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/refs/tags/v24.09-pre 00:46:58.923 -rw-r--r-- vagrant/vagrant 8810 2024-06-07 12:49 spdk-test_gen_spec/.git/FETCH_HEAD 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/HEAD 00:46:58.923 -rw-r--r-- vagrant/vagrant 841 2024-06-07 12:49 spdk-test_gen_spec/.git/config 00:46:58.923 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/.git/description 00:46:58.923 -rw-r--r-- vagrant/vagrant 213980 2024-06-07 12:49 spdk-test_gen_spec/.git/index 00:46:58.923 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/.git/ORIG_HEAD 00:46:58.923 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.git/branches/ 00:46:58.923 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/include/ 00:46:58.923 drwxr-xr-x vagrant/vagrant 
0 2024-06-07 12:49 spdk-test_gen_spec/include/linux/ 00:46:58.923 -rw-r--r-- vagrant/vagrant 2159 2024-06-07 12:49 spdk-test_gen_spec/include/linux/virtio_types.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 6802 2024-06-07 12:49 spdk-test_gen_spec/include/linux/virtio_blk.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 3836 2024-06-07 12:49 spdk-test_gen_spec/include/linux/virtio_config.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 7079 2024-06-07 12:49 spdk-test_gen_spec/include/linux/virtio_pci.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 7492 2024-06-07 12:49 spdk-test_gen_spec/include/linux/virtio_ring.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 6035 2024-06-07 12:49 spdk-test_gen_spec/include/linux/virtio_scsi.h 00:46:58.923 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/ 00:46:58.923 -rw-r--r-- vagrant/vagrant 88631 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/bdev.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 60449 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/bdev_module.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 10589 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/bdev_zone.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 4712 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/bit_array.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 4441 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/bit_pool.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 43791 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/blob.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 2974 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/blob_bdev.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 17337 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/blobfs.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1995 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/blobfs_bdev.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 5213 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/conf.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 3838 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/cpuset.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1056 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/crc16.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1531 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/crc32.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 655 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/crc64.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 16336 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/dif.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 16143 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/dma.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 2977 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/endian.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 44731 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/env.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1489 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/env_dpdk.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 10226 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/event.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 580 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/fd.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 4798 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/fd_group.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 861 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/file.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 11324 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvmf_fc_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 24564 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvmf_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 22456 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvmf_transport.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 3903 
2024-06-07 12:49 spdk-test_gen_spec/include/spdk/opal.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 8629 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/opal_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 35350 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/accel.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 12491 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/accel_module.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 809 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/assert.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 3504 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/barrier.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 3877 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/base64.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 11019 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/ftl.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 3220 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/gpt_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 787 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/hexlify.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 6843 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/histogram_data.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 17270 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/idxd.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 16096 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/idxd_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 4874 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/init.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 7252 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/ioat.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 7074 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/ioat_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 12747 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/iscsi_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 13865 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/json.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 11807 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/jsonrpc.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 3428 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/keyring.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 2838 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/keyring_module.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/likely.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 12161 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/log.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 14889 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/lvol.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 941 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/memory.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1948 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/mmio.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1819 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nbd.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 2397 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/notify.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 171969 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvme.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 5705 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvme_intel.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 7952 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvme_ocssd.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 8277 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvme_ocssd_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 119470 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvme_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 16392 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvme_zns.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 50921 
2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvmf.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 8401 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/nvmf_cmd.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 4009 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/pci_ids.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 5407 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/pipe.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1269 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/queue.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 13600 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/queue_extras.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 8367 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/reduce.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 4810 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/rpc.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 6798 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/scheduler.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 17029 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/scsi.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 21084 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/scsi_spec.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 20347 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/sock.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 1667 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/stdinc.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 9601 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/string.h 00:46:58.923 -rw-r--r-- vagrant/vagrant 38405 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/thread.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 15247 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/trace.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 3616 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/trace_parser.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 28846 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/tree.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 809 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/ublk.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 9826 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/util.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 2612 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/uuid.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 2286 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/version.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 903 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/vfio_user_pci.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 2758 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/vfio_user_spec.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 9060 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/vfu_target.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 11516 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/vhost.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 2809 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/vmd.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 826 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/xor.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 1149 2024-06-07 12:49 spdk-test_gen_spec/include/spdk/zipf.h 00:46:58.924 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/ 00:46:58.924 -rw-r--r-- vagrant/vagrant 3961 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/lvolstore.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 2323 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/mlx5.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/uring.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 1866 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/usdt.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 5625 
2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/utf.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 2868 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/vhost_user.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 15865 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/virtio.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 1651 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/init.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 517 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/assert.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 1719 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/cunit.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 3780 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/event.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 933 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/idxd.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 4951 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/mock.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 3062 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/nvme.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 24261 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/nvme_tcp.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 9038 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/rdma.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 2037 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/sgl.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 11071 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/sock.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 1689 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/thread.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 9156 2024-06-07 12:49 spdk-test_gen_spec/include/spdk_internal/trace_defs.h 00:46:58.924 -rw-r--r-- vagrant/vagrant 684 2024-06-07 12:49 spdk-test_gen_spec/include/Makefile 00:46:58.924 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/mk/ 00:46:58.924 -rw-r--r-- vagrant/vagrant 348 2024-06-07 12:49 spdk-test_gen_spec/mk/nvme.libtest.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 971 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.app.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 970 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.app_cxx.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 1112 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.app_vars.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 16816 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.common.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.deps.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 1771 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.fio.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 2676 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.lib.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 6319 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.lib_deps.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 738 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.mock.unittest.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 4053 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.modules.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.subdirs.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 1256 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk.unittest.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 120 2024-06-07 12:49 spdk-test_gen_spec/mk/spdk_blank.map 00:46:58.924 -rw-r--r-- vagrant/vagrant 202 2024-06-07 12:49 spdk-test_gen_spec/mk/cc.mk 00:46:58.924 -rw-r--r-- vagrant/vagrant 5391 2024-06-07 12:49 spdk-test_gen_spec/mk/config.mk 00:46:58.924 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/xnvme/ 00:46:58.924 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/ 00:46:58.924 -rw-r--r-- vagrant/vagrant 3018 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/meson.build 00:46:58.924 -rw-r--r-- vagrant/vagrant 5978 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_adm.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 18122 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 3857 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_cbi_admin_shim.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 5311 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_cbi_async_emu.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 2397 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_cbi_async_nil.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 5203 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_cbi_async_posix.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 10410 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_cbi_async_thrpool.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 1538 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_cbi_mem_posix.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 3536 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_cbi_sync_psync.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 2653 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_fbsd.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 8553 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_fbsd_async.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 5387 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_fbsd_dev.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 3338 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_fbsd_nvme.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 5185 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 7155 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux_async_libaio.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 11109 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux_async_liburing.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 6756 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux_async_ucmd.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 15997 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux_block.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 8256 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux_dev.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 4634 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux_hugepage.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 10455 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_linux_nvme.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 1909 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_macos.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 2297 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_macos_admin.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 7219 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_macos_dev.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 2775 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_macos_sync.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 5435 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_nosys.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 2430 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_ramdisk.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 3637 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_ramdisk_admin.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 2506 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_ramdisk_dev.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 
3275 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_ramdisk_sync.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 1498 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_spdk.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 3099 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_spdk_admin.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 4818 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_spdk_async.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 23194 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_spdk_dev.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 1753 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_spdk_mem.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 2942 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_spdk_sync.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 1560 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_vfio.c 00:46:58.924 -rw-r--r-- vagrant/vagrant 1152 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_vfio_admin.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 3946 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_vfio_async.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 5998 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_vfio_dev.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 3640 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_vfio_mem.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 1892 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_vfio_sync.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 5152 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 6180 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_async_iocp.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 8101 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_async_iocp_th.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 6299 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_async_ioring.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 4186 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_block.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 11146 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_dev.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 7234 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_fs.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 3045 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_mem.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 10484 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_be_windows_nvme.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 5521 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_buf.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 46206 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_cli.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 2975 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_cmd.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 3856 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_dev.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 1380 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_file.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 2275 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_geo.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 1763 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_ident.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 2726 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_kvs.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 2639 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_lba.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 1532 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_libconf.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 2484 
2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_libconf_entries.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 5121 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_nvm.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 3661 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_opts.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 3349 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_queue.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 100 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_req.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 45252 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_spec.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 12493 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_spec_pp.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 2376 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_topology.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 744 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_ver.c 00:46:58.925 -rw-r--r-- vagrant/vagrant 15952 2024-06-07 12:49 spdk-test_gen_spec/xnvme/lib/xnvme_znd.c 00:46:58.925 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/ 00:46:58.925 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/ 00:46:58.925 -rw-r--r-- vagrant/vagrant 1051 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs-delete.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 952 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs-enum.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1047 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs-exist.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1047 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs-idfy-ns.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1043 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs-list.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1120 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs-retrieve.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1301 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs-store.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1015 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/kvs.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1474 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-dir-receive.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1639 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-dir-send.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 935 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-enum.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1339 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-idfy.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1214 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-info.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1586 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-read.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1793 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-write-dir.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1612 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-write-uncor.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1604 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-write-zeros.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1588 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk-write.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1303 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/lblk.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 2543 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/meson.build 00:46:58.925 -rw-r--r-- vagrant/vagrant 974 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xdd-async.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 904 
2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xdd-sync.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 807 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xdd.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1180 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-ctrlr-reset.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1560 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-dsm.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 903 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-enum.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1302 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-fdp-ruhs.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1360 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-fdp-ruhu.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1560 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-feature-get.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1560 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-feature-set.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1650 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-format.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1297 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-idfy-cs.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1255 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-idfy-ctrlr.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1323 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-idfy-ns.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1560 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-idfy.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1214 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-info.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 503 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-library-info.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 893 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-list.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1367 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-log-erri.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1424 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-log-fdp-config.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1473 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-log-fdp-events.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1372 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-log-fdp-stats.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1317 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-log-health.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1412 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-log-ruhu.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1571 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-log.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1188 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-ns-rescan.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 2502 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-padc.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 2496 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-pioc.1 00:46:58.925 -rw-r--r-- vagrant/vagrant 1160 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-sanitize.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1445 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-set-fdp-events.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1184 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-show-regs.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1186 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme-subsystem-reset.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 2375 2024-06-07 12:49 
spdk-test_gen_spec/xnvme/man/tools/xnvme.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 759 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-copy-async.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 787 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-copy-sync.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1138 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-dump-async-iovec.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 852 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-dump-async.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1024 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-dump-sync-iovec.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 742 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-dump-sync.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 745 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-load-async.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 675 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-load-sync.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 604 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file-write-read.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1326 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/xnvme_file.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1572 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-append.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1251 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-changes.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 921 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-enum.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1309 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-errors.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1228 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-idfy-ctrlr.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1220 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-idfy-ns.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1170 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-info.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1416 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-mgmt-close.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1420 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-mgmt-finish.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1412 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-mgmt-open.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1416 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-mgmt-reset.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1517 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-mgmt.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1566 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-read.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1434 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-report.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1566 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned-write.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1460 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tools/zoned.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/README.rst 00:46:58.926 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/ 00:46:58.926 -rw-r--r-- vagrant/vagrant 553 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/meson.build 00:46:58.926 -rw-r--r-- vagrant/vagrant 1170 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/xnvme_hello-hw.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 673 2024-06-07 12:49 
spdk-test_gen_spec/xnvme/man/examples/xnvme_hello.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1664 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/xnvme_io_async-read.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1607 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/xnvme_io_async-write.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 757 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/xnvme_io_async.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1647 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_async-append.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1655 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_async-read.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1643 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_async-write.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 854 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_async.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1476 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_sync-append.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1466 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_sync-read.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1472 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_sync-write.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 818 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/examples/zoned_io_sync.1 00:46:58.926 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/ 00:46:58.926 -rw-r--r-- vagrant/vagrant 2257 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/meson.build 00:46:58.926 -rw-r--r-- vagrant/vagrant 1585 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_async_intf-init_term.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 748 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_async_intf.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1336 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_buf-buf_alloc_free.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1346 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_buf-buf_virt_alloc_free.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 857 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_buf.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1257 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_cli-optional.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 707 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_cli.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 603 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_enum-backend.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 994 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_enum-multi.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1147 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_enum-open.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1002 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_enum.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1392 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_ioworker-verify-sync.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1469 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_ioworker-verify.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 800 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_ioworker.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1435 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_kvs-kvs_io.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 732 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_kvs.1 
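The xnvme_tests_* pages above document the C verification binaries whose sources appear under spdk-test_gen_spec/xnvme/tests/ later in this listing; the cijoe suite (spdk-test_gen_spec/xnvme/cijoe/tests/logic/) drives those binaries as pytest cases. As a rough sketch of that pattern only — the device URI and exact CLI shape below are assumptions, not taken from this log — such a wrapper stays minimal:

import shutil
import subprocess

import pytest

DEV_URI = "/dev/nvme0n1"  # assumed test target; adjust for the machine under test


@pytest.mark.skipif(shutil.which("xnvme_tests_kvs") is None,
                    reason="xnvme_tests_kvs binary not installed")
def test_kvs_io():
    # Each man page above pairs a binary with one sub-command, e.g.
    # "xnvme_tests_kvs kvs_io" behind xnvme_tests_kvs-kvs_io.1;
    # a zero exit code means the test passed.
    result = subprocess.run(["xnvme_tests_kvs", "kvs_io", DEV_URI],
                            capture_output=True, text=True)
    assert result.returncode == 0, result.stderr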
00:46:58.926 -rw-r--r-- vagrant/vagrant 1455 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_lblk-io.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1379 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_lblk-scopy.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1479 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_lblk-write_uncorrectable.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1449 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_lblk-write_zeroes.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1016 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_lblk.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1326 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_map-mem_map_unmap.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 738 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_map.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1341 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_scc-idfy.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1499 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_scc-scopy-msrc.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1489 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_scc-scopy.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1339 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_scc-support.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1126 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_scc.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 571 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_xnvme_cli-check-opt-attr.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 696 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_xnvme_cli-copy-xnvme_cli_run.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 859 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_xnvme_cli.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 633 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_xnvme_file-file-trunc.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 627 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_xnvme_file-write-fsync.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 846 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_xnvme_file.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1497 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_append-verify.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 761 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_append.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 676 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_explicit_open-test_open_zdptr.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 743 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_explicit_open.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1399 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_state-changes.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1521 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_state-transition.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 853 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_state.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1470 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa-flush-explicit.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1470 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa-flush-implicit.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1440 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa-flush.1 
00:46:58.926 -rw-r--r-- vagrant/vagrant 1390 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa-idfy.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1488 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa-open-with-zrwa.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1516 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa-open-without-zrwa.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1456 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa-support.1 00:46:58.926 -rw-r--r-- vagrant/vagrant 1277 2024-06-07 12:49 spdk-test_gen_spec/xnvme/man/tests/xnvme_tests_znd_zrwa.1 00:46:58.926 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/ 00:46:58.926 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/ 00:46:58.926 -rw-r--r-- vagrant/vagrant 11701 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/LICENSE 00:46:58.926 -rw-r--r-- vagrant/vagrant 3557 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/Makefile 00:46:58.926 -rw-r--r-- vagrant/vagrant 663 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/README.rst 00:46:58.926 -rw-r--r-- vagrant/vagrant 104 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/pyproject.toml 00:46:58.927 -rw-r--r-- vagrant/vagrant 71 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/requirements.txt 00:46:58.927 -rw-r--r-- vagrant/vagrant 1057 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/setup.py 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/auxiliary/ 00:46:58.927 -rwxr-xr-x vagrant/vagrant 3175 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/auxiliary/patch_ctypes_bindings.py 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/bin/ 00:46:58.927 -rwxr-xr-x vagrant/vagrant 1114 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/bin/xpy_dev_open 00:46:58.927 -rwxr-xr-x vagrant/vagrant 812 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/bin/xpy_enumerate 00:46:58.927 -rwxr-xr-x vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/bin/xpy_libconf 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/xnvme/ 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/xnvme/ctypes_bindings/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 97 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/xnvme/ctypes_bindings/__init__.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 1679 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/xnvme/ctypes_bindings/library_loader.py 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/xnvme/ctypes_bindings/tests/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/xnvme/ctypes_bindings/tests/__init__.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 784 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/bindings/xnvme/ctypes_bindings/tests/test_loader.py 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 12195 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/conftest.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 33 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/requirements.txt 00:46:58.927 -rw-r--r-- vagrant/vagrant 3789 2024-06-07 12:49 
spdk-test_gen_spec/xnvme/python/tests/test_basics.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 889 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/test_buf.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 1231 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/test_enum.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 1282 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/test_io.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 17520 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/test_key_value.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 13554 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/test_lblk.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 17403 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/test_linux_hugepage.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 1868 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/utils.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 3089 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/tests/xnvme_be_combinations.py 00:46:58.927 -rw-r--r-- vagrant/vagrant 37 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/.gitignore 00:46:58.927 -rw-r--r-- vagrant/vagrant 2538 2024-06-07 12:49 spdk-test_gen_spec/xnvme/python/README.rst 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.github/ 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.github/codeql/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 104 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.github/codeql/codeql-config.yml 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.github/workflows/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 30257 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.github/workflows/verify.yml 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 48 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/.gitignore 00:46:58.927 -rw-r--r-- vagrant/vagrant 65 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/Cargo.toml 00:46:58.927 -rw-r--r-- vagrant/vagrant 506 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/README.rst 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme-sys/ 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme-sys/examples/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme-sys/examples/xnvme_rust_example01.rs 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme-sys/src/ 00:46:58.927 -rwxr-xr-x vagrant/vagrant 145 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme-sys/src/lib.rs 00:46:58.927 -rw-r--r-- vagrant/vagrant 377 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme-sys/Cargo.toml 00:46:58.927 -rwxr-xr-x vagrant/vagrant 1691 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme-sys/build.rs 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme/ 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme/src/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 77 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme/src/main.rs 00:46:58.927 -rwxr-xr-x vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/xnvme/rust/xnvme/Cargo.toml 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.reuse/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 1336 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.reuse/dep5 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 
12:49 spdk-test_gen_spec/xnvme/subprojects/ 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/ 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 2319 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/meson.build 00:46:58.927 -rw-r--r-- vagrant/vagrant 74 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/meson_options.txt 00:46:58.927 -rw-r--r-- vagrant/vagrant 389 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/spdk_win_build.bat 00:46:58.927 -rw-r--r-- vagrant/vagrant 133 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/spdk_win_patches.sh 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/patches/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 1644 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/patches/spdk-win-0001-nvme-pcie-add-NVMe-PCIe-Driver-registration-hook.patch 00:46:58.927 -rw-r--r-- vagrant/vagrant 18646 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/patches/spdk-win-0002-nvme-add-iovec-passthru.patch 00:46:58.927 -rw-r--r-- vagrant/vagrant 4514 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk-win/patches/spdk-win-0003-lib-nvme-restore-spdk_nvme_ctrlr_get_registers.patch 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 4491 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/meson.build 00:46:58.927 -rw-r--r-- vagrant/vagrant 74 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/meson_options.txt 00:46:58.927 -rwxr-xr-x vagrant/vagrant 407 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/spdk_configure.sh 00:46:58.927 -rwxr-xr-x vagrant/vagrant 464 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/spdk_configure_debug.sh 00:46:58.927 -rwxr-xr-x vagrant/vagrant 129 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/spdk_patches.sh 00:46:58.927 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/ 00:46:58.927 -rw-r--r-- vagrant/vagrant 1116 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0001-nvme-add-struct-spdk_nvme_iocs_vector-definition.patch 00:46:58.927 -rw-r--r-- vagrant/vagrant 2949 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0002-nvmf-add-IDENTIFY_IOCS-cns-1Ch-emulation.patch 00:46:58.928 -rw-r--r-- vagrant/vagrant 3485 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0003-nvmf-add-nvmf_passthru_admin_cmd_for_ctrlr.patch 00:46:58.928 -rw-r--r-- vagrant/vagrant 18785 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0004-nvme-add-iovec-passthru.patch 00:46:58.928 -rw-r--r-- vagrant/vagrant 2415 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0005-lib-bdev-Add-spdk_bdev_is_kv-to-bdev-interface.patch 00:46:58.928 -rw-r--r-- vagrant/vagrant 1604 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0006-nvmf-Support-adding-KV-NS-to-nvmf_tgt.patch 00:46:58.928 -rw-r--r-- vagrant/vagrant 
1963 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0007-lib-nvmf-Add-I-O-cmd-passthru-for-KV.patch 00:46:58.928 -rw-r--r-- vagrant/vagrant 1867 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/spdk/patches/spdk-0008-lib-nvmf-Add-identify_-_iocs-passthru-for-KV.patch 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/wpdk/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 1265 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/wpdk/meson.build 00:46:58.928 -rw-r--r-- vagrant/vagrant 188 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/packagefiles/wpdk/meson_options.txt 00:46:58.928 -rw-r--r-- vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/spdk-win.wrap 00:46:58.928 -rw-r--r-- vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/spdk.wrap 00:46:58.928 -rw-r--r-- vagrant/vagrant 259 2024-06-07 12:49 spdk-test_gen_spec/xnvme/subprojects/wpdk.wrap 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/LICENSES/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 1460 2024-06-07 12:49 spdk-test_gen_spec/xnvme/LICENSES/BSD-3-Clause.txt 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 3183 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/async_intf.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 2967 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/buf.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 1171 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/cli.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 7024 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/enum.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 13306 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/ioworker.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 2929 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/kvs.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 17812 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/lblk.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 2172 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/map.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 635 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/meson.build 00:46:58.928 -rw-r--r-- vagrant/vagrant 10225 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/scc.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 2344 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/xnvme_cli.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 4282 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/xnvme_file.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 5175 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/znd_append.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 4151 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/znd_explicit_open.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 3704 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/znd_state.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 11387 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tests/znd_zrwa.c 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/ 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/_kdebs/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/_kdebs/.keep 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/auxiliary/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 1872 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/auxiliary/plot-legends.yaml 00:46:58.928 -rw-r--r-- vagrant/vagrant 148 2024-06-07 12:49 
spdk-test_gen_spec/xnvme/cijoe/auxiliary/plot-limits-4k-aio.yaml 00:46:58.928 -rw-r--r-- vagrant/vagrant 148 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/auxiliary/plot-limits-4k.yaml 00:46:58.928 -rw-r--r-- vagrant/vagrant 148 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/auxiliary/plot-limits-512.yaml 00:46:58.928 -rw-r--r-- vagrant/vagrant 485 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/auxiliary/xnvme_linking.c 00:46:58.928 -rw-r--r-- vagrant/vagrant 1653 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/auxiliary/xnvme_loading.c 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/configs/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 4668 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/configs/bench-amd.toml 00:46:58.928 -rw-r--r-- vagrant/vagrant 4437 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/configs/bench-intel.toml 00:46:58.928 -rw-r--r-- vagrant/vagrant 5486 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/configs/debian-bullseye.toml 00:46:58.928 -rw-r--r-- vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/configs/default-config.toml 00:46:58.928 -rw-r--r-- vagrant/vagrant 3135 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/configs/freebsd-13.toml 00:46:58.928 -rw-r--r-- vagrant/vagrant 865 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/configs/ramdisk.toml 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 3356 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/bdevperf.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 13480 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/bench_plotter.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 2217 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/bench_reporter.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1026 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/fio_build.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1716 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/gha_docgen.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1382 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/gha_prepare_aux.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1164 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/gha_prepare_hugetlbfs.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 2482 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/gha_prepare_python.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1459 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/gha_prepare_source.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1719 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/git_push_checkout.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1365 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/liburing_build.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 2041 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/linux_build_kdebs_tweaked.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 748 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/linux_prepare_nvme.py 00:46:58.928 -rwxr-xr-x vagrant/vagrant 4676 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/spdk_bdev_confs_generator.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1973 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/spdk_build_modded.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 832 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_bindings_py_build.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 669 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_bindings_py_test.py 00:46:58.928 -rw-r--r-- 
vagrant/vagrant 904 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_build.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1113 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_build_prep.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 753 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_clean.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 385 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_freebsd_sysinfo.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 6828 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_guest_start_nvme.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 894 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_install.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 585 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/scripts/xnvme_kldconfig.py 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/ 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/perf_report/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 137 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/perf_report/Makefile 00:46:58.928 -rw-r--r-- vagrant/vagrant 585 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/perf_report/README.rst 00:46:58.928 -rw-r--r-- vagrant/vagrant 6545 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/perf_report/bench.jinja2.rst 00:46:58.928 -rw-r--r-- vagrant/vagrant 962 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/perf_report/cover.jinja2.tmpl 00:46:58.928 -rw-r--r-- vagrant/vagrant 625 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/perf_report/style.yaml 00:46:58.928 -rw-r--r-- vagrant/vagrant 49223 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/templates/perf_report/xnvme.png 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/ 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/examples/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/examples/__init__.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/examples/test_xnvme_hello.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 431 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/examples/test_xnvme_io_async.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 944 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/examples/test_zoned_io_async.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1171 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/examples/test_zoned_io_sync.py 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/linkandload/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 3088 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/linkandload/test_xnvme_consumption.py 00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/ 00:46:58.928 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/__init__.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 390 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_async_intf.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 453 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_buf.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 548 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_cli.py 00:46:58.928 -rw-r--r-- vagrant/vagrant 1448 2024-06-07 12:49 
spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_enum.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 4066 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_ioworker.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 538 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_kvs.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 1372 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_lblk.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 365 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_map.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 2739 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_scc.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 525 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_xnvme_file.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 247 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_znd_append.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 403 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_znd_state.py
00:46:58.928 -rw-r--r-- vagrant/vagrant 3706 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/logic/test_xnvme_tests_znd_zrwa.py
00:46:58.928 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/selftest/
00:46:58.929 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/selftest/__init__.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 32 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/selftest/test_nop.py
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/
00:46:58.929 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/__init__.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 3172 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_fdp.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 3420 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_kvs.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 1437 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_lblk.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 2738 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_xdd.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 10140 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_xnvme.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_xnvme_file.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 1642 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_xnvme_library_info.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 4770 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/tools/test_zoned.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/__init__.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 12179 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/conftest.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 3089 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/xnvme_be_combinations.py
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/apps/
00:46:58.929 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/apps/__init__.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 818 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/apps/test_fio.py
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/bindings/
00:46:58.929 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/bindings/__init__.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 665 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/tests/bindings/test_py_bindings.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 37 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/.gitignore
00:46:58.929 -rw-r--r-- vagrant/vagrant 1931 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/Makefile
00:46:58.929 -rw-r--r-- vagrant/vagrant 248 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/README.rst
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/
00:46:58.929 -rw-r--r-- vagrant/vagrant 1648 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/bench.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 1028 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/bootimg-debian-bullseye-amd64.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 1727 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/bootimg-freebsd-13-amd64.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 356 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/build-kdebs.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 1735 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/dev-python.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 1329 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/dev-sync-and-build.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 1214 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/docgen.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 1289 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/provision-git.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 1146 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/provision.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 673 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/test-debian-bullseye.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 242 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/test-freebsd-13.yaml
00:46:58.929 -rw-r--r-- vagrant/vagrant 518 2024-06-07 12:49 spdk-test_gen_spec/xnvme/cijoe/workflows/test-ramdisk.yaml
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/alpine-latest-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1155 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/alpine-latest.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/archlinux-latest-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1013 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/archlinux-latest.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/centos-stream9-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1751 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/centos-stream9.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1283 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/debian-bookworm.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/debian-bullseye.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1283 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/debian-trixie.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 240 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/default-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 550 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/docgen.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/fedora-36-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1465 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/fedora-36.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/fedora-37-build.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1058 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/fedora-37.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/fedora-38-build.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1058 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/fedora-38.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 617 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/freebsd-13.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 290 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/gentoo-latest-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1251 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/gentoo-latest.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 310 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/macos-11-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 736 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/macos-11.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 310 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/macos-12-build.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 736 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/macos-12.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 736 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/macos-13.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1128 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/opensuse-tumbleweed-latest.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/oraclelinux-9-build.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1740 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/oraclelinux-9.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/rockylinux-9.2-build.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1722 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/rockylinux-9.2.sh
00:46:58.929 -rwxr-xr-x vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/ubuntu-focal.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/ubuntu-jammy.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1283 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/ubuntu-lunar.sh
00:46:58.929 -rw-r--r-- vagrant/vagrant 1740 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/windows-2022.bat
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/
00:46:58.929 -rw-r--r-- vagrant/vagrant 966 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/README.rst
00:46:58.929 -rw-r--r-- vagrant/vagrant 12760 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/deps.yaml
00:46:58.929 -rwxr-xr-x vagrant/vagrant 5480 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/emit.py
00:46:58.929 -rw-r--r-- vagrant/vagrant 14 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/requirements.txt
00:46:58.929 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/
00:46:58.929 -rw-r--r-- vagrant/vagrant 149 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-apk.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 414 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-apt.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 287 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-brew.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 2289 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-choco.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 268 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-dnf-centos-stream9.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 293 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-dnf-oraclelinux-9.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 275 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-dnf-rockylinux-9.2.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 156 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-dnf.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 256 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-emerge.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 191 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-pacman.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 257 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-pip-centos-stream9.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 249 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-pip-gentoo-latest.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 201 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-pip.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 264 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-pipx.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 380 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-pkg.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 470 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-yum.sh.jinja
00:46:58.929 -rwxr-xr-x vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/pkgman-zypper.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 112 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/script-bash.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 166 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/script-batch.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 104 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/script-tcsh.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-fio.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-liburing-centos-stream9.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-liburing-fedora-36.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-liburing-oraclelinux-9.sh.jinja
00:46:58.929 -rw-r--r-- vagrant/vagrant 436 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-liburing-rockylinux-9.2.sh.jinja
00:46:58.930 -rw-r--r-- vagrant/vagrant 393 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-liburing.sh.jinja
00:46:58.930 -rw-r--r-- vagrant/vagrant 497 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-libvfn.sh.jinja
00:46:58.930 -rw-r--r-- vagrant/vagrant 764 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pkgs/emitter/templates/src-python3.sh.jinja
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/python3/
00:46:58.930 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/python3/.keep
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/windows/
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/windows/sys/
00:46:58.930 -rw-r--r-- vagrant/vagrant 27726 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/windows/sys/queue.h
00:46:58.930 -rw-r--r-- vagrant/vagrant 270 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/windows/sys/uio.h
00:46:58.930 -rw-r--r-- vagrant/vagrant 108 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/.gitignore
00:46:58.930 -rw-r--r-- vagrant/vagrant 1992 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/README.rst
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/fio/
00:46:58.930 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/fio/.keep
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/liburing/
00:46:58.930 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/liburing/.keep
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/libvfn/
00:46:58.930 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/third-party/libvfn/.keep
00:46:58.930 -rw-r--r-- vagrant/vagrant 506 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/Dockerfile
00:46:58.930 -rw-r--r-- vagrant/vagrant 433 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/clang-format-c
00:46:58.930 -rw-r--r-- vagrant/vagrant 496 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/clang-format-h
00:46:58.930 -rw-r--r-- vagrant/vagrant 418 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/meson.build
00:46:58.930 -rwxr-xr-x vagrant/vagrant 1441 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/meson_dist_archive_fixer.py
00:46:58.930 -rwxr-xr-x vagrant/vagrant 4593 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/meson_dist_deb_build.py
00:46:58.930 -rwxr-xr-x vagrant/vagrant 2407 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pcf_clang_format.py
00:46:58.930 -rwxr-xr-x vagrant/vagrant 1186 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/pre-commit-check.sh
00:46:58.930 -rwxr-xr-x vagrant/vagrant 3159 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/print_help.py
00:46:58.930 -rwxr-xr-x vagrant/vagrant 26529 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/xnvme-driver.sh
00:46:58.930 -rwxr-xr-x vagrant/vagrant 5355 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/xnvme_libconf.py
00:46:58.930 -rwxr-xr-x vagrant/vagrant 1568 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/xnvme_ver.py
00:46:58.930 -rwxr-xr-x vagrant/vagrant 13781 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/xnvmec_generator.py
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/examples/
00:46:58.930 -rw-r--r-- vagrant/vagrant 405 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/examples/meson.build
00:46:58.930 -rw-r--r-- vagrant/vagrant 916 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/examples/xnvme_hello-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1157 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/examples/xnvme_io_async-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1315 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/examples/zoned_io_async-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1259 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/examples/zoned_io_sync-completions
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/
00:46:58.930 -rw-r--r-- vagrant/vagrant 1341 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/meson.build
00:46:58.930 -rw-r--r-- vagrant/vagrant 1025 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_async_intf-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1116 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_buf-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 963 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_cli-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1095 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_enum-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1135 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_ioworker-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 965 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_kvs-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1396 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_lblk-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 966 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_map-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1334 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_scc-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1012 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_xnvme_cli-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 995 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_xnvme_file-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1002 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_znd_append-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 981 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_znd_explicit_open-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1133 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_znd_state-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1859 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tests/xnvme_tests_znd_zrwa-completions
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/
00:46:58.930 -rw-r--r-- vagrant/vagrant 1517 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/kvs-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 2298 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/lblk-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 511 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/meson.build
00:46:58.930 -rw-r--r-- vagrant/vagrant 1021 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/xdd-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 5211 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/xnvme-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 1678 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/xnvme_file-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 2872 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/tools/zoned-completions
00:46:58.930 -rw-r--r-- vagrant/vagrant 378 2024-06-07 12:49 spdk-test_gen_spec/xnvme/toolbox/bash_completion.d/README.rst
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/
00:46:58.930 -rw-r--r-- vagrant/vagrant 22 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/00_make.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 2334 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/00_make.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/10_make.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 1 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/10_make.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 847 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/contributing-branches.rst
00:46:58.930 -rw-r--r-- vagrant/vagrant 5360 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/contributing-conventions.rst
00:46:58.930 -rw-r--r-- vagrant/vagrant 3355 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/contributing-process.rst
00:46:58.930 -rw-r--r-- vagrant/vagrant 1825 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/contributing-toolbox.rst
00:46:58.930 -rw-r--r-- vagrant/vagrant 1224 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/index.rst
00:46:58.930 -rw-r--r-- vagrant/vagrant 2126 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/make.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 2554 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/release-checklist.rst
00:46:58.930 -rwxr-xr-x vagrant/vagrant 450 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/contributing/strip_fchars.py
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/examples/
00:46:58.930 -rw-r--r-- vagrant/vagrant 690 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/examples/index.rst
00:46:58.930 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/
00:46:58.930 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/000_xnvme_driver.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 119 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/000_xnvme_driver.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 26 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/010_xnvme_driver.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 117 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/010_xnvme_driver.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 74 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/100_compile.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 1 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/100_compile.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 23 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/110_run.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 606 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/110_run.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 27 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/200_dmesg.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 36 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/200_dmesg.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 39 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/300_find.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 577 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/300_find.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/400_xnvme_enum.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 138 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/400_xnvme_enum.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 50 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/410_xnvme_info.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 739 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/410_xnvme_info.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/500_xnvme_driver_reset.cmd
00:46:58.930 -rw-r--r-- vagrant/vagrant 106 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/500_xnvme_driver_reset.out
00:46:58.930 -rw-r--r-- vagrant/vagrant 176 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/build_freebsd.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 186 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/build_linux.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 263 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/build_meson.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 314 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/build_windows.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 358 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/clone.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 325 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/hello.c
00:46:58.931 -rw-r--r-- vagrant/vagrant 68 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/hello_00.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/hello_00.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 23 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/hello_01.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 606 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/hello_01.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 27824 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/pkg_config_00.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 125 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/pkg_config_00.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 20541 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/toolchain.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 191 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_info_block.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 618 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_info_block.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_info_default.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 604 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_info_default.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 217 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_info_zoned.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 610 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_info_zoned.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 67 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_io_async_read_emu.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 394 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_io_async_read_emu.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 72 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_io_async_read_io_uring.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_io_async_read_io_uring.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 70 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_io_async_read_libaio.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_io_async_read_libaio.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_library-info.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1149 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_library-info.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 33 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_info_default.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 674 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_info_default.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 27 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_info_fs.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 593 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_info_fs.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 76 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_io_async_read_io_ring.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_io_async_read_io_ring.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 73 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_io_async_read_iocp.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_io_async_read_iocp.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 76 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_io_async_read_iocp_th.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 396 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/getting_started/xnvme_win_io_async_read_iocp_th.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/python/
00:46:58.931 -rw-r--r-- vagrant/vagrant 6477 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/python/index.rst
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/file/
00:46:58.931 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/file/file_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 796 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/file/file_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 87 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/file/index.rst
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/fio/
00:46:58.931 -rw-r--r-- vagrant/vagrant 4072 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/fio/index.rst
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/
00:46:58.931 -rw-r--r-- vagrant/vagrant 990 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 17 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_enum_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 475 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_enum_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 17 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_idfy_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 778 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_idfy_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 23 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_info_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 604 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_info_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 17 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_read_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1052 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_read_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 12 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 628 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_write_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1053 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/lblk/lblk_write_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 386 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/index.rst
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/
00:46:58.931 -rw-r--r-- vagrant/vagrant 269 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 17 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/xdd_async_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 692 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/xdd_async_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 16 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/xdd_sync_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 619 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/xdd_sync_usage.out
00:46:58.931 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/xdd_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 355 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xdd/xdd_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-set/
00:46:58.931 -rw-r--r-- vagrant/vagrant 225 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-set/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 25 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-set/xnvme_feature_set_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 933 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-set/xnvme_feature_set_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/padc/
00:46:58.931 -rw-r--r-- vagrant/vagrant 328 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/padc/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/padc/xnvme_padc_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1127 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/padc/xnvme_padc_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/format/
00:46:58.931 -rw-r--r-- vagrant/vagrant 188 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/format/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 20 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/format/xnvme_format_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1141 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/format/xnvme_format_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/pioc/
00:46:58.931 -rw-r--r-- vagrant/vagrant 581 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/pioc/example.sh
00:46:58.931 -rw-r--r-- vagrant/vagrant 293 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/pioc/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/pioc/xnvme_pioc_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1124 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/pioc/xnvme_pioc_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ctrlr/
00:46:58.931 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ctrlr/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ctrlr/xnvme_idfy_ctrlr_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 699 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ctrlr/xnvme_idfy_ctrlr_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/sanitize/
00:46:58.931 -rw-r--r-- vagrant/vagrant 158 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/sanitize/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 22 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/sanitize/xnvme_sanitize_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 620 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/sanitize/xnvme_sanitize_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ns/
00:46:58.931 -rw-r--r-- vagrant/vagrant 224 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ns/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 21 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ns/xnvme_idfy_ns_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 775 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy-ns/xnvme_idfy_ns_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy/
00:46:58.931 -rw-r--r-- vagrant/vagrant 220 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy/index.rst
00:46:58.931 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy/xnvme_idfy_usage.cmd
00:46:58.931 -rw-r--r-- vagrant/vagrant 1022 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/idfy/xnvme_idfy_usage.out
00:46:58.931 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/info/
00:46:58.931 -rw-r--r-- vagrant/vagrant 193 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/info/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/info/xnvme_info.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 604 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/info/xnvme_info.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/info/xnvme_info_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 650 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/info/xnvme_info_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 364 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 13 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/xnvme_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 1144 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/xnvme_usage.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/library-info/
00:46:58.932 -rw-r--r-- vagrant/vagrant 238 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/library-info/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/library-info/xnvme_library_info.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 1149 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/library-info/xnvme_library_info.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-erri/
00:46:58.932 -rw-r--r-- vagrant/vagrant 225 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-erri/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 22 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-erri/xnvme_log_erri_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 829 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-erri/xnvme_log_erri_usage.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/enum/
00:46:58.932 -rw-r--r-- vagrant/vagrant 251 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/enum/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/enum/xnvme_enum_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 464 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/enum/xnvme_enum_usage.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-health/
00:46:58.932 -rw-r--r-- vagrant/vagrant 249 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-health/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-health/xnvme_log_health_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 772 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log-health/xnvme_log_health_usage.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-get/
00:46:58.932 -rw-r--r-- vagrant/vagrant 225 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-get/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 25 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-get/xnvme_feature_get_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 919 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/feature-get/xnvme_feature_get_usage.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log/
00:46:58.932 -rw-r--r-- vagrant/vagrant 200 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 17 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log/xnvme_log_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 1045 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/xnvme/log/xnvme_log_usage.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/
00:46:58.932 -rw-r--r-- vagrant/vagrant 3671 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/index.rst
00:46:58.932 -rw-r--r-- vagrant/vagrant 270 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_append.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 1223 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_append.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 20 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_append_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 1038 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_append_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 27 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_changes.uone.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 205 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_changes.uone.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 21 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_changes_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 685 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_changes_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 11 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_enum.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 79 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_enum.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_enum_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 461 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_enum_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 26 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_errors.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 181 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_errors.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 20 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_errors_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 756 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_errors_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_info.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 593 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_info.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_info_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 613 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_info_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 41 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 78 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_close_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 893 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_close_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 25 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_finish_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 895 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_finish_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 23 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_open_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 891 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_open_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 18 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 971 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_mgmt_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 2450 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_props.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 43 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_read.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 117 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_read.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 13 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_read_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 911 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_read_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 26 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_report.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 246061 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_report.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 48 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_report_range.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 1491 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_report_range.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 20 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_report_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 886 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_report_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 13 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 911 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_usage.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 85 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_write.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 195 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_write.out
00:46:58.932 -rw-r--r-- vagrant/vagrant 13 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_write_usage.cmd
00:46:58.932 -rw-r--r-- vagrant/vagrant 911 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tools/zoned/zoned_write_usage.out
00:46:58.932 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/_static/
00:46:58.932 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/_static/.keep
00:46:58.932 -rw-r--r-- vagrant/vagrant 10057 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/_static/fabrics.png
00:46:58.932 -rw-r--r-- vagrant/vagrant 402 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/_static/theme_overrides.css
00:46:58.932 -rw-r--r-- vagrant/vagrant 1495421 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/_static/xnvme-ci-overview.drawio.png
00:46:58.933 -rw-r--r-- vagrant/vagrant 540365 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/_static/xnvme-ci-overview.png
00:46:58.933 -rw-r--r-- vagrant/vagrant 49223 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/_static/xnvme-logo-medium.png
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/devs/
00:46:58.933 -rw-r--r-- vagrant/vagrant 14051 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/devs/index.rst
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/
00:46:58.933 -rw-r--r-- vagrant/vagrant 78 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/000_compile_example.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 1 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/000_compile_example.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 70 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/010_run_c_example.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 143 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/010_run_c_example.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 57 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/020_run_python_example.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 143 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/020_run_python_example.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 1692 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/enumerate_example.c
00:46:58.933 -rw-r--r-- vagrant/vagrant 1764 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/enumerate_example.py
00:46:58.933 -rw-r--r-- vagrant/vagrant 2963 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/dynamic_loading/index.rst
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/
00:46:58.933 -rwxr-xr-x vagrant/vagrant 450 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/fabrics_env.sh
00:46:58.933 -rwxr-xr-x vagrant/vagrant 121 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/fabrics_initiator_modules.sh
00:46:58.933 -rwxr-xr-x vagrant/vagrant 936 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/fabrics_initiator_nvmecli.sh
00:46:58.933 -rwxr-xr-x vagrant/vagrant 619 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/fabrics_initiator_xnvme.sh
00:46:58.933 -rwxr-xr-x vagrant/vagrant 1718 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/fabrics_target_linux.sh
00:46:58.933 -rwxr-xr-x vagrant/vagrant 113 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/fabrics_target_modules.sh
00:46:58.933 -rwxr-xr-x vagrant/vagrant 1586 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/fabrics_target_spdk.sh
00:46:58.933 -rw-r--r-- vagrant/vagrant 6541 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fabrics/index.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 489 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/index.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 2627 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/io_interfaces.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 3856 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/library.rst
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/
00:46:58.933 -rw-r--r-- vagrant/vagrant 54 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/010_xnvme_feature_get.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 72 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/010_xnvme_feature_get.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 72 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/020_xnvme_set_fdp_events.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 71 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/020_xnvme_set_fdp_events.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 77 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/030_xnvme_feature_get.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 249 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/030_xnvme_feature_get.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 38 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/040_xnvme_fdp_ruhs.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 261 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/040_xnvme_fdp_ruhs.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 38 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/050_xnvme_fdp_ruhu.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 17 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/050_xnvme_fdp_ruhu.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 62 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/100_xnvme_log_fdp_config.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 370 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/100_xnvme_log_fdp_config.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 43 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/200_xnvme_log_fdp_stats.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 86 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/200_xnvme_log_fdp_stats.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 48 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/300_xnvme_log_ruhu.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 121 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/300_xnvme_log_ruhu.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 75 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/400_xnvme_log_fdp_events.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 208 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/400_xnvme_log_fdp_events.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 6172 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/fdp/index.rst
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/nvme/
00:46:58.933 -rw-r--r-- vagrant/vagrant 232 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/nvme/index.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/nvme/nvm_extended_lba.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 7397 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/nvme/nvme_cmdsets.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 27 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/nvme/100_xnvme_idfy_cs.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 65 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/tutorial/nvme/100_xnvme_idfy_cs.out
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/
00:46:58.933 -rw-r--r-- vagrant/vagrant 5 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/.gitignore
00:46:58.933 -rw-r--r-- vagrant/vagrant 1594 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/Makefile
00:46:58.933 -rw-r--r-- vagrant/vagrant 3821 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/README.rst
00:46:58.933 -rwxr-xr-x vagrant/vagrant 4793 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/apigen.py
00:46:58.933 -rw-r--r-- vagrant/vagrant 1280 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/conf.py
00:46:58.933 -rwxr-xr-x vagrant/vagrant 1932 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/dest.py
00:46:58.933 -rw-r--r-- vagrant/vagrant 82790 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/doxy.cfg
00:46:58.933 -rwxr-xr-x vagrant/vagrant 1481 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/getting_started.py
00:46:58.933 -rw-r--r-- vagrant/vagrant 101 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/requirements.txt
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/diagrams/
00:46:58.933 -rw-r--r-- vagrant/vagrant 2622 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/diagrams/fabrics.dia
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/templates/
00:46:58.933 -rw-r--r-- vagrant/vagrant 975 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/templates/api_section.jinja
00:46:58.933 -rw-r--r-- vagrant/vagrant 3504 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/autogen/templates/toolchain.rst.jinja
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/
00:46:58.933 -rw-r--r-- vagrant/vagrant 3490 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/index.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 24 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_cli.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 630 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_cli.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 1052 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_fbsd.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 1748 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_intf.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 1993 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_linux.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 1351 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_windows.rst
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/
00:46:58.933 -rw-r--r-- vagrant/vagrant 26 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/010_xnvme_driver.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 117 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/010_xnvme_driver.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 177 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/100_compile.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 1 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/100_compile.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 23 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/110_run.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 653 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/110_run.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 48 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/115_xnvme_info.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 786 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/115_xnvme_info.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/120_xnvme_driver_reset.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 106 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/120_xnvme_driver_reset.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 27 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/200_dmesg.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 36 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/200_dmesg.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 39 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/300_find.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 577 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/300_find.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 26 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/400_xnvme_driver.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 117 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/400_xnvme_driver.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 19 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/401_xnvme_driver_reset.cmd
00:46:58.933 -rw-r--r-- vagrant/vagrant 106 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/401_xnvme_driver_reset.out
00:46:58.933 -rw-r--r-- vagrant/vagrant 339 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/hello.c
00:46:58.933 -rw-r--r-- vagrant/vagrant 5906 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/backends/xnvme_be_spdk/index.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 2308 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/index.rst
00:46:58.933 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/
00:46:58.933 -rw-r--r-- vagrant/vagrant 1762 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/index.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 4318 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 1743 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_adm.rst
00:46:58.933 -rw-r--r-- vagrant/vagrant 2251 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_buf.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 4036 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_cli.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 1700 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_cmd.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 2255 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_dev.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 1103 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_file.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 786 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_geo.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 989 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_ident.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 937 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_kvs.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 1919 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_lba.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 624 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_libconf.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 560 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_mem.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 1823 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_nvm.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 867 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_opts.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 1863 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_queue.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 28923 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_spec.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 905 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_ver.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 2714 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/capis/xnvme_znd.rst
00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/
00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/build/
00:46:58.934 -rw-r--r-- vagrant/vagrant 178 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/build/index.rst
00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/verify/
00:46:58.934 -rw-r--r-- vagrant/vagrant 2346 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/verify/index.rst
00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/bench/
00:46:58.934 -rw-r--r-- vagrant/vagrant 5853 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/bench/index.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 2707 2024-06-07 12:49 spdk-test_gen_spec/xnvme/docs/ci/index.rst
00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/
00:46:58.934 -rw-r--r-- vagrant/vagrant 11057 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/kvs.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 17644 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/lblk.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 415 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/meson.build
00:46:58.934 -rw-r--r-- vagrant/vagrant 8314 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/xdd.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 40809 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/xnvme.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 24275 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/xnvme_file.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 19053 2024-06-07 12:49 spdk-test_gen_spec/xnvme/tools/zoned.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 30 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.git
00:46:58.934 -rw-r--r-- vagrant/vagrant 394 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.git-blame-ignore-revs
00:46:58.934 -rw-r--r-- vagrant/vagrant 689 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.gitignore
00:46:58.934 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.gitmodules
00:46:58.934 -rw-r--r-- vagrant/vagrant 2405 2024-06-07 12:49 spdk-test_gen_spec/xnvme/.pre-commit-config.yaml
00:46:58.934 -rw-r--r-- vagrant/vagrant 34426 2024-06-07 12:49 spdk-test_gen_spec/xnvme/CHANGELOG.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 203 2024-06-07 12:49 spdk-test_gen_spec/xnvme/CONTRIBUTING.md
00:46:58.934 -rw-r--r-- vagrant/vagrant 1656 2024-06-07 12:49 spdk-test_gen_spec/xnvme/CONTRIBUTORS.md
00:46:58.934 -rw-r--r-- vagrant/vagrant 1196 2024-06-07 12:49 spdk-test_gen_spec/xnvme/ISSUES.rst
00:46:58.934 -rw-r--r-- vagrant/vagrant 1818 2024-06-07 12:49 spdk-test_gen_spec/xnvme/LICENSE
00:46:58.934 -rw-r--r-- vagrant/vagrant 359 2024-06-07 12:49 spdk-test_gen_spec/xnvme/MAINTAINERS.md
00:46:58.934 -rw-r--r-- vagrant/vagrant 13395 2024-06-07 12:49 spdk-test_gen_spec/xnvme/Makefile
00:46:58.934 -rw-r--r-- vagrant/vagrant 2330 2024-06-07 12:49 spdk-test_gen_spec/xnvme/README.md
00:46:58.934 -rw-r--r-- vagrant/vagrant 3488 2024-06-07 12:49 spdk-test_gen_spec/xnvme/build.bat
00:46:58.934 -rwxr-xr-x vagrant/vagrant 1859 2024-06-07 12:49 spdk-test_gen_spec/xnvme/configure
00:46:58.934 -rw-r--r-- vagrant/vagrant 7714 2024-06-07 12:49 spdk-test_gen_spec/xnvme/meson.build
00:46:58.934 -rw-r--r-- vagrant/vagrant 1045 2024-06-07 12:49 spdk-test_gen_spec/xnvme/meson_options.txt
00:46:58.934 -rw-r--r-- vagrant/vagrant 953 2024-06-07 12:49 spdk-test_gen_spec/xnvme/xnvme.spec.in
00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/
00:46:58.934 -rw-r--r-- vagrant/vagrant 509 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/meson.build
00:46:58.934 -rw-r--r-- vagrant/vagrant 796 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/xnvme_dev.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 642 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/xnvme_enum.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 1181 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/xnvme_hello.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 7987 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/xnvme_io_async.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 3488 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/xnvme_single_async.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 1708 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/xnvme_single_sync.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 12445 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/zoned_io_async.c
00:46:58.934 -rw-r--r-- vagrant/vagrant 7671 2024-06-07 12:49 spdk-test_gen_spec/xnvme/examples/zoned_io_sync.c
00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/
00:46:58.934 -rw-r--r-- vagrant/vagrant 1371 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 9451 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_adm.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 860 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_be.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 7061 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_buf.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 14910 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_cli.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 4453 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_cmd.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 5654 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_dev.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 2907 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_file.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1392 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_geo.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1021 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_ident.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 3093 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_kvs.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 3712 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_lba.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1102 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_libconf.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1146 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_mem.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 5637 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_nvm.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 2538 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_opts.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 5885 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_pp.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 4778 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_queue.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 94620 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_spec.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 2292 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_spec_fs.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 7065 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_spec_pp.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 870 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_topology.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 10467 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_util.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1083 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_ver.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 10012 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/libxnvme_znd.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1005 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/meson.build
00:46:58.934 -rw-r--r-- vagrant/vagrant 7301 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1086 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_cbi.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 956 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_fbsd.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 2297 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_linux.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 522 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_linux_libaio.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1092 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_linux_liburing.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 672 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_linux_nvme.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 808 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_macos.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 4648 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_nosys.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 748 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_ramdisk.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 499 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_registry.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1865 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_spdk.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1117 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_vfio.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 2211 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_windows.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1655 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_windows_ioring.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1972 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_be_windows_nvme.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1404 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_cmd.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 1199 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_dev.h
00:46:58.934 -rw-r--r-- vagrant/vagrant 303 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_geo.h
00:46:58.934 -rw-r--r--
vagrant/vagrant 1559 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_queue.h 00:46:58.934 -rw-r--r-- vagrant/vagrant 428 2024-06-07 12:49 spdk-test_gen_spec/xnvme/include/xnvme_spec.h 00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.githooks/ 00:46:58.934 -rwxr-xr-x vagrant/vagrant 834 2024-06-07 12:49 spdk-test_gen_spec/.githooks/pre-commit 00:46:58.934 -rwxr-xr-x vagrant/vagrant 2240 2024-06-07 12:49 spdk-test_gen_spec/.githooks/pre-push 00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/ 00:46:58.934 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/ 00:46:58.934 -rw-r--r-- vagrant/vagrant 1743 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/CMakeLists.txt 00:46:58.935 -rw-r--r-- vagrant/vagrant 2573 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/Makefile 00:46:58.935 -rw-r--r-- vagrant/vagrant 1679 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/win_x64.mak 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/burst-app/ 00:46:58.935 -rw-r--r-- vagrant/vagrant 2883 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/burst-app/CMakeLists.txt 00:46:58.935 -rw-r--r-- vagrant/vagrant 2971 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/burst-app/Makefile 00:46:58.935 -rw-r--r-- vagrant/vagrant 450 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/burst-app/README.md 00:46:58.935 -rw-r--r-- vagrant/vagrant 9192 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/burst-app/main.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 2619 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/burst-app/win_x64.mak 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/cmake/ 00:46:58.935 -rw-r--r-- vagrant/vagrant 2588 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/cmake/unix.cmake 00:46:58.935 -rw-r--r-- vagrant/vagrant 2492 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/cmake/windows-mingw.cmake 00:46:58.935 -rw-r--r-- vagrant/vagrant 2103 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/examples/cmake/windows.cmake 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/ 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/ 00:46:58.935 -rw-r--r-- vagrant/vagrant 93 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/README 00:46:58.935 -rw-r--r-- vagrant/vagrant 1973 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes128_cntr_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 6898 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes128_ecb_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes128_gcm_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1942 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes192_cntr_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1752 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes192_ecb_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes192_gcm_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1942 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes256_cntr_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1752 
2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes256_ecb_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/aes256_gcm_vaes_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 21787 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/mb_mgr_avx2_t2.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 2430 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/mb_mgr_zuc_submit_flush_gfni_avx2.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 2030 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t2/zuc_x8_gfni_avx2.asm 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/ 00:46:58.935 -rw-r--r-- vagrant/vagrant 70 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/README 00:46:58.935 -rw-r--r-- vagrant/vagrant 8897 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/mb_mgr_hmac_sha1_flush_ni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 12168 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/mb_mgr_hmac_sha1_submit_ni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1649 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/mb_mgr_hmac_sha224_flush_ni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1650 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/mb_mgr_hmac_sha224_submit_ni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 8916 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/mb_mgr_hmac_sha256_flush_ni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 11324 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/mb_mgr_hmac_sha256_submit_ni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 21304 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/mb_mgr_sse_t2.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 6854 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha1_ni_one_block_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 7842 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha1_ni_x1_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 15128 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha1_ni_x2_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 9676 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha256_ni_one_block_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 12407 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha256_ni_x1_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 20721 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha256_ni_x2_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 3704 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha_ni_mb_sse.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 3792 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t2/sha_ni_sse.c 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t3/ 00:46:58.935 -rw-r--r-- vagrant/vagrant 102 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t3/README 00:46:58.935 -rw-r--r-- vagrant/vagrant 21727 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t3/mb_mgr_avx2_t3.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 39391 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t3/poly_fma_avx2.asm 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/ 00:46:58.935 -rw-r--r-- vagrant/vagrant 76 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/README 00:46:58.935 
-rw-r--r-- vagrant/vagrant 2293 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes128_cbc_dec_by8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1894 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes128_cbc_enc_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1880 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes128_cbc_mac_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 5723 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes128_ecb_by8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 2294 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes192_cbc_dec_by8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1895 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes192_cbc_enc_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1749 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes192_ecb_by8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 2294 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes256_cbc_dec_by8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1895 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes256_cbc_enc_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1881 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes256_cbc_mac_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1749 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/aes256_ecb_by8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1759 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes128_cbc_enc_flush_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1762 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes128_cbc_enc_submit_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1839 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes128_ccm_auth_submit_flush_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1839 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes128_cmac_submit_flush_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1759 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes192_cbc_enc_flush_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1762 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes192_cbc_enc_submit_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1759 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes256_cbc_enc_flush_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1762 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes256_cbc_enc_submit_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1858 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes256_ccm_auth_submit_flush_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1839 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_aes256_cmac_submit_flush_x8_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 21385 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_sse_t3.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 2398 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/mb_mgr_zuc_submit_flush_gfni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 2066 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t3/zuc_x4_gfni_sse.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 7323 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/CMakeLists.txt 00:46:58.935 -rw-r--r-- vagrant/vagrant 32237 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/Makefile 
00:46:58.935 -rw-r--r-- vagrant/vagrant 123790 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/api_doxygen.conf 00:46:58.935 -rw-r--r-- vagrant/vagrant 159355 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/intel-ipsec-mb.h 00:46:58.935 -rw-r--r-- vagrant/vagrant 33403 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/libIPSec_MB.def 00:46:58.935 -rw-r--r-- vagrant/vagrant 23 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/libipsec-mb-dev.7 00:46:58.935 -rw-r--r-- vagrant/vagrant 6494 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/libipsec-mb.7 00:46:58.935 -rw-r--r-- vagrant/vagrant 29863 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/win_x64.mak 00:46:58.935 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/ 00:46:58.935 -rw-r--r-- vagrant/vagrant 95 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/README 00:46:58.935 -rw-r--r-- vagrant/vagrant 1789 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/aes128_gcm_by8_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1789 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/aes192_gcm_by8_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1789 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/aes256_gcm_by8_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 112794 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/chacha20_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 94341 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/des_x16_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 3401 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_avx512.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 25467 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_avx512_t1.c 00:46:58.935 -rw-r--r-- vagrant/vagrant 18701 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_des_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 12404 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha1_flush_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 14415 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha1_submit_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1652 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha224_flush_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1653 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha224_submit_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 14099 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha256_flush_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 14554 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha256_submit_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1653 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha384_flush_avx512.asm 00:46:58.935 -rw-r--r-- vagrant/vagrant 1654 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha384_submit_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 11955 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha512_flush_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 13038 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_hmac_sha512_submit_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 43191 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/mb_mgr_zuc_submit_flush_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 53128 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/poly_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 16954 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/sha1_x16_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 29396 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/sha256_x16_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 29475 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/sha512_x8_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 4866 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/sha_avx512.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 5308 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/sha_mb_avx512.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 2744 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/snow3g_avx512.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 35227 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/zuc_top_avx512.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 152693 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t1/zuc_x16_avx512.asm 00:46:58.936 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/ 00:46:58.936 -rw-r--r-- vagrant/vagrant 10154 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/aes128_ecbenc_x3.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 15883 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/aes_cmac_subkey_gen.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 19492 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/aes_keyexp_128.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 22141 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/aes_keyexp_192.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 24143 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/aes_keyexp_256.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 4921 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/aes_xcbc_expand_key.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 12180 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/alloc.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 2572 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/atomic.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 34483 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/chacha20_poly1305.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 3526 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/cipher_suite_id.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 4294 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/clear_regs_mem_fns.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 5012 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/const.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 38487 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/constant_lookup_fns.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 7890 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/cpu_feature.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 11222 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/crc32_const.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 3581 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/crc32_refl_const.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 27272 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/des_basic.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 5764 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/des_key.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 11550 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/error.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 25482 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/gcm.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 7947 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/hmac_ipad_opad.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 2813 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/kasumi_iv.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 3605 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/mb_mgr_auto.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 2594 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/mbcpuid.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 9821 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/md5_one_block.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 16368 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/ooo_mgr_reset.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 20244 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/poly1305.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 6947 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/quic_aes_gcm.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 5455 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/quic_chacha20_poly1305.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 3508 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/quic_hp_aes_ecb.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 3038 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/quic_hp_chacha20.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 4273 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/save_xmms.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 73610 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/self_test.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 11116 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/sm3.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 3557 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/snow3g_iv.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 21992 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/snow3g_tables.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 1997 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/version.c 00:46:58.936 -rw-r--r-- vagrant/vagrant 4695 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/wireless_common.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 20999 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/zuc_common.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 3412 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/x86_64/zuc_iv.c 00:46:58.936 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/ 00:46:58.936 -rw-r--r-- vagrant/vagrant 196 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/README 00:46:58.936 -rw-r--r-- vagrant/vagrant 1798 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes128_gcm_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 1802 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes128_gcm_sgl_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 1803 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes128_gmac_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 1798 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes192_gcm_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 
1802 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes192_gcm_sgl_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 1803 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes192_gmac_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 1798 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes256_gcm_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 1802 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes256_gcm_sgl_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 1803 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes256_gmac_api_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 20532 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cbc_dec_by16_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 49081 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cbc_enc_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 2889 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cbcs_dec_by16_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 17669 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cbcs_enc_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 3397 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cntr_api_by16_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 3469 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cntr_bit_api_by16_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 2862 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cntr_ccm_api_by16_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 2642 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_cntr_pon_api_by16_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 42635 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_docsis_dec_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 59275 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_docsis_dec_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 61444 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_docsis_enc_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 89602 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_docsis_enc_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 10402 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_ecb_quic_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 7435 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/aes_ecb_vaes_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 4594 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc16_x25_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 14192 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc32_by16_vclmul_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 9727 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc32_fp_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 7214 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc32_iuup_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 7178 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc32_lte_avx512.asm 00:46:58.936 -rw-r--r-- vagrant/vagrant 12901 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc32_refl_by16_vclmul_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 4595 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc32_sctp_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 7199 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/crc32_wimax_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 5749 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/ethernet_fcs_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 9539 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes128_cbc_enc_flush_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 9421 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes128_cbc_enc_submit_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 8957 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes128_cbcs_1_9_flush_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 8520 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes128_cbcs_1_9_submit_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 25580 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes128_ccm_auth_submit_flush_x16_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 19511 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes128_cmac_submit_flush_x16_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 14434 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes128_xcbc_submit_flush_x16_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1778 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes192_cbc_enc_flush_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1775 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes192_cbc_enc_submit_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1778 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes256_cbc_enc_flush_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1775 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes256_cbc_enc_submit_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1892 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes256_ccm_auth_submit_flush_x16_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1952 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_aes256_cmac_submit_flush_x16_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 26930 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_avx512_t2.c 00:46:58.937 -rw-r--r-- vagrant/vagrant 17923 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_snow3g_uea2_submit_flush_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 10001 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_snow3g_uia2_submit_flush_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 3142 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/mb_mgr_zuc_submit_flush_gfni_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 64605 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/poly_fma_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 16809 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/pon_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 24295 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/snow3g_uia2_by32_vaes_avx512.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 2824 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx512_t2/zuc_x16_vaes_avx512.asm 00:46:58.937 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/ 00:46:58.937 -rw-r--r-- vagrant/vagrant 23 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/README 00:46:58.937 -rw-r--r-- vagrant/vagrant 2297 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_cbc_dec_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1886 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_cbc_enc_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1872 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_cbc_mac_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 2442 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_cbcs_1_9_dec_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 2131 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_cbcs_1_9_enc_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 15484 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_cntr_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1722 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_cntr_ccm_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 5717 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_ecb_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1889 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes128_xcbc_mac_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 2297 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes192_cbc_dec_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1887 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes192_cbc_enc_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 12287 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes192_cntr_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1749 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes192_ecb_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 2297 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes256_cbc_dec_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1887 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes256_cbc_enc_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1873 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes256_cbc_mac_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 14817 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes256_cntr_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1722 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes256_cntr_ccm_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1749 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes256_ecb_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 6912 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes_cfb_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 8869 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/aes_ecb_quic_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 57745 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/chacha20_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 4672 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc16_x25_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 14779 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc32_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 9613 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc32_fp_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 7138 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc32_iuup_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 7102 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc32_lte_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 13991 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc32_refl_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 4558 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc32_sctp_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 7119 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/crc32_wimax_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 5888 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/ethernet_fcs_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 15231 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/kasumi_avx.c 00:46:58.937 -rw-r--r-- vagrant/vagrant 6837 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_cbc_enc_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 5489 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_cbc_enc_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 6523 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_cbcs_1_9_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 6282 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_cbcs_1_9_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 20341 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_ccm_auth_submit_flush_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 16998 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_cmac_submit_flush_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 8028 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_xcbc_flush_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 7890 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes128_xcbc_submit_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1728 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes192_cbc_enc_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1731 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes192_cbc_enc_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1728 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes256_cbc_enc_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1731 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes256_cbc_enc_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1855 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes256_ccm_auth_submit_flush_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1809 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_aes256_cmac_submit_flush_x8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 3319 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_avx.c 00:46:58.937 -rw-r--r-- vagrant/vagrant 21250 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_avx_t1.c 00:46:58.937 -rw-r--r-- vagrant/vagrant 9960 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_md5_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 11983 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_md5_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 9559 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha1_flush_avx.asm 00:46:58.937 -rw-r--r-- 
vagrant/vagrant 12488 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha1_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1688 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha224_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1690 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha224_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 11207 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha256_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 13355 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha256_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1703 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha384_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1705 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha384_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 10355 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha512_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 12725 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_hmac_sha512_submit_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 35474 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/mb_mgr_zuc_submit_flush_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 28167 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/md5_x4x2_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 2021 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/memcpy_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 46071 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/pon_by8_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 10606 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha1_one_block_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 13382 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha1_x4_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1794 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha224_one_block_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 12176 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha256_mult_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 15190 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha256_one_block_avx.asm 00:46:58.937 -rw-r--r-- vagrant/vagrant 1794 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha384_one_block_avx.asm 00:46:58.938 -rw-r--r-- vagrant/vagrant 12912 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha512_one_block_avx.asm 00:46:58.938 -rw-r--r-- vagrant/vagrant 11689 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha512_x2_avx.asm 00:46:58.938 -rw-r--r-- vagrant/vagrant 4836 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha_avx.c 00:46:58.938 -rw-r--r-- vagrant/vagrant 5237 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/sha_mb_avx.c 00:46:58.938 -rw-r--r-- vagrant/vagrant 2494 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/snow3g_avx.c 00:46:58.938 -rw-r--r-- vagrant/vagrant 9701 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/snow3g_uia2_by4_avx.asm 00:46:58.938 -rw-r--r-- vagrant/vagrant 16505 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/snow_v_avx.asm 00:46:58.938 -rw-r--r-- vagrant/vagrant 44382 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/zuc_top_avx.c 00:46:58.938 -rw-r--r-- 
vagrant/vagrant 67478 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t1/zuc_x4_avx.asm 00:46:58.938 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t2/ 00:46:58.938 -rw-r--r-- vagrant/vagrant 21624 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/avx_t2/mb_mgr_avx_t2.c 00:46:58.938 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/cmake/ 00:46:58.938 -rw-r--r-- vagrant/vagrant 2328 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/cmake/uninstall.cmake.in 00:46:58.938 -rw-r--r-- vagrant/vagrant 4959 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/cmake/unix.cmake 00:46:58.938 -rw-r--r-- vagrant/vagrant 4718 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/cmake/windows-mingw.cmake 00:46:58.938 -rw-r--r-- vagrant/vagrant 4672 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/cmake/windows.cmake 00:46:58.938 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/ 00:46:58.938 -rw-r--r-- vagrant/vagrant 11379 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_cbc_dec_by8_avx.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 11291 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_cbc_dec_by8_sse.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 10847 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_cbc_enc_x8_avx.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 11383 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_cbc_enc_x8_sse.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 8666 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_cbcs_dec_by8_avx.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 20390 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_cntr_by16_vaes_avx2.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 78247 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_cntr_by16_vaes_avx512.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 74131 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aes_common.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 4994 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aesni_emu.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 7911 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/aesni_emu.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 5658 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_avx2_type1.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 3668 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_avx2_type2.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 1977 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_avx2_type3.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 7333 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_avx512_type1.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 8261 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_avx512_type2.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 10985 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_avx_type1.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 1919 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_avx_type2.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 9019 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_noaesni.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 11674 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_sse_type1.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 3710 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_sse_type2.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 4889 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_sse_type3.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 5516 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/arch_x86_64.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 1782 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/cet.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 9540 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/chacha20_poly1305.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 2820 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/chacha_poly_defines.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 5963 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/clear_regs.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 2293 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/clear_regs_mem.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 9937 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/const.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 8486 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/constant_lookup.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 9956 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/constant_lookup.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 3487 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/constants.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 3095 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/constants.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 2431 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/cpu_feature.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 2662 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/crc32.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 2178 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/crc32_const.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 2682 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/crc32_refl.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 1951 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/crc32_refl_const.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 6159 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/datastruct.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 10448 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/dbgprint.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 4607 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/des.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 4716 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/des_utils.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 13708 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/docsis_common.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 2358 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/error.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 5604 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/error.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 54661 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/gcm.h 00:46:58.938 -rw-r--r-- vagrant/vagrant 14816 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/gcm_api_sse.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 16365 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/gcm_api_vaes_avx512.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 1800 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/gcm_avx512.inc 00:46:58.938 -rw-r--r-- vagrant/vagrant 1799 2024-06-07 
12:49 spdk-test_gen_spec/intel-ipsec-mb/lib/include/gcm_avx_gen4.inc
00:46:58.938 (recursive listing condensed to directory summaries; per-file permissions, sizes and dates omitted, all vagrant/vagrant 2024-06-07 12:49)
00:46:58.938 spdk-test_gen_spec/intel-ipsec-mb/lib/include/ (continued): GCM assembly includes and key layouts (gcm_common.inc, gcm_common_avx2_avx512.inc, gcm_defines.inc, gcm_gmac_api_{sse,vaes_avx512}.inc, gcm_keys_{avx2_avx512,sse_avx,vaes_avx2,vaes_avx512}.inc, gcm_sgl_api_{sse,vaes_avx512}.inc, gcm_sse.inc, gcm_vaes_avx2.inc, gcm_vaes_avx512.inc); job and multi-buffer manager headers (imb_job.inc, ipsec_ooo_mgr.h, job_api_docsis.h, job_api_gcm.h, job_api_kasumi.h, job_api_snowv.h, mb_mgr_burst.h, mb_mgr_burst_async.h, mb_mgr_code.h, mb_mgr_datastruct.inc, mb_mgr_job_api.h, mb_mgr_job_check.h); cipher and hash internals (kasumi_interface.h, kasumi_internal.h, sha_generic.h, sha_mb_mgr.h, sm3.h, snow3g.h, snow3g_common.h, snow3g_submit.h, snow3g_tables.h, snow3g_uea2_by16_vaes_avx512.inc, snow3g_uea2_by4_sse.inc, wireless_common.h, zuc_internal.h, zuc_sbox.inc); utility includes (memcpy.h, memcpy.inc, noaesni.h, ooo_mgr_reset.h, os.inc, reg_sizes.inc, save_xmms.h, transpose_{sse,avx2,avx512}.inc)
00:46:58.939 spdk-test_gen_spec/intel-ipsec-mb/lib/no-aesni/: AES-NI-free SSE fallbacks: aes{128,192,256} CBC, CBC-MAC, CBCS, CTR, CTR-CCM and XCBC-MAC .asm variants, aes_cfb_sse_no_aesni.asm, aes_ecb_by4_sse_no_aesni.asm, aesni_emu.c, crc16_x25 and crc32 (by8, fp, iuup, lte, refl_by8, sctp, wimax) .asm, ethernet_fcs_sse_no_aesni.asm, gcm{128,192,256} api/gmac_api/sgl_api .asm, mb_mgr_aes{128,192,256} cbc-enc/cbcs/ccm-auth/cmac/xcbc submit and flush .asm, mb_mgr_sse_no_aesni.c, mb_mgr_zuc_submit_flush_sse_no_aesni.asm, pon_sse_no_aesni.asm, sm4_sse_no_aesni.asm, snow3g_sse_no_aesni.c, snow3g_uia2_sse_no_aesni.asm, snow_v_sse_no_aesni.asm, zuc_sse_no_aesni.asm, zuc_top_sse_no_aesni.c
00:46:58.939 spdk-test_gen_spec/intel-ipsec-mb/lib/avx2_t1/: README, aes{128,192,256}_gcm_by8_avx2.asm, chacha20_avx2.asm, mb_mgr_avx2.c, mb_mgr_avx2_t1.c, mb_mgr HMAC MD5/SHA1/SHA224/SHA256/SHA384/SHA512 flush and submit .asm, mb_mgr_zuc_submit_flush_avx2.asm, md5_x8x2_avx2.asm, sha1_x8_avx2.asm, sha256_oct_avx2.asm, sha512_x4_avx2.asm, sha_avx2.c, sha_mb_avx2.c, snow3g_avx2.c, zuc_top_avx2.c, zuc_x8_avx2.asm
00:46:58.940 spdk-test_gen_spec/intel-ipsec-mb/lib/sse_t1/: README, aes{128,192,256} CBC, CBC-MAC, CBCS, CTR, CTR-CCM, ECB and XCBC-MAC .asm, aes_cfb_sse.asm, aes_ecb_quic_x8_sse.asm, chacha20_sse.asm, crc16_x25_sse.asm and crc32 (by8, fp, iuup, lte, refl_by8, sctp, wimax) .asm, ethernet_fcs_sse.asm, gcm{128,192,256} by8 api/gmac_api/sgl_api .asm, kasumi_sse.c, mb_mgr AES/HMAC/SNOW3G-UEA2/SNOW3G-UIA2/ZUC submit and flush .asm, mb_mgr_sse.c, mb_mgr_sse_t1.c, md5_x4x2_sse.asm, memcpy_sse.asm, pon_by8_sse.asm, SHA1/SHA224/SHA256/SHA384/SHA512 one-block and multi-lane .asm, sha_mb_sse.c, sha_sse.c, sm4_sse.asm, snow3g_sse.c, snow3g_uia2_by4_sse.asm, snow_v_sse.asm, zuc_top_sse.c, zuc_x4_sse.asm
00:46:59.206 spdk-test_gen_spec/intel-ipsec-mb/perf/: CMakeLists.txt, Makefile, README.md, ipsec_diff_tool.py, ipsec_perf.c, ipsec_perf_tool.py, misc.asm, misc.h, msr.c, msr.h, win_x64.mak; perf/cmake/: unix.cmake, windows-mingw.cmake, windows.cmake
00:46:59.206 spdk-test_gen_spec/intel-ipsec-mb/rpm/: intel-ipsec-mb.spec; rpm/SUSE/: intel-ipsec-mb.spec; rpm/patches/: 0001-Fix-for-executable-stack-in-v0.54-release.patch, 0001-Fix-for-perf-scaling-on-release-1.3.patch
00:46:59.206 spdk-test_gen_spec/intel-ipsec-mb/test/include/: aead_test.h, cipher_test.h, mac_test.h, utils.h
00:46:59.206 spdk-test_gen_spec/intel-ipsec-mb/test/kat-app/: CMakeLists.txt, Makefile, README.md, win_x64.mak, main.c, do_test.h, test_api.py, known-answer tests and vectors (*_test.c, *_test.json.c) for AES CBC/CBCS/CFB/ECB/CTR/CCM/CMAC/XCBC, ChaCha20 and ChaCha20-Poly1305, CRC, DES, GCM/GHASH/GMAC, HEC, HMAC (MD5, SHA1, SHA224, SHA256, SHA384, SHA512, SM3), KASUMI F8/F9, Poly1305, PON, QUIC (ChaCha20, ECB), SHA, SM3, SM4 CBC/ECB, SNOW3G F8/F9, SNOW-V (incl. AEAD), ZUC EEA3/EIA3 (128- and 256-bit), plus api_test.c, chained_test.c, clear_mem_test.c, customop_test.c/.h, direct_api_test.c, direct_api_param_test.c and its .c_template, gcm_ctr_vectors_test.h, null_test.c
00:46:59.207 spdk-test_gen_spec/intel-ipsec-mb/test/wycheproof-app/: CMakeLists.txt, Makefile, README.md, win_x64.mak, wycheproof.c, Wycheproof vectors (*.json.c) for AES-CCM, AES-CMAC, AES-GCM, ChaCha20-Poly1305, GMAC, HMAC SHA1/SHA224/SHA256/SHA384/SHA512
00:46:59.207 spdk-test_gen_spec/intel-ipsec-mb/test/xvalid-app/: CMakeLists.txt, Makefile, README.md, win_x64.mak, ipsec_xvalid.c, misc.asm, misc.h
00:46:59.207 spdk-test_gen_spec/intel-ipsec-mb/test/: CMakeLists.txt, Makefile, README.md, win_x64.mak
00:46:59.207 spdk-test_gen_spec/intel-ipsec-mb/test/acvp-app/: CMakeLists.txt, Makefile, README.md, acvp_app_main.c
00:46:59.207 spdk-test_gen_spec/intel-ipsec-mb/test/cmake/: unix.cmake, windows-mingw.cmake, windows.cmake
00:46:59.207 spdk-test_gen_spec/intel-ipsec-mb/test/common/: common.mk, utils.c, win_x64_common.mk
00:46:59.207 spdk-test_gen_spec/intel-ipsec-mb/test/fuzz-app/: CMakeLists.txt, Makefile, README.md, direct_api_fuzz_test.c, job_api_fuzz_test.c
00:46:59.208 spdk-test_gen_spec/intel-ipsec-mb/.github/workflows/: freebsd.yml, linux.yml, windows.yml; .github/PULL_REQUEST_TEMPLATE.md
00:46:59.208 spdk-test_gen_spec/intel-ipsec-mb/cmake/: clang-format.cmake, utils.cmake
00:46:59.208 spdk-test_gen_spec/intel-ipsec-mb/ (top level): .clang-format, .git, .gitignore, CMakeLists.txt, CONTRIBUTING, INSTALL.md, LICENSE, Makefile, README.md, ReleaseNotes.txt, SECURITY.md, mkdep.bat, win_x64.mak
00:46:59.208 spdk-test_gen_spec/intel-ipsec-mb/docs/search/: Doxygen search assets (search.css, search.js, searchdata.js, close.svg, mag.svg, mag_d.svg, mag_sel.svg, mag_seld.svg, all_*.js, classes_*.js, defines_*.js, enums_0.js, enumvalues_0.js, files_*.js, functions_*.js, pages_0.js, typedefs_*.js, variables_*.js)
00:46:59.209 spdk-test_gen_spec/intel-ipsec-mb/docs/: Doxygen HTML pages (functions_s/u/v/x/z.html, functions_vars.html/.js, functions_vars_a.html through functions_vars_z.html, globals.html, globals_a.html through globals_d.html, globals_defs*.html/.js, globals_dup.js, globals_enum.html, globals_eval.html, globals_f.html, globals_func.html)
00:46:59.
-rw-r--r-- vagrant/vagrant 6183 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_g.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 5165 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_h.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 64042 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_i.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 6906 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_k.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 5204 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_m.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 5325 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_q.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 8190 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_s.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 14506 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_type.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 4774 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_x.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 5616 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/globals_z.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 59322 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/index.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 1021877 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/intel-ipsec-mb_8h.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 65499 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/intel-ipsec-mb_8h.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 769178 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/intel-ipsec-mb_8h_source.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 176630 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/jquery.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 153 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/nav_f.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 169 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/nav_fd.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 95 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/nav_g.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 98 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/nav_h.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 114 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/nav_hd.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 2183 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtree.css 00:46:59.209 -rw-r--r-- vagrant/vagrant 15897 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtree.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 5630 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtreedata.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 14170 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtreeindex0.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 20027 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtreeindex1.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 20155 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtreeindex2.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 17257 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtreeindex3.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 234 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/navtreeindex4.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 123 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/open.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 5685 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/resize.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 314 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/docs/splitbar.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 282 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/splitbard.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 83333 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structIMB__JOB.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 5860 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structIMB__JOB.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 167654 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structIMB__MGR.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 15018 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structIMB__MGR.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 7871 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structIMB__SGL__IOV.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 277 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structIMB__SGL__IOV.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 14757 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structchacha20__poly1305__context__data.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 1143 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structchacha20__poly1305__context__data.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 12291 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structgcm__context__data.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 719 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structgcm__context__data.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 15352 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structgcm__key__data.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 675 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structgcm__key__data.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 6885 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structimb__uint128__t.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 202 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structimb__uint128__t.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 7625 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structkasumi__key__sched__s.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 222 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structkasumi__key__sched__s.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 6287 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structsnow3g__key__schedule__s.html 00:46:59.209 -rw-r--r-- vagrant/vagrant 132 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/structsnow3g__key__schedule__s.js 00:46:59.209 -rw-r--r-- vagrant/vagrant 853 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/sync_off.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 845 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/sync_on.png 00:46:59.209 -rw-r--r-- vagrant/vagrant 142 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tab_a.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 135 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tab_ad.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 169 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tab_b.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 173 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tab_bd.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 177 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tab_h.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 180 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tab_hd.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 184 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tab_s.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 188 2024-06-07 12:49 
spdk-test_gen_spec/intel-ipsec-mb/docs/tab_sd.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 11055 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/tabs.css 00:46:59.210 -rw-r--r-- vagrant/vagrant 4659 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/README_8md.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 7353 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/annotated.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 799 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/annotated_dup.js 00:46:59.210 -rw-r--r-- vagrant/vagrant 676 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/bc_s.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 635 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/bc_sd.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 6085 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/classes.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 132 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/closed.png 00:46:59.210 -rw-r--r-- vagrant/vagrant 1503 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/doc.svg 00:46:59.210 -rw-r--r-- vagrant/vagrant 1503 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/docd.svg 00:46:59.210 -rw-r--r-- vagrant/vagrant 44883 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/doxygen.css 00:46:59.210 -rw-r--r-- vagrant/vagrant 15461 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/doxygen.svg 00:46:59.210 -rw-r--r-- vagrant/vagrant 4545 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/dynsections.js 00:46:59.210 -rw-r--r-- vagrant/vagrant 5001 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/files.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 94 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/files_dup.js 00:46:59.210 -rw-r--r-- vagrant/vagrant 1996 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/folderclosed.svg 00:46:59.210 -rw-r--r-- vagrant/vagrant 1996 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/folderclosedd.svg 00:46:59.210 -rw-r--r-- vagrant/vagrant 3269 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/folderopen.svg 00:46:59.210 -rw-r--r-- vagrant/vagrant 3214 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/folderopend.svg 00:46:59.210 -rw-r--r-- vagrant/vagrant 6055 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 7284 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_a.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 8652 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_c.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 6234 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_d.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 918 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_dup.js 00:46:59.210 -rw-r--r-- vagrant/vagrant 5728 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_e.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 6069 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_f.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 10514 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_g.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 6544 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_h.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 5400 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_i.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 4744 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_j.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 5769 
2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_k.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 5214 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_l.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 5425 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_m.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 4988 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_n.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 4887 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_o.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 5379 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_p.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 4750 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_q.html 00:46:59.210 -rw-r--r-- vagrant/vagrant 5180 2024-06-07 12:49 spdk-test_gen_spec/intel-ipsec-mb/docs/functions_r.html 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/ 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/vfu_device/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 412 2024-06-07 12:49 spdk-test_gen_spec/module/vfu_device/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 48941 2024-06-07 12:49 spdk-test_gen_spec/module/vfu_device/vfu_virtio.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 18025 2024-06-07 12:49 spdk-test_gen_spec/module/vfu_device/vfu_virtio_blk.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 12165 2024-06-07 12:49 spdk-test_gen_spec/module/vfu_device/vfu_virtio_internal.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 8332 2024-06-07 12:49 spdk-test_gen_spec/module/vfu_device/vfu_virtio_rpc.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 30070 2024-06-07 12:49 spdk-test_gen_spec/module/vfu_device/vfu_virtio_scsi.c 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/ 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_compressdev/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 440 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_compressdev/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 23990 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_compressdev/accel_dpdk_compressdev.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 366 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_compressdev/accel_dpdk_compressdev.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 1551 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 599 2024-06-07 12:49 spdk-test_gen_spec/module/accel/Makefile 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_cryptodev/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 484 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_cryptodev/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 53036 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 464 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 2376 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.c 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dsa/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 377 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dsa/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 14988 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dsa/accel_dsa.c 
00:46:59.210 -rw-r--r-- vagrant/vagrant 292 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dsa/accel_dsa.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 1594 2024-06-07 12:49 spdk-test_gen_spec/module/accel/dsa/accel_dsa_rpc.c 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/error/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 376 2024-06-07 12:49 spdk-test_gen_spec/module/accel/error/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 6722 2024-06-07 12:49 spdk-test_gen_spec/module/accel/error/accel_error.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 674 2024-06-07 12:49 spdk-test_gen_spec/module/accel/error/accel_error.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 2465 2024-06-07 12:49 spdk-test_gen_spec/module/accel/error/accel_error_rpc.c 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/iaa/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 377 2024-06-07 12:49 spdk-test_gen_spec/module/accel/iaa/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 10067 2024-06-07 12:49 spdk-test_gen_spec/module/accel/iaa/accel_iaa.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 280 2024-06-07 12:49 spdk-test_gen_spec/module/accel/iaa/accel_iaa.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 979 2024-06-07 12:49 spdk-test_gen_spec/module/accel/iaa/accel_iaa_rpc.c 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/ioat/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 380 2024-06-07 12:49 spdk-test_gen_spec/module/accel/ioat/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 7683 2024-06-07 12:49 spdk-test_gen_spec/module/accel/ioat/accel_ioat.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 315 2024-06-07 12:49 spdk-test_gen_spec/module/accel/ioat/accel_ioat.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 795 2024-06-07 12:49 spdk-test_gen_spec/module/accel/ioat/accel_ioat_rpc.c 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/accel/mlx5/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 429 2024-06-07 12:49 spdk-test_gen_spec/module/accel/mlx5/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 31743 2024-06-07 12:49 spdk-test_gen_spec/module/accel/mlx5/accel_mlx5.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 450 2024-06-07 12:49 spdk-test_gen_spec/module/accel/mlx5/accel_mlx5.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 1314 2024-06-07 12:49 spdk-test_gen_spec/module/accel/mlx5/accel_mlx5_rpc.c 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ 00:46:59.210 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/crypto/ 00:46:59.210 -rw-r--r-- vagrant/vagrant 410 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/crypto/Makefile 00:46:59.210 -rw-r--r-- vagrant/vagrant 30760 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/crypto/vbdev_crypto.c 00:46:59.210 -rw-r--r-- vagrant/vagrant 1558 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/crypto/vbdev_crypto.h 00:46:59.210 -rw-r--r-- vagrant/vagrant 7359 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/crypto/vbdev_crypto_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 577 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 19614 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/bdev_mdns_client.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 237288 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/bdev_nvme.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 13700 2024-06-07 12:49 
spdk-test_gen_spec/module/bdev/nvme/bdev_nvme.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 3188 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/bdev_nvme_cuse_rpc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 87481 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/bdev_nvme_rpc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 11344 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/nvme_rpc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 15719 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/vbdev_opal.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 859 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/vbdev_opal.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 13346 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/nvme/vbdev_opal_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/daos/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/daos/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 23086 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/daos/bdev_daos.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 772 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/daos/bdev_daos.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 4935 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/daos/bdev_daos_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 538 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 8912 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/ctx.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 964 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/ctx.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 1738 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/data.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 679 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/data.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 3455 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/stats.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 633 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/stats.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 2771 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/utils.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 1627 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/utils.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 43222 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/vbdev_ocf.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 5957 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/vbdev_ocf.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 17961 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/vbdev_ocf_rpc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 9329 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/volume.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 629 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ocf/volume.h 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/delay/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 422 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/delay/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 29519 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/delay/vbdev_delay.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 1792 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/delay/vbdev_delay.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 5955 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/delay/vbdev_delay_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/passthru/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 431 2024-06-07 12:49 
spdk-test_gen_spec/module/bdev/passthru/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 25440 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/passthru/vbdev_passthru.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 976 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/passthru/vbdev_passthru.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 3518 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/passthru/vbdev_passthru_rpc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 762 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/Makefile 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/error/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 382 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/error/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 14357 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/error/vbdev_error.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 1618 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/error/vbdev_error.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 5900 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/error/vbdev_error_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 509 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 105736 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/bdev_raid.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 18171 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/bdev_raid.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 17540 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/bdev_raid_rpc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 11084 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/bdev_raid_sb.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 9420 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/concat.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 13964 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/raid0.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 14246 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/raid1.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 35073 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/raid/raid5f.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ftl/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 414 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ftl/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 16797 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ftl/bdev_ftl.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 2308 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ftl/bdev_ftl.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 11132 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/ftl/bdev_ftl_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/rbd/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 374 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/rbd/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 34792 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/rbd/bdev_rbd.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 1895 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/rbd/bdev_rbd.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 9920 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/rbd/bdev_rbd_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/gpt/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 366 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/gpt/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 7627 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/gpt/gpt.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 1812 
2024-06-07 12:49 spdk-test_gen_spec/module/bdev/gpt/gpt.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 16029 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/gpt/vbdev_gpt.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/split/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 382 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/split/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 12421 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/split/vbdev_split.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 1392 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/split/vbdev_split.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 3634 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/split/vbdev_split_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/iscsi/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 643 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/iscsi/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 28991 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/iscsi/bdev_iscsi.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 1672 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/iscsi/bdev_iscsi.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 4571 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/iscsi/bdev_iscsi_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/uring/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 518 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/uring/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 23280 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/uring/bdev_uring.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 704 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/uring/bdev_uring.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 4541 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/uring/bdev_uring_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/lvol/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 379 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/lvol/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 52087 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/lvol/vbdev_lvol.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 5044 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/lvol/vbdev_lvol.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 47971 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/lvol/vbdev_lvol_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/virtio/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 406 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/virtio/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 7444 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/virtio/bdev_virtio.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 20254 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/virtio/bdev_virtio_blk.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 8985 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/virtio/bdev_virtio_rpc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 49322 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/virtio/bdev_virtio_scsi.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/aio/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 423 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/aio/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 26742 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/aio/bdev_aio.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 559 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/aio/bdev_aio.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 4623 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/aio/bdev_aio_rpc.c 
00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/malloc/ 00:46:59.211 -rw-r--r-- vagrant/vagrant 383 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/malloc/Makefile 00:46:59.211 -rw-r--r-- vagrant/vagrant 25493 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/malloc/bdev_malloc.c 00:46:59.211 -rw-r--r-- vagrant/vagrant 862 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/malloc/bdev_malloc.h 00:46:59.211 -rw-r--r-- vagrant/vagrant 3706 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/malloc/bdev_malloc_rpc.c 00:46:59.211 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/xnvme/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 430 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/xnvme/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 11902 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/xnvme/bdev_xnvme.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 531 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/xnvme/bdev_xnvme.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 3588 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/xnvme/bdev_xnvme_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/compress/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 455 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/compress/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 36168 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/compress/vbdev_compress.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 1900 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/compress/vbdev_compress.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 5498 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/compress/vbdev_compress_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/null/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 377 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/null/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 12663 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/null/bdev_null.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 1214 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/null/bdev_null.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 6307 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/null/bdev_null_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/zone_block/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/zone_block/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 25665 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/zone_block/vbdev_zone_block.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 530 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/zone_block/vbdev_zone_block.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 3529 2024-06-07 12:49 spdk-test_gen_spec/module/bdev/zone_block/vbdev_zone_block_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/blob/ 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/blob/bdev/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 367 2024-06-07 12:49 spdk-test_gen_spec/module/blob/bdev/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 16270 2024-06-07 12:49 spdk-test_gen_spec/module/blob/bdev/blob_bdev.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 155 2024-06-07 12:49 spdk-test_gen_spec/module/blob/bdev/spdk_blob_bdev.map 00:46:59.212 -rw-r--r-- vagrant/vagrant 318 2024-06-07 12:49 spdk-test_gen_spec/module/blob/Makefile 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/ 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 
2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/bdev/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 486 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/bdev/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 7043 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/bdev/blobfs_bdev.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 7915 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/bdev/blobfs_bdev_rpc.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 6945 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/bdev/blobfs_fuse.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 642 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/bdev/blobfs_fuse.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 103 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/bdev/spdk_blobfs_bdev.map 00:46:59.212 -rw-r--r-- vagrant/vagrant 318 2024-06-07 12:49 spdk-test_gen_spec/module/blobfs/Makefile 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/env_dpdk/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 364 2024-06-07 12:49 spdk-test_gen_spec/module/env_dpdk/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 1235 2024-06-07 12:49 spdk-test_gen_spec/module/env_dpdk/env_dpdk_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/ 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/ 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/scheduler/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 370 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/scheduler/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 1625 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/scheduler/scheduler.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/scsi/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/scsi/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 618 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/scsi/scsi.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/sock/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 406 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/sock/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 1252 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/sock/sock.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 970 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/Makefile 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/accel/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 362 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/accel/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 820 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/accel/accel.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/ublk/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/ublk/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 920 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/ublk/ublk.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/bdev/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/bdev/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 1159 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/bdev/bdev.c 
00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vfu_tgt/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 366 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vfu_tgt/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 775 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vfu_tgt/vfu_tgt.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/iobuf/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 374 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/iobuf/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 1553 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/iobuf/iobuf.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 2990 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/iobuf/iobuf_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vhost_blk/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 370 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vhost_blk/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 869 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vhost_blk/vhost_blk.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/iscsi/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 395 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/iscsi/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 993 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/iscsi/iscsi.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vhost_scsi/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 372 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vhost_scsi/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 884 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vhost_scsi/vhost_scsi.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/keyring/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 360 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/keyring/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 829 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/keyring/keyring.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vmd/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 368 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vmd/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 236 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vmd/event_vmd.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 1808 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vmd/vmd.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 2536 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/vmd/vmd_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nbd/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 358 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nbd/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 855 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nbd/nbd.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nvmf/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 375 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nvmf/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 708 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nvmf/event_nvmf.h 00:46:59.212 -rw-r--r-- 
vagrant/vagrant 7842 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nvmf/nvmf_rpc.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 15674 2024-06-07 12:49 spdk-test_gen_spec/module/event/subsystems/nvmf/nvmf_tgt.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 324 2024-06-07 12:49 spdk-test_gen_spec/module/event/Makefile 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 352 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/Makefile 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/file/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 370 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/file/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 3494 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/file/keyring.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 318 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/file/keyring_file.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 2543 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/file/keyring_rpc.c 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/linux/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 400 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/linux/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 2930 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/linux/keyring.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 397 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/linux/keyring_linux.h 00:46:59.212 -rw-r--r-- vagrant/vagrant 1150 2024-06-07 12:49 spdk-test_gen_spec/module/keyring/linux/keyring_rpc.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 2558 2024-06-07 12:49 spdk-test_gen_spec/module/Makefile 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/ 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dpdk_governor/ 00:46:59.212 -rw-r--r-- vagrant/vagrant 404 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dpdk_governor/Makefile 00:46:59.212 -rw-r--r-- vagrant/vagrant 3110 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dpdk_governor/dpdk_governor.c 00:46:59.212 -rw-r--r-- vagrant/vagrant 2853 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dpdk_governor/dpdk_governor.d 00:46:59.212 -rw-r--r-- vagrant/vagrant 26360 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dpdk_governor/dpdk_governor.o 00:46:59.212 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dynamic/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 377 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dynamic/Makefile 00:46:59.213 -rw-r--r-- vagrant/vagrant 11097 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/dynamic/scheduler_dynamic.c 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/gscheduler/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 373 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/gscheduler/Makefile 00:46:59.213 -rw-r--r-- vagrant/vagrant 1957 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/gscheduler/gscheduler.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1943 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/gscheduler/gscheduler.d 00:46:59.213 -rw-r--r-- vagrant/vagrant 22088 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/gscheduler/gscheduler.o 00:46:59.213 -rw-r--r-- vagrant/vagrant 613 2024-06-07 12:49 spdk-test_gen_spec/module/scheduler/Makefile 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 
spdk-test_gen_spec/module/sock/ 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/sock/posix/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 381 2024-06-07 12:49 spdk-test_gen_spec/module/sock/posix/Makefile 00:46:59.213 -rw-r--r-- vagrant/vagrant 55908 2024-06-07 12:49 spdk-test_gen_spec/module/sock/posix/posix.c 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/module/sock/uring/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 358 2024-06-07 12:49 spdk-test_gen_spec/module/sock/uring/Makefile 00:46:59.213 -rw-r--r-- vagrant/vagrant 51177 2024-06-07 12:49 spdk-test_gen_spec/module/sock/uring/uring.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 375 2024-06-07 12:49 spdk-test_gen_spec/module/sock/Makefile 00:46:59.213 -rw-r--r-- vagrant/vagrant 1744 2024-06-07 12:49 spdk-test_gen_spec/module/sock/sock_kernel.h 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/xnvmebuild/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 637 2024-06-07 12:49 spdk-test_gen_spec/xnvmebuild/Makefile 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.github/ 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/.github/ISSUE_TEMPLATE/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 871 2024-06-07 12:49 spdk-test_gen_spec/.github/ISSUE_TEMPLATE/bug_report.md 00:46:59.213 -rw-r--r-- vagrant/vagrant 343 2024-06-07 12:49 spdk-test_gen_spec/.github/ISSUE_TEMPLATE/config.yml 00:46:59.213 -rw-r--r-- vagrant/vagrant 772 2024-06-07 12:49 spdk-test_gen_spec/.github/ISSUE_TEMPLATE/intermittent_failure.md 00:46:59.213 -rw-r--r-- vagrant/vagrant 499 2024-06-07 12:49 spdk-test_gen_spec/.github/dependabot.yml 00:46:59.213 -rw-r--r-- vagrant/vagrant 319 2024-06-07 12:49 spdk-test_gen_spec/.github/mistaken-pull-closer.yml 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ipsecbuild/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 579 2024-06-07 12:49 spdk-test_gen_spec/ipsecbuild/Makefile 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/ 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 13513 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_cache.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 4348 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_cache_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 9999 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_composite_volume.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 289 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_composite_volume_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 12585 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_core.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 2589 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_core_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 5740 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_ctx.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 4476 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_ctx_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 1447 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_def_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 4272 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_io.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1889 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_io_class.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1603 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_io_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 1412 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_logger.c 00:46:59.213 
-rw-r--r-- vagrant/vagrant 709 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_logger_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 28777 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_lru.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1705 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_lru.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 518 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_lru_structs.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 2497 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_metadata.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 324 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 2735 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_queue.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 903 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_queue_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 6137 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_request.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 9926 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_request.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 9446 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_seq_cutoff.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1231 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_seq_cutoff.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 4139 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_space.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 878 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_space.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 11088 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_stats.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 12108 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_stats_builder.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 6017 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_stats_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 7951 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_volume.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1696 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/ocf_volume_priv.h 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 23916 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/acp.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1351 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/acp.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 493 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/acp_structs.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 26734 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/alru.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1205 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/alru.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 694 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/alru_structs.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 2565 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/cleaning.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1254 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/cleaning.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 8212 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/cleaning_ops.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 563 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/cleaning_priv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 290 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/nop.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 309 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/nop.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 292 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/cleaning/nop_structs.h 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ 
00:46:59.213 -rw-r--r-- vagrant/vagrant 6657 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_cache_line_concurrency.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 5714 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_cache_line_concurrency.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 577 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_concurrency.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 662 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_concurrency.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 15299 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_metadata_concurrency.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 5963 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_metadata_concurrency.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 4439 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_mio_concurrency.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 613 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_mio_concurrency.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 6135 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_pio_concurrency.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 519 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/concurrency/ocf_pio_concurrency.h 00:46:59.213 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/ 00:46:59.213 -rw-r--r-- vagrant/vagrant 6474 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/cache_engine.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 1758 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/cache_engine.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 2730 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_bf.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 214 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_bf.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 17520 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_common.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 8111 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_common.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 1231 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_d2c.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 207 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_d2c.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 1276 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_debug.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 6744 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_discard.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 202 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_discard.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 5993 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_fast.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 256 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_fast.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 1660 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_inv.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_inv.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 1309 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_ops.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 235 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_ops.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 3963 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_pt.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 315 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_pt.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 6165 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_rd.c 00:46:59.213 -rw-r--r-- 
vagrant/vagrant 270 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_rd.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 1124 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wa.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 206 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wa.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 4700 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wb.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 252 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wb.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 5760 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wi.c 00:46:59.213 -rw-r--r-- vagrant/vagrant 206 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wi.h 00:46:59.213 -rw-r--r-- vagrant/vagrant 6464 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wo.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 205 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wo.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 5625 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wt.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 206 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_wt.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 4079 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_zero.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 221 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/engine/engine_zero.h 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 45322 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 6058 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 7082 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_bit.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 549 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_cache_line.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 591 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_cleaning_policy.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 329 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_cleaning_policy.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 5496 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_collision.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 2286 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_collision.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 250 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_common.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 2060 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_core.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 812 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_core.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 518 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_eviction_policy.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 288 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_eviction_policy.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 990 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_internal.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 14317 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_io.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 4442 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_io.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1274 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_misc.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 1158 2024-06-07 12:49 
spdk-test_gen_spec/ocf/src/metadata/metadata_misc.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1002 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_partition.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 676 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_partition.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 2325 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_partition_structs.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 4783 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_passive_update.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 392 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_passive_update.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 18392 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 9523 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 6116 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw_atomic.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw_atomic.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 14976 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw_dynamic.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 2360 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw_dynamic.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1818 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw_volatile.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 1611 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_raw_volatile.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 3924 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_segment.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 1103 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_segment.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1408 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_segment_id.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 10282 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_status.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1994 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_structs.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 20407 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_superblock.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 3586 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/metadata/metadata_superblock.h 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 100787 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_cache.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 10909 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_common.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 980 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_common.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 31693 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_core.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 2709 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_core_pool.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 329 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_core_pool_priv.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 342 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_core_priv.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 28092 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_flush.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 8386 2024-06-07 12:49 
spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_io_class.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 596 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/mngt/ocf_mngt_misc.c 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 1413 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/ops.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 4196 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/promotion.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 3066 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/promotion.h 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/nhit/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 5973 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/nhit/nhit.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 783 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/nhit/nhit.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 11617 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/nhit/nhit_hash.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 677 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/nhit/nhit_hash.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 343 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/promotion/nhit/nhit_structs.h 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 19660 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_alock.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 2800 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_alock.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 4832 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_async_lock.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 1325 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_async_lock.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 4891 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_cache_line.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 10251 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_cache_line.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 26631 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_cleaner.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 4256 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_cleaner.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 5606 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_generator.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 482 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_generator.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 8425 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_io.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 2649 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_io.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1724 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_io_allocator.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1466 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_list.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 5447 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_list.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 3563 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_parallelize.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 942 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_parallelize.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 3272 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_pipeline.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 3946 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_pipeline.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 10444 2024-06-07 12:49 
spdk-test_gen_spec/ocf/src/utils/utils_rbtree.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 1134 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_rbtree.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 2161 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_realloc.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 2071 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_realloc.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1340 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_refcnt.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 1454 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_refcnt.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 821 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_request.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 353 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_request.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 1004 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_stats.h 00:46:59.214 -rw-r--r-- vagrant/vagrant 5057 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_user_part.c 00:46:59.214 -rw-r--r-- vagrant/vagrant 4607 2024-06-07 12:49 spdk-test_gen_spec/ocf/src/utils/utils_user_part.h 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/ 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 89 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/.gitignore 00:46:59.214 -rw-r--r-- vagrant/vagrant 1205 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/Makefile 00:46:59.214 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/__init__.py 00:46:59.214 -rw-r--r-- vagrant/vagrant 142 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pytest.ini 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/config/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 97 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/config/random.cfg 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/ 00:46:59.214 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/__init__.py 00:46:59.214 -rw-r--r-- vagrant/vagrant 1193 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/helpers.py 00:46:59.214 -rw-r--r-- vagrant/vagrant 706 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/ocf.py 00:46:59.214 -rw-r--r-- vagrant/vagrant 7661 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/rio.py 00:46:59.214 -rw-r--r-- vagrant/vagrant 6144 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/utils.py 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/ 00:46:59.214 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/helpers/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 1617 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/helpers/metadata_helpers.c 00:46:59.215 -rw-r--r-- vagrant/vagrant 570 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/helpers/metadata_helpers.h 00:46:59.215 -rw-r--r-- vagrant/vagrant 219 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/helpers/volume_type.c 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/wrappers/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 256 2024-06-07 12:49 
spdk-test_gen_spec/ocf/tests/functional/pyocf/c/wrappers/ocf_core_wrappers.c 00:46:59.215 -rw-r--r-- vagrant/vagrant 754 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/wrappers/ocf_io_wrappers.c 00:46:59.215 -rw-r--r-- vagrant/vagrant 802 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/wrappers/ocf_logger_wrappers.c 00:46:59.215 -rw-r--r-- vagrant/vagrant 243 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/wrappers/ocf_mngt_wrappers.c 00:46:59.215 -rw-r--r-- vagrant/vagrant 254 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/c/wrappers/ocf_volume_wrappers.c 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/ 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/stats/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/stats/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 1080 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/stats/cache.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 517 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/stats/core.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 2058 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/stats/shared.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 32397 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/cache.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 927 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/cleaner.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 6195 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/core.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 4205 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/ctx.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 2144 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/cvolume.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 5862 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/data.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 3275 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/io.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 863 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/ioclass.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 4400 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/logger.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 4287 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/queue.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 5000 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/shared.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 16205 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/volume.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 727 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/volume_cache.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 570 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/volume_core.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 3998 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/volume_exp_obj.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 2704 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/pyocf/types/volume_replicated.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 
12:49 spdk-test_gen_spec/ocf/tests/functional/tests/ 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/basic/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/basic/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 3515 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/basic/test_pyocf.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 2027 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/test_flush.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 3542 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/test_io_flags.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 2392 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/test_large_io.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 7446 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/test_pp.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 14765 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/test_read.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 4759 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/engine/test_seq_cutoff.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/eviction/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/eviction/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 7033 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/eviction/test_eviction.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/failover/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 2505 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/failover/test_standby_io.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 2005 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/conftest.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/flush/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/flush/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 3175 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/flush/test_flush_after_mngmt.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 11710 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/test_add_remove.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 3372 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/test_attach_cache.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 4678 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/test_change_params.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 23511 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/test_composite_volume.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 6495 2024-06-07 12:49 
spdk-test_gen_spec/ocf/tests/functional/tests/management/test_disable_cleaner.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 17892 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/test_failover.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 5558 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/test_metadata_volatile.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 23429 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/management/test_start_stop.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/security/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/security/__init__.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 1896 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/security/conftest.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 24364 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/security/test_management_fuzzy.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 4734 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/security/test_management_start_fuzzy.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 6699 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/security/test_negative_io.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 5687 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/security/test_secure_erase.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/surprise_shutdown/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 36283 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/surprise_shutdown/test_management_surprise_shutdown.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/utils/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 2752 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/tests/utils/random.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/utils/ 00:46:59.215 -rwxr-xr-x vagrant/vagrant 230 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/functional/utils/configure_random.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/ 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/framework/ 00:46:59.215 -rw-r--r-- vagrant/vagrant 13 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/framework/.gitignore 00:46:59.215 -rw-r--r-- vagrant/vagrant 586 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/framework/README 00:46:59.215 -rwxr-xr-x vagrant/vagrant 5797 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/framework/add_new_test_file.py 00:46:59.215 -rwxr-xr-x vagrant/vagrant 26268 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/framework/prepare_sources_for_testing.py 00:46:59.215 -rwxr-xr-x vagrant/vagrant 4029 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/framework/run_unit_tests.py 00:46:59.215 -rw-r--r-- vagrant/vagrant 2069 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/framework/tests_config.py 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ 00:46:59.215 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_refcnt.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 1289 2024-06-07 12:49 
spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_refcnt.c/utils_refcnt_dec.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 2275 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_refcnt.c/utils_refcnt_freeze.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 1057 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_refcnt.c/utils_refcnt_inc.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 1125 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_refcnt.c/utils_refcnt_init.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 2075 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_refcnt.c/utils_refcnt_register_zero_cb.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 2058 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_refcnt.c/utils_refcnt_unfreeze.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_generator.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 4356 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_generator.c/utils_generator_bisect.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_rbtree.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 5376 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/utils/utils_rbtree.c/utils_rbtree.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 102 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/.gitignore 00:46:59.216 -rwxr-xr-x vagrant/vagrant 1032 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/add_new_test_file.py 00:46:59.216 -rw-r--r-- vagrant/vagrant 93 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/header.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 186 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/print_desc.h 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/cleaning/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/cleaning/cleaning.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 3646 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/cleaning/cleaning.c/ocf_cleaner_run_test.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/concurrency/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/concurrency/ocf_cache_line_concurrency.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 13881 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/concurrency/ocf_cache_line_concurrency.c/ocf_cache_line_concurrency.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/concurrency/ocf_metadata_concurrency.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 3203 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/concurrency/ocf_metadata_concurrency.c/ocf_metadata_concurrency.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/engine/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/engine/engine_common.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 4351 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/engine/engine_common.c/prepare_clines_miss.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/metadata/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/metadata/metadata_collision.c/ 00:46:59.216 -rw-r--r-- 
vagrant/vagrant 1916 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/metadata/metadata_collision.c/metadata_collision.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/mngt/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/mngt/ocf_mngt_cache.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 5934 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/mngt/ocf_mngt_cache.c/_cache_mngt_set_cache_mode_test.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 4847 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/mngt/ocf_mngt_cache.c/ocf_mngt_cache_set_fallback_pt_error_threshold.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/mngt/ocf_mngt_io_class.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 6582 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/mngt/ocf_mngt_io_class.c/ocf_mngt_io_class.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ocf_env/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 181 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ocf_env/CMakeLists.txt 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ocf_lru.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 7627 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ocf_lru.c/lru.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 13783 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ocf_lru.c/lru_iter.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ocf_space.c/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 7713 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/unit/tests/ocf_space.c/ocf_space.c 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/build/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 832 2024-06-07 12:49 spdk-test_gen_spec/ocf/tests/build/Makefile 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/ISSUE_TEMPLATE/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 1038 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/ISSUE_TEMPLATE/bug.md 00:46:59.216 -rw-r--r-- vagrant/vagrant 683 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/ISSUE_TEMPLATE/enhancement.md 00:46:59.216 -rw-r--r-- vagrant/vagrant 670 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/ISSUE_TEMPLATE/question.md 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/workflows/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 910 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/workflows/pullrequest.yml 00:46:59.216 -rwxr-xr-x vagrant/vagrant 577 2024-06-07 12:49 spdk-test_gen_spec/ocf/.github/verify_header.sh 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/img/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 18524 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/img/deployment-1.png 00:46:59.216 -rw-r--r-- vagrant/vagrant 19038 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/img/deployment-2.png 00:46:59.216 -rw-r--r-- vagrant/vagrant 66489 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/img/io-path.png 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/requirements/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 6859 2024-06-07 12:49 
spdk-test_gen_spec/ocf/doc/requirements/composite_volume 00:46:59.216 -rw-r--r-- vagrant/vagrant 2682 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/requirements/disable_cleaner 00:46:59.216 -rw-r--r-- vagrant/vagrant 6 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/.gitignore 00:46:59.216 -rw-r--r-- vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/HOME.md 00:46:59.216 -rw-r--r-- vagrant/vagrant 11566 2024-06-07 12:49 spdk-test_gen_spec/ocf/doc/doxygen.cfg 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/posix/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 4067 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/posix/ocf_env.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 13263 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/posix/ocf_env.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 500 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/posix/ocf_env_headers.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 4535 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/posix/ocf_env_list.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 3029 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/posix/utils_mpool.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 2266 2024-06-07 12:49 spdk-test_gen_spec/ocf/env/posix/utils_mpool.h 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 650 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/Makefile 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 5481 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/ctx.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 334 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/ctx.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 189 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/data.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 10342 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/main.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 3114 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/queue_thread.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 267 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/queue_thread.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 3956 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/volume.c 00:46:59.216 -rw-r--r-- vagrant/vagrant 424 2024-06-07 12:49 spdk-test_gen_spec/ocf/example/simple/src/volume.h 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/promotion/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 512 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/promotion/nhit.h 00:46:59.216 drwxr-xr-x vagrant/vagrant 0 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/cleaning/ 00:46:59.216 -rw-r--r-- vagrant/vagrant 1101 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/cleaning/acp.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 1847 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/cleaning/alru.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 779 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 5839 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_cache.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 695 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_cfg.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 1369 2024-06-07 12:49 
spdk-test_gen_spec/ocf/inc/ocf_cleaner.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 1380 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_composite_volume.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 5028 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_core.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 6590 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_ctx.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 439 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_debug.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 7193 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_def.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 3405 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_err.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 4310 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_io.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 3458 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_io_class.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 844 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_logger.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 3463 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_metadata.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 37232 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_mngt.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 2655 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_queue.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 10117 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_stats.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 1787 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_types.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 7198 2024-06-07 12:49 spdk-test_gen_spec/ocf/inc/ocf_volume.h 00:46:59.216 -rw-r--r-- vagrant/vagrant 28 2024-06-07 12:49 spdk-test_gen_spec/ocf/.git 00:46:59.216 -rw-r--r-- vagrant/vagrant 266 2024-06-07 12:49 spdk-test_gen_spec/ocf/.pep8speaks.yml 00:46:59.217 -rw-r--r-- vagrant/vagrant 1470 2024-06-07 12:49 spdk-test_gen_spec/ocf/LICENSE 00:46:59.217 -rw-r--r-- vagrant/vagrant 3379 2024-06-07 12:49 spdk-test_gen_spec/ocf/Makefile 00:46:59.217 -rw-r--r-- vagrant/vagrant 4364 2024-06-07 12:49 spdk-test_gen_spec/ocf/README.md 00:46:59.217 -rw-r--r-- vagrant/vagrant 346 2024-06-07 12:49 spdk-test_gen_spec/ocf/codecov.yml 00:46:59.217 + STATUS=0 00:46:59.217 + '[' 0 -ne 0 ']' 00:46:59.217 + cd spdk-test_gen_spec 00:46:59.217 + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . 00:46:59.217 + RPM_EC=0 00:46:59.217 ++ jobs -p 00:46:59.217 + exit 0 00:46:59.217 Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.EcJntR 00:46:59.217 + umask 022 00:46:59.217 + cd /home/vagrant/rpmbuild/BUILD 00:46:59.217 + cd spdk-test_gen_spec 00:46:59.217 + set +x 00:46:59.474 Using default SPDK env in /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/lib/env_dpdk 00:46:59.475 Using default DPDK in /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build 00:47:14.965 Configuring ISA-L (logfile: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/.spdk-isal.log)...done. 00:47:27.171 Configuring ISA-L-crypto (logfile: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/.spdk-isal-crypto.log)...done. 00:47:27.171 Creating mk/config.mk...done. 00:47:27.171 Creating mk/cc.flags.mk...done. 00:47:27.171 Type 'make' to build. 00:47:27.429 make[1]: Nothing to be done for 'all'. 
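The %build trace above ends with SPDK's configure stage (the ISA-L submodule configuration lines and the generated mk/config.mk / mk/cc.flags.mk) followed by the top-level make. A minimal sketch of replaying those steps by hand, outside rpmbuild, assuming the same unpacked tree; the exact flags rpmbuild passed to configure are not shown in the trace, so only the DPDK path reported in the "Using default DPDK" line above is taken as given:

  # Hypothetical manual replay of the %build stage (a sketch, not the
  # literal command line rpmbuild used).
  $ cd /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec
  $ ./configure --with-dpdk=./dpdk/build   # configures the ISA-L submodules, then
                                           # writes mk/config.mk and mk/cc.flags.mk
  $ make -j"$(nproc)"                      # "Type 'make' to build." per the output above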
00:47:53.962 The Meson build system 00:47:53.962 Version: 1.4.0 00:47:53.962 Source dir: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk 00:47:53.962 Build dir: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp 00:47:53.962 Build type: native build 00:47:53.962 Program cat found: YES (/bin/cat) 00:47:53.962 Project name: DPDK 00:47:53.962 Project version: 24.03.0 00:47:53.962 C compiler for the host machine: cc (gcc 11.4.1 "cc (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)") 00:47:53.962 C linker for the host machine: cc ld.bfd 2.35.2-42 00:47:53.962 Host machine cpu family: x86_64 00:47:53.962 Host machine cpu: x86_64 00:47:53.962 Message: ## Building in Developer Mode ## 00:47:53.962 Program pkg-config found: YES (/bin/pkg-config) 00:47:53.962 Program check-symbols.sh found: YES (/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/buildtools/check-symbols.sh) 00:47:53.962 Program options-ibverbs-static.sh found: YES (/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/buildtools/options-ibverbs-static.sh) 00:47:53.962 Program python3 found: YES (/usr/bin/python3) 00:47:53.962 Program cat found: YES (/bin/cat) 00:47:53.962 Compiler for C supports arguments -march=native: YES 00:47:53.962 Checking for size of "void *" : 8 00:47:53.962 Checking for size of "void *" : 8 (cached) 00:47:53.962 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:47:53.962 Library m found: YES 00:47:53.962 Library numa found: YES 00:47:53.962 Has header "numaif.h" : YES 00:47:53.962 Library fdt found: NO 00:47:53.962 Library execinfo found: NO 00:47:53.962 Has header "execinfo.h" : YES 00:47:53.962 Found pkg-config: YES (/bin/pkg-config) 1.7.3 00:47:53.962 Run-time dependency libarchive found: NO (tried pkgconfig) 00:47:53.962 Run-time dependency libbsd found: NO (tried pkgconfig) 00:47:53.962 Run-time dependency jansson found: NO (tried pkgconfig) 00:47:53.962 Run-time dependency openssl found: YES 3.0.7 00:47:53.962 Run-time dependency libpcap found: NO (tried pkgconfig) 00:47:53.962 Library pcap found: NO 00:47:53.962 Compiler for C supports arguments -Wcast-qual: YES 00:47:53.962 Compiler for C supports arguments -Wdeprecated: YES 00:47:53.962 Compiler for C supports arguments -Wformat: YES 00:47:53.962 Compiler for C supports arguments -Wformat-nonliteral: YES 00:47:53.962 Compiler for C supports arguments -Wformat-security: YES 00:47:53.962 Compiler for C supports arguments -Wmissing-declarations: YES 00:47:53.962 Compiler for C supports arguments -Wmissing-prototypes: YES 00:47:53.962 Compiler for C supports arguments -Wnested-externs: YES 00:47:53.962 Compiler for C supports arguments -Wold-style-definition: YES 00:47:53.962 Compiler for C supports arguments -Wpointer-arith: YES 00:47:53.962 Compiler for C supports arguments -Wsign-compare: YES 00:47:53.962 Compiler for C supports arguments -Wstrict-prototypes: YES 00:47:53.962 Compiler for C supports arguments -Wundef: YES 00:47:53.962 Compiler for C supports arguments -Wwrite-strings: YES 00:47:53.962 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:47:53.962 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:47:53.962 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:47:53.962 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:47:53.962 Program objdump found: YES (/bin/objdump) 00:47:53.962 Compiler for C supports arguments -mavx512f: YES 00:47:53.962 Checking if "AVX512 checking" compiles: YES 00:47:53.962 Fetching value of define "__SSE4_2__" 
: 1 00:47:53.962 Fetching value of define "__AES__" : 1 00:47:53.962 Fetching value of define "__AVX__" : 1 00:47:53.962 Fetching value of define "__AVX2__" : 1 00:47:53.962 Fetching value of define "__AVX512BW__" : 1 00:47:53.962 Fetching value of define "__AVX512CD__" : 1 00:47:53.962 Fetching value of define "__AVX512DQ__" : 1 00:47:53.962 Fetching value of define "__AVX512F__" : 1 00:47:53.962 Fetching value of define "__AVX512VL__" : 1 00:47:53.962 Fetching value of define "__PCLMUL__" : 1 00:47:53.962 Fetching value of define "__RDRND__" : 1 00:47:53.962 Fetching value of define "__RDSEED__" : 1 00:47:53.962 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:47:53.962 Fetching value of define "__znver1__" : (undefined) 00:47:53.962 Fetching value of define "__znver2__" : (undefined) 00:47:53.962 Fetching value of define "__znver3__" : (undefined) 00:47:53.962 Fetching value of define "__znver4__" : (undefined) 00:47:53.962 Compiler for C supports arguments -Wno-format-truncation: YES 00:47:53.962 Message: lib/log: Defining dependency "log" 00:47:53.962 Message: lib/kvargs: Defining dependency "kvargs" 00:47:53.962 Message: lib/telemetry: Defining dependency "telemetry" 00:47:53.962 Checking for function "getentropy" : NO 00:47:53.962 Message: lib/eal: Defining dependency "eal" 00:47:53.962 Message: lib/ring: Defining dependency "ring" 00:47:53.962 Message: lib/rcu: Defining dependency "rcu" 00:47:53.962 Message: lib/mempool: Defining dependency "mempool" 00:47:53.962 Message: lib/mbuf: Defining dependency "mbuf" 00:47:53.962 Fetching value of define "__PCLMUL__" : 1 (cached) 00:47:53.963 Fetching value of define "__AVX512F__" : 1 (cached) 00:47:53.963 Fetching value of define "__AVX512BW__" : 1 (cached) 00:47:53.963 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:47:53.963 Fetching value of define "__AVX512VL__" : 1 (cached) 00:47:53.963 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:47:53.963 Compiler for C supports arguments -mpclmul: YES 00:47:53.963 Compiler for C supports arguments -maes: YES 00:47:53.963 Compiler for C supports arguments -mavx512f: YES (cached) 00:47:53.963 Compiler for C supports arguments -mavx512bw: YES 00:47:53.963 Compiler for C supports arguments -mavx512dq: YES 00:47:53.963 Compiler for C supports arguments -mavx512vl: YES 00:47:53.963 Compiler for C supports arguments -mvpclmulqdq: YES 00:47:53.963 Compiler for C supports arguments -mavx2: YES 00:47:53.963 Compiler for C supports arguments -mavx: YES 00:47:53.963 Message: lib/net: Defining dependency "net" 00:47:53.963 Message: lib/meter: Defining dependency "meter" 00:47:53.963 Message: lib/ethdev: Defining dependency "ethdev" 00:47:53.963 Message: lib/pci: Defining dependency "pci" 00:47:53.963 Message: lib/cmdline: Defining dependency "cmdline" 00:47:53.963 Message: lib/hash: Defining dependency "hash" 00:47:53.963 Message: lib/timer: Defining dependency "timer" 00:47:53.963 Message: lib/compressdev: Defining dependency "compressdev" 00:47:53.963 Message: lib/cryptodev: Defining dependency "cryptodev" 00:47:53.963 Message: lib/dmadev: Defining dependency "dmadev" 00:47:53.963 Compiler for C supports arguments -Wno-cast-qual: YES 00:47:53.963 Message: lib/power: Defining dependency "power" 00:47:53.963 Message: lib/reorder: Defining dependency "reorder" 00:47:53.963 Message: lib/security: Defining dependency "security" 00:47:53.963 Has header "linux/userfaultfd.h" : YES 00:47:53.963 Has header "linux/vduse.h" : NO 00:47:53.963 Message: lib/vhost: Defining 
dependency "vhost" 00:47:53.963 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:47:53.963 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:47:53.963 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:47:53.963 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:47:53.963 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:47:53.963 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:47:53.963 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:47:53.963 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:47:53.963 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:47:53.963 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:47:53.963 Program doxygen found: YES (/bin/doxygen) 00:47:53.963 Configuring doxy-api-html.conf using configuration 00:47:53.963 Configuring doxy-api-man.conf using configuration 00:47:53.963 Program mandb found: YES (/bin/mandb) 00:47:53.963 Program sphinx-build found: NO 00:47:53.963 Configuring rte_build_config.h using configuration 00:47:53.963 Message: 00:47:53.963 ================= 00:47:53.963 Applications Enabled 00:47:53.963 ================= 00:47:53.963 00:47:53.963 apps: 00:47:53.963 00:47:53.963 00:47:53.963 Message: 00:47:53.963 ================= 00:47:53.963 Libraries Enabled 00:47:53.963 ================= 00:47:53.963 00:47:53.963 libs: 00:47:53.963 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:47:53.963 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:47:53.963 cryptodev, dmadev, power, reorder, security, vhost, 00:47:53.963 00:47:53.963 Message: 00:47:53.963 =============== 00:47:53.963 Drivers Enabled 00:47:53.963 =============== 00:47:53.963 00:47:53.963 common: 00:47:53.963 00:47:53.963 bus: 00:47:53.963 pci, vdev, 00:47:53.963 mempool: 00:47:53.963 ring, 00:47:53.963 dma: 00:47:53.963 00:47:53.963 net: 00:47:53.963 00:47:53.963 crypto: 00:47:53.963 00:47:53.963 compress: 00:47:53.963 00:47:53.963 vdpa: 00:47:53.963 00:47:53.963 00:47:53.963 Message: 00:47:53.963 ================= 00:47:53.963 Content Skipped 00:47:53.963 ================= 00:47:53.963 00:47:53.963 apps: 00:47:53.963 dumpcap: explicitly disabled via build config 00:47:53.963 graph: explicitly disabled via build config 00:47:53.963 pdump: explicitly disabled via build config 00:47:53.963 proc-info: explicitly disabled via build config 00:47:53.963 test-acl: explicitly disabled via build config 00:47:53.963 test-bbdev: explicitly disabled via build config 00:47:53.963 test-cmdline: explicitly disabled via build config 00:47:53.963 test-compress-perf: explicitly disabled via build config 00:47:53.963 test-crypto-perf: explicitly disabled via build config 00:47:53.963 test-dma-perf: explicitly disabled via build config 00:47:53.963 test-eventdev: explicitly disabled via build config 00:47:53.963 test-fib: explicitly disabled via build config 00:47:53.963 test-flow-perf: explicitly disabled via build config 00:47:53.963 test-gpudev: explicitly disabled via build config 00:47:53.963 test-mldev: explicitly disabled via build config 00:47:53.963 test-pipeline: explicitly disabled via build config 00:47:53.963 test-pmd: explicitly disabled via build config 00:47:53.963 test-regex: explicitly disabled via build config 00:47:53.963 test-sad: explicitly disabled via build config 00:47:53.963 test-security-perf: explicitly disabled via build config 00:47:53.963 
00:47:53.963 libs: 00:47:53.963 argparse: explicitly disabled via build config 00:47:53.963 metrics: explicitly disabled via build config 00:47:53.963 acl: explicitly disabled via build config 00:47:53.963 bbdev: explicitly disabled via build config 00:47:53.963 bitratestats: explicitly disabled via build config 00:47:53.963 bpf: explicitly disabled via build config 00:47:53.963 cfgfile: explicitly disabled via build config 00:47:53.963 distributor: explicitly disabled via build config 00:47:53.963 efd: explicitly disabled via build config 00:47:53.963 eventdev: explicitly disabled via build config 00:47:53.963 dispatcher: explicitly disabled via build config 00:47:53.963 gpudev: explicitly disabled via build config 00:47:53.963 gro: explicitly disabled via build config 00:47:53.963 gso: explicitly disabled via build config 00:47:53.963 ip_frag: explicitly disabled via build config 00:47:53.963 jobstats: explicitly disabled via build config 00:47:53.963 latencystats: explicitly disabled via build config 00:47:53.963 lpm: explicitly disabled via build config 00:47:53.963 member: explicitly disabled via build config 00:47:53.963 pcapng: explicitly disabled via build config 00:47:53.963 rawdev: explicitly disabled via build config 00:47:53.963 regexdev: explicitly disabled via build config 00:47:53.963 mldev: explicitly disabled via build config 00:47:53.963 rib: explicitly disabled via build config 00:47:53.963 sched: explicitly disabled via build config 00:47:53.963 stack: explicitly disabled via build config 00:47:53.963 ipsec: explicitly disabled via build config 00:47:53.963 pdcp: explicitly disabled via build config 00:47:53.963 fib: explicitly disabled via build config 00:47:53.963 port: explicitly disabled via build config 00:47:53.963 pdump: explicitly disabled via build config 00:47:53.963 table: explicitly disabled via build config 00:47:53.963 pipeline: explicitly disabled via build config 00:47:53.963 graph: explicitly disabled via build config 00:47:53.963 node: explicitly disabled via build config 00:47:53.963 00:47:53.963 drivers: 00:47:53.963 common/cpt: not in enabled drivers build config 00:47:53.963 common/dpaax: not in enabled drivers build config 00:47:53.963 common/iavf: not in enabled drivers build config 00:47:53.963 common/idpf: not in enabled drivers build config 00:47:53.963 common/ionic: not in enabled drivers build config 00:47:53.963 common/mvep: not in enabled drivers build config 00:47:53.963 common/octeontx: not in enabled drivers build config 00:47:53.963 bus/auxiliary: not in enabled drivers build config 00:47:53.963 bus/cdx: not in enabled drivers build config 00:47:53.963 bus/dpaa: not in enabled drivers build config 00:47:53.963 bus/fslmc: not in enabled drivers build config 00:47:53.963 bus/ifpga: not in enabled drivers build config 00:47:53.963 bus/platform: not in enabled drivers build config 00:47:53.963 bus/uacce: not in enabled drivers build config 00:47:53.963 bus/vmbus: not in enabled drivers build config 00:47:53.963 common/cnxk: not in enabled drivers build config 00:47:53.963 common/mlx5: not in enabled drivers build config 00:47:53.963 common/nfp: not in enabled drivers build config 00:47:53.963 common/nitrox: not in enabled drivers build config 00:47:53.963 common/qat: not in enabled drivers build config 00:47:53.963 common/sfc_efx: not in enabled drivers build config 00:47:53.963 mempool/bucket: not in enabled drivers build config 00:47:53.963 mempool/cnxk: not in enabled drivers build config 00:47:53.963 mempool/dpaa: not in enabled 
drivers build config 00:47:53.963 mempool/dpaa2: not in enabled drivers build config 00:47:53.963 mempool/octeontx: not in enabled drivers build config 00:47:53.963 mempool/stack: not in enabled drivers build config 00:47:53.963 dma/cnxk: not in enabled drivers build config 00:47:53.963 dma/dpaa: not in enabled drivers build config 00:47:53.963 dma/dpaa2: not in enabled drivers build config 00:47:53.963 dma/hisilicon: not in enabled drivers build config 00:47:53.963 dma/idxd: not in enabled drivers build config 00:47:53.963 dma/ioat: not in enabled drivers build config 00:47:53.963 dma/skeleton: not in enabled drivers build config 00:47:53.963 net/af_packet: not in enabled drivers build config 00:47:53.963 net/af_xdp: not in enabled drivers build config 00:47:53.963 net/ark: not in enabled drivers build config 00:47:53.963 net/atlantic: not in enabled drivers build config 00:47:53.963 net/avp: not in enabled drivers build config 00:47:53.963 net/axgbe: not in enabled drivers build config 00:47:53.963 net/bnx2x: not in enabled drivers build config 00:47:53.963 net/bnxt: not in enabled drivers build config 00:47:53.963 net/bonding: not in enabled drivers build config 00:47:53.963 net/cnxk: not in enabled drivers build config 00:47:53.963 net/cpfl: not in enabled drivers build config 00:47:53.963 net/cxgbe: not in enabled drivers build config 00:47:53.963 net/dpaa: not in enabled drivers build config 00:47:53.963 net/dpaa2: not in enabled drivers build config 00:47:53.963 net/e1000: not in enabled drivers build config 00:47:53.964 net/ena: not in enabled drivers build config 00:47:53.964 net/enetc: not in enabled drivers build config 00:47:53.964 net/enetfec: not in enabled drivers build config 00:47:53.964 net/enic: not in enabled drivers build config 00:47:53.964 net/failsafe: not in enabled drivers build config 00:47:53.964 net/fm10k: not in enabled drivers build config 00:47:53.964 net/gve: not in enabled drivers build config 00:47:53.964 net/hinic: not in enabled drivers build config 00:47:53.964 net/hns3: not in enabled drivers build config 00:47:53.964 net/i40e: not in enabled drivers build config 00:47:53.964 net/iavf: not in enabled drivers build config 00:47:53.964 net/ice: not in enabled drivers build config 00:47:53.964 net/idpf: not in enabled drivers build config 00:47:53.964 net/igc: not in enabled drivers build config 00:47:53.964 net/ionic: not in enabled drivers build config 00:47:53.964 net/ipn3ke: not in enabled drivers build config 00:47:53.964 net/ixgbe: not in enabled drivers build config 00:47:53.964 net/mana: not in enabled drivers build config 00:47:53.964 net/memif: not in enabled drivers build config 00:47:53.964 net/mlx4: not in enabled drivers build config 00:47:53.964 net/mlx5: not in enabled drivers build config 00:47:53.964 net/mvneta: not in enabled drivers build config 00:47:53.964 net/mvpp2: not in enabled drivers build config 00:47:53.964 net/netvsc: not in enabled drivers build config 00:47:53.964 net/nfb: not in enabled drivers build config 00:47:53.964 net/nfp: not in enabled drivers build config 00:47:53.964 net/ngbe: not in enabled drivers build config 00:47:53.964 net/null: not in enabled drivers build config 00:47:53.964 net/octeontx: not in enabled drivers build config 00:47:53.964 net/octeon_ep: not in enabled drivers build config 00:47:53.964 net/pcap: not in enabled drivers build config 00:47:53.964 net/pfe: not in enabled drivers build config 00:47:53.964 net/qede: not in enabled drivers build config 00:47:53.964 net/ring: not in enabled 
drivers build config 00:47:53.964 net/sfc: not in enabled drivers build config 00:47:53.964 net/softnic: not in enabled drivers build config 00:47:53.964 net/tap: not in enabled drivers build config 00:47:53.964 net/thunderx: not in enabled drivers build config 00:47:53.964 net/txgbe: not in enabled drivers build config 00:47:53.964 net/vdev_netvsc: not in enabled drivers build config 00:47:53.964 net/vhost: not in enabled drivers build config 00:47:53.964 net/virtio: not in enabled drivers build config 00:47:53.964 net/vmxnet3: not in enabled drivers build config 00:47:53.964 raw/*: missing internal dependency, "rawdev" 00:47:53.964 crypto/armv8: not in enabled drivers build config 00:47:53.964 crypto/bcmfs: not in enabled drivers build config 00:47:53.964 crypto/caam_jr: not in enabled drivers build config 00:47:53.964 crypto/ccp: not in enabled drivers build config 00:47:53.964 crypto/cnxk: not in enabled drivers build config 00:47:53.964 crypto/dpaa_sec: not in enabled drivers build config 00:47:53.964 crypto/dpaa2_sec: not in enabled drivers build config 00:47:53.964 crypto/ipsec_mb: not in enabled drivers build config 00:47:53.964 crypto/mlx5: not in enabled drivers build config 00:47:53.964 crypto/mvsam: not in enabled drivers build config 00:47:53.964 crypto/nitrox: not in enabled drivers build config 00:47:53.964 crypto/null: not in enabled drivers build config 00:47:53.964 crypto/octeontx: not in enabled drivers build config 00:47:53.964 crypto/openssl: not in enabled drivers build config 00:47:53.964 crypto/scheduler: not in enabled drivers build config 00:47:53.964 crypto/uadk: not in enabled drivers build config 00:47:53.964 crypto/virtio: not in enabled drivers build config 00:47:53.964 compress/isal: not in enabled drivers build config 00:47:53.964 compress/mlx5: not in enabled drivers build config 00:47:53.964 compress/nitrox: not in enabled drivers build config 00:47:53.964 compress/octeontx: not in enabled drivers build config 00:47:53.964 compress/zlib: not in enabled drivers build config 00:47:53.964 regex/*: missing internal dependency, "regexdev" 00:47:53.964 ml/*: missing internal dependency, "mldev" 00:47:53.964 vdpa/ifc: not in enabled drivers build config 00:47:53.964 vdpa/mlx5: not in enabled drivers build config 00:47:53.964 vdpa/nfp: not in enabled drivers build config 00:47:53.964 vdpa/sfc: not in enabled drivers build config 00:47:53.964 event/*: missing internal dependency, "eventdev" 00:47:53.964 baseband/*: missing internal dependency, "bbdev" 00:47:53.964 gpu/*: missing internal dependency, "gpudev" 00:47:53.964 00:47:53.964 00:47:53.964 Build targets in project: 85 00:47:53.964 00:47:53.964 DPDK 24.03.0 00:47:53.964 00:47:53.964 User defined options 00:47:53.964 default_library : shared 00:47:53.964 libdir : lib 00:47:53.964 prefix : /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build 00:47:53.964 c_args : -Wno-stringop-overflow -fcommon -fPIC -Wno-error 00:47:53.964 c_link_args : 00:47:53.964 cpu_instruction_set: native 00:47:53.964 disable_apps : dumpcap,test-eventdev,test,graph,test-fib,pdump,test-flow-perf,proc-info,test-gpudev,test-acl,test-mldev,test-bbdev,test-pipeline,test-cmdline,test-pmd,test-compress-perf,test-regex,test-crypto-perf,test-sad,test-dma-perf,test-security-perf 00:47:53.964 disable_libs : 
bitratestats,efd,ipsec,metrics,table,bpf,jobstats,mldev,rawdev,cfgfile,eventdev,fib,latencystats,node,regexdev,gpudev,pcapng,graph,lpm,rib,dispatcher,gro,pdcp,acl,distributor,gso,member,pdump,sched,argparse,pipeline,bbdev,ip_frag,port,stack 00:47:53.964 enable_docs : false 00:47:53.964 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:47:53.964 enable_kmods : false 00:47:53.964 tests : false 00:47:53.964 00:47:53.964 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:47:53.964 ninja: Entering directory `/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp' 00:47:53.964 [1/267] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:47:53.964 [2/267] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:47:53.964 [3/267] Compiling C object lib/librte_log.a.p/log_log.c.o 00:47:53.964 [4/267] Linking static target lib/librte_log.a 00:47:53.964 [5/267] Linking static target lib/librte_kvargs.a 00:47:53.964 [6/267] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:47:53.964 [7/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:47:53.964 [8/267] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:47:53.964 [9/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:47:53.964 [10/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:47:53.964 [11/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:47:53.964 [12/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:47:53.964 [13/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:47:53.964 [14/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:47:53.964 [15/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:47:53.964 [16/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:47:53.964 [17/267] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:47:53.964 [18/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:47:53.964 [19/267] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:47:53.964 [20/267] Linking static target lib/librte_telemetry.a 00:47:53.964 [21/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:47:53.964 [22/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:47:53.964 [23/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:47:53.964 [24/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:47:53.964 [25/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:47:53.964 [26/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:47:53.964 [27/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:47:53.964 [28/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:47:53.964 [29/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:47:53.964 [30/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:47:53.964 [31/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:47:53.964 [32/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:47:53.964 [33/267] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:47:53.964 
[34/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:47:53.964 [35/267] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:47:53.964 [36/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:47:53.964 [37/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:47:53.964 [38/267] Linking target lib/librte_log.so.24.1 00:47:53.964 [39/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:47:53.964 [40/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:47:53.964 [41/267] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:47:53.964 [42/267] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:47:53.964 [43/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:47:53.964 [44/267] Linking target lib/librte_kvargs.so.24.1 00:47:53.964 [45/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:47:53.964 [46/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:47:53.964 [47/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:47:53.964 [48/267] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:47:53.964 [49/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:47:53.964 [50/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:47:53.964 [51/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:47:53.964 [52/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:47:53.964 [53/267] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:47:53.964 [54/267] Linking target lib/librte_telemetry.so.24.1 00:47:53.964 [55/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:47:53.964 [56/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:47:53.964 [57/267] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:47:53.964 [58/267] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:47:54.245 [59/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:47:54.245 [60/267] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:47:54.245 [61/267] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:47:54.245 [62/267] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:47:54.245 [63/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:47:54.245 [64/267] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:47:54.504 [65/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:47:54.504 [66/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:47:54.504 [67/267] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:47:54.763 [68/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:47:54.763 [69/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:47:54.763 [70/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:47:55.022 [71/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:47:55.022 [72/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:47:55.022 [73/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:47:55.022 
[74/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:47:55.022 [75/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:47:55.022 [76/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:47:55.280 [77/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:47:55.280 [78/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:47:55.280 [79/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:47:55.540 [80/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:47:55.540 [81/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:47:55.540 [82/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:47:55.799 [83/267] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:47:55.799 [84/267] Linking static target lib/librte_ring.a 00:47:55.799 [85/267] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:47:56.057 [86/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:47:56.057 [87/267] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:47:56.057 [88/267] Linking static target lib/librte_eal.a 00:47:56.057 [89/267] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:47:56.057 [90/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:47:56.057 [91/267] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:47:56.316 [92/267] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:47:56.574 [93/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:47:56.574 [94/267] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:47:56.574 [95/267] Linking static target lib/net/libnet_crc_avx512_lib.a 00:47:56.574 [96/267] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:47:56.574 [97/267] Linking static target lib/librte_rcu.a 00:47:56.574 [98/267] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:47:56.833 [99/267] Linking static target lib/librte_mbuf.a 00:47:56.833 [100/267] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:47:56.833 [101/267] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:47:56.833 [102/267] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:47:56.833 [103/267] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:47:56.833 [104/267] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:47:56.833 [105/267] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:47:57.091 [106/267] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:47:57.091 [107/267] Linking static target lib/librte_mempool.a 00:47:57.091 [108/267] Linking static target lib/librte_net.a 00:47:57.091 [109/267] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:47:57.091 [110/267] Linking static target lib/librte_meter.a 00:47:57.349 [111/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:47:57.349 [112/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:47:57.349 [113/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:47:57.608 [114/267] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:47:57.867 [115/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:47:57.867 [116/267] Generating lib/meter.sym_chk with a custom command (wrapped by 
meson to capture output) 00:47:57.867 [117/267] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:47:58.127 [118/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:47:58.386 [119/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:47:58.386 [120/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:47:58.644 [121/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:47:58.902 [122/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:47:59.160 [123/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:47:59.160 [124/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:47:59.160 [125/267] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:47:59.419 [126/267] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:47:59.419 [127/267] Linking static target lib/librte_pci.a 00:47:59.419 [128/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:47:59.419 [129/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:47:59.419 [130/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:47:59.678 [131/267] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:47:59.678 [132/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:47:59.678 [133/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:47:59.678 [134/267] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:47:59.678 [135/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:47:59.678 [136/267] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:47:59.936 [137/267] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:47:59.936 [138/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:47:59.936 [139/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:47:59.936 [140/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:47:59.936 [141/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:47:59.936 [142/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:47:59.936 [143/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:47:59.936 [144/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:47:59.936 [145/267] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:47:59.936 [146/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:48:00.194 [147/267] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:48:00.194 [148/267] Linking static target lib/librte_cmdline.a 00:48:00.194 [149/267] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:48:00.194 [150/267] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:48:00.451 [151/267] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:48:00.451 [152/267] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:48:00.708 [153/267] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:48:00.966 [154/267] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:48:00.966 [155/267] Linking static target 
lib/librte_compressdev.a 00:48:00.966 [156/267] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:48:00.966 [157/267] Linking static target lib/librte_timer.a 00:48:00.966 [158/267] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:48:00.966 [159/267] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:48:01.224 [160/267] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:48:01.481 [161/267] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:48:01.481 [162/267] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:48:01.481 [163/267] Linking static target lib/librte_dmadev.a 00:48:01.739 [164/267] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:48:01.739 [165/267] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:48:01.996 [166/267] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:48:01.996 [167/267] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:48:01.996 [168/267] Linking static target lib/librte_ethdev.a 00:48:02.254 [169/267] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:48:02.254 [170/267] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:48:02.512 [171/267] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:48:02.512 [172/267] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:48:02.512 [173/267] Linking static target lib/librte_hash.a 00:48:02.770 [174/267] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:48:02.770 [175/267] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:48:02.770 [176/267] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:48:02.770 [177/267] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:48:03.029 [178/267] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:48:03.287 [179/267] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:48:03.287 [180/267] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:48:03.287 [181/267] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:48:03.287 [182/267] Linking static target lib/librte_power.a 00:48:03.546 [183/267] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:48:03.546 [184/267] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:48:03.546 [185/267] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:48:03.546 [186/267] Linking static target lib/librte_reorder.a 00:48:03.546 [187/267] Linking static target lib/librte_security.a 00:48:03.805 [188/267] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:48:03.805 [189/267] Linking static target lib/librte_cryptodev.a 00:48:04.064 [190/267] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:48:04.064 [191/267] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:48:04.064 [192/267] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:48:04.322 [193/267] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:48:04.579 [194/267] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:48:05.145 [195/267] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 
00:48:05.145 [196/267] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:48:05.402 [197/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:48:05.402 [198/267] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:48:05.402 [199/267] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:48:05.660 [200/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:48:05.660 [201/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:48:05.919 [202/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:48:05.919 [203/267] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:48:05.919 [204/267] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:48:05.919 [205/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:48:06.486 [206/267] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:48:06.486 [207/267] Linking static target drivers/libtmp_rte_bus_pci.a 00:48:06.745 [208/267] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:48:06.745 [209/267] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:48:06.745 [210/267] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:48:06.745 [211/267] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:48:06.745 [212/267] Linking static target drivers/libtmp_rte_bus_vdev.a 00:48:06.745 [213/267] Linking static target drivers/librte_bus_pci.a 00:48:07.004 [214/267] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:48:07.265 [215/267] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:48:07.265 [216/267] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:48:07.265 [217/267] Linking static target drivers/librte_bus_vdev.a 00:48:07.529 [218/267] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:48:07.529 [219/267] Linking static target drivers/libtmp_rte_mempool_ring.a 00:48:07.789 [220/267] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:48:07.789 [221/267] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:48:07.789 [222/267] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:48:07.789 [223/267] Linking static target drivers/librte_mempool_ring.a 00:48:07.789 [224/267] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:48:08.047 [225/267] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:48:08.306 [226/267] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:48:10.208 [227/267] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:48:12.741 [228/267] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:48:12.999 [229/267] Linking target lib/librte_eal.so.24.1 00:48:13.257 [230/267] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:48:13.257 [231/267] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:48:13.257 [232/267] Linking target drivers/librte_bus_vdev.so.24.1 00:48:13.257 [233/267] 
Linking target lib/librte_ring.so.24.1 00:48:13.257 [234/267] Linking target lib/librte_pci.so.24.1 00:48:13.257 [235/267] Linking target lib/librte_meter.so.24.1 00:48:13.257 [236/267] Linking target lib/librte_dmadev.so.24.1 00:48:13.257 [237/267] Linking target lib/librte_timer.so.24.1 00:48:13.257 [238/267] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:48:13.257 [239/267] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:48:13.257 [240/267] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:48:13.257 [241/267] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:48:13.257 [242/267] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:48:13.516 [243/267] Linking target drivers/librte_bus_pci.so.24.1 00:48:13.516 [244/267] Linking target lib/librte_rcu.so.24.1 00:48:13.516 [245/267] Linking target lib/librte_mempool.so.24.1 00:48:13.516 [246/267] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:48:13.516 [247/267] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:48:13.516 [248/267] Linking target lib/librte_mbuf.so.24.1 00:48:13.516 [249/267] Linking target drivers/librte_mempool_ring.so.24.1 00:48:13.775 [250/267] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:48:13.775 [251/267] Linking target lib/librte_compressdev.so.24.1 00:48:14.034 [252/267] Linking target lib/librte_net.so.24.1 00:48:14.034 [253/267] Linking target lib/librte_cryptodev.so.24.1 00:48:14.034 [254/267] Linking target lib/librte_reorder.so.24.1 00:48:14.034 [255/267] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:48:14.034 [256/267] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:48:14.034 [257/267] Linking target lib/librte_hash.so.24.1 00:48:14.034 [258/267] Linking target lib/librte_cmdline.so.24.1 00:48:14.034 [259/267] Linking target lib/librte_security.so.24.1 00:48:14.034 [260/267] Linking target lib/librte_ethdev.so.24.1 00:48:14.292 [261/267] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:48:14.292 [262/267] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:48:14.292 [263/267] Linking target lib/librte_power.so.24.1 00:48:16.900 [264/267] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:48:16.900 [265/267] Linking static target lib/librte_vhost.a 00:48:18.805 [266/267] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:48:18.805 [267/267] Linking target lib/librte_vhost.so.24.1 00:48:18.805 INFO: autodetecting backend as ninja 00:48:18.805 INFO: calculating backend command to run: /usr/local/bin/ninja -C /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp -j 10 00:48:20.179 CC lib/ut_mock/mock.o 00:48:20.179 CC lib/log/log_deprecated.o 00:48:20.179 CC lib/log/log.o 00:48:20.179 CC lib/log/log_flags.o 00:48:20.179 LIB libspdk_ut_mock.a 00:48:20.179 SO libspdk_ut_mock.so.6.0 00:48:20.179 SYMLINK libspdk_ut_mock.so 00:48:20.179 LIB libspdk_log.a 00:48:20.179 SO libspdk_log.so.7.0 00:48:20.452 SYMLINK libspdk_log.so 00:48:20.452 CC lib/util/base64.o 00:48:20.452 CC lib/util/bit_array.o 00:48:20.452 CXX lib/trace_parser/trace.o 00:48:20.452 CC lib/util/crc32.o 00:48:20.452 CC lib/util/cpuset.o 00:48:20.452 CC lib/util/crc16.o 00:48:20.452 CC lib/util/crc32c.o 
00:48:20.452 CC lib/dma/dma.o 00:48:20.710 CC lib/ioat/ioat.o 00:48:20.710 CC lib/vfio_user/host/vfio_user_pci.o 00:48:20.710 CC lib/vfio_user/host/vfio_user.o 00:48:20.710 CC lib/util/crc32_ieee.o 00:48:20.710 CC lib/util/crc64.o 00:48:20.710 CC lib/util/dif.o 00:48:20.710 CC lib/util/fd.o 00:48:20.710 LIB libspdk_dma.a 00:48:20.710 SO libspdk_dma.so.4.0 00:48:20.968 CC lib/util/file.o 00:48:20.968 SYMLINK libspdk_dma.so 00:48:20.968 CC lib/util/hexlify.o 00:48:20.968 CC lib/util/iov.o 00:48:20.968 CC lib/util/math.o 00:48:20.968 LIB libspdk_ioat.a 00:48:20.968 CC lib/util/pipe.o 00:48:20.968 CC lib/util/strerror_tls.o 00:48:20.968 SO libspdk_ioat.so.7.0 00:48:20.968 LIB libspdk_vfio_user.a 00:48:20.968 SO libspdk_vfio_user.so.5.0 00:48:20.968 CC lib/util/string.o 00:48:20.968 SYMLINK libspdk_ioat.so 00:48:20.968 CC lib/util/uuid.o 00:48:20.968 CC lib/util/fd_group.o 00:48:20.968 SYMLINK libspdk_vfio_user.so 00:48:20.968 CC lib/util/xor.o 00:48:20.968 CC lib/util/zipf.o 00:48:21.535 LIB libspdk_trace_parser.a 00:48:21.535 SO libspdk_trace_parser.so.5.0 00:48:21.535 SYMLINK libspdk_trace_parser.so 00:48:21.794 LIB libspdk_util.a 00:48:21.794 SO libspdk_util.so.9.0 00:48:22.053 SYMLINK libspdk_util.so 00:48:22.311 CC lib/json/json_parse.o 00:48:22.311 CC lib/json/json_util.o 00:48:22.311 CC lib/json/json_write.o 00:48:22.311 CC lib/conf/conf.o 00:48:22.311 CC lib/env_dpdk/env.o 00:48:22.311 CC lib/env_dpdk/memory.o 00:48:22.311 CC lib/env_dpdk/pci.o 00:48:22.311 CC lib/env_dpdk/init.o 00:48:22.311 CC lib/env_dpdk/threads.o 00:48:22.311 CC lib/vmd/vmd.o 00:48:22.311 CC lib/env_dpdk/pci_ioat.o 00:48:22.569 LIB libspdk_conf.a 00:48:22.569 CC lib/vmd/led.o 00:48:22.569 SO libspdk_conf.so.6.0 00:48:22.569 SYMLINK libspdk_conf.so 00:48:22.569 CC lib/env_dpdk/pci_virtio.o 00:48:22.569 CC lib/env_dpdk/pci_vmd.o 00:48:22.569 CC lib/env_dpdk/pci_event.o 00:48:22.569 CC lib/env_dpdk/pci_idxd.o 00:48:22.569 CC lib/env_dpdk/sigbus_handler.o 00:48:22.827 CC lib/env_dpdk/pci_dpdk.o 00:48:22.827 CC lib/env_dpdk/pci_dpdk_2207.o 00:48:22.827 CC lib/env_dpdk/pci_dpdk_2211.o 00:48:22.827 LIB libspdk_json.a 00:48:22.827 SO libspdk_json.so.6.0 00:48:22.827 LIB libspdk_vmd.a 00:48:22.827 SYMLINK libspdk_json.so 00:48:22.827 SO libspdk_vmd.so.6.0 00:48:23.085 SYMLINK libspdk_vmd.so 00:48:23.085 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:48:23.085 CC lib/jsonrpc/jsonrpc_client.o 00:48:23.085 CC lib/jsonrpc/jsonrpc_server.o 00:48:23.085 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:48:23.343 LIB libspdk_jsonrpc.a 00:48:23.343 SO libspdk_jsonrpc.so.6.0 00:48:23.601 SYMLINK libspdk_jsonrpc.so 00:48:23.601 LIB libspdk_env_dpdk.a 00:48:23.601 SO libspdk_env_dpdk.so.14.1 00:48:23.601 SYMLINK libspdk_env_dpdk.so 00:48:23.860 CC lib/rpc/rpc.o 00:48:24.118 LIB libspdk_rpc.a 00:48:24.118 SO libspdk_rpc.so.6.0 00:48:24.118 SYMLINK libspdk_rpc.so 00:48:24.376 CC lib/keyring/keyring.o 00:48:24.376 CC lib/keyring/keyring_rpc.o 00:48:24.376 CC lib/trace/trace.o 00:48:24.376 CC lib/notify/notify.o 00:48:24.376 CC lib/trace/trace_flags.o 00:48:24.376 CC lib/notify/notify_rpc.o 00:48:24.377 CC lib/trace/trace_rpc.o 00:48:24.635 LIB libspdk_notify.a 00:48:24.635 SO libspdk_notify.so.6.0 00:48:24.635 LIB libspdk_keyring.a 00:48:24.635 SYMLINK libspdk_notify.so 00:48:24.635 SO libspdk_keyring.so.1.0 00:48:24.635 LIB libspdk_trace.a 00:48:24.893 SYMLINK libspdk_keyring.so 00:48:24.893 SO libspdk_trace.so.10.0 00:48:24.893 SYMLINK libspdk_trace.so 00:48:25.152 CC lib/sock/sock.o 00:48:25.152 CC lib/thread/thread.o 00:48:25.152 CC lib/thread/iobuf.o 
00:48:25.152 CC lib/sock/sock_rpc.o 00:48:25.411 LIB libspdk_sock.a 00:48:25.411 SO libspdk_sock.so.9.0 00:48:25.669 SYMLINK libspdk_sock.so 00:48:25.929 CC lib/nvme/nvme_ctrlr_cmd.o 00:48:25.929 CC lib/nvme/nvme_ctrlr.o 00:48:25.929 CC lib/nvme/nvme_ns_cmd.o 00:48:25.929 CC lib/nvme/nvme_fabric.o 00:48:25.929 CC lib/nvme/nvme_ns.o 00:48:25.929 CC lib/nvme/nvme_pcie.o 00:48:25.929 CC lib/nvme/nvme_pcie_common.o 00:48:25.929 CC lib/nvme/nvme_qpair.o 00:48:25.929 CC lib/nvme/nvme.o 00:48:26.496 LIB libspdk_thread.a 00:48:26.496 SO libspdk_thread.so.10.0 00:48:26.496 CC lib/nvme/nvme_quirks.o 00:48:26.496 SYMLINK libspdk_thread.so 00:48:26.496 CC lib/nvme/nvme_transport.o 00:48:26.755 CC lib/accel/accel.o 00:48:26.755 CC lib/blob/blobstore.o 00:48:27.012 CC lib/nvme/nvme_discovery.o 00:48:27.013 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:48:27.013 CC lib/blob/request.o 00:48:27.013 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:48:27.013 CC lib/init/json_config.o 00:48:27.013 CC lib/virtio/virtio.o 00:48:27.270 CC lib/init/subsystem.o 00:48:27.270 CC lib/init/subsystem_rpc.o 00:48:27.270 CC lib/blob/zeroes.o 00:48:27.270 CC lib/blob/blob_bs_dev.o 00:48:27.270 CC lib/init/rpc.o 00:48:27.270 CC lib/virtio/virtio_vhost_user.o 00:48:27.270 CC lib/virtio/virtio_vfio_user.o 00:48:27.529 CC lib/virtio/virtio_pci.o 00:48:27.529 CC lib/accel/accel_rpc.o 00:48:27.529 CC lib/accel/accel_sw.o 00:48:27.529 LIB libspdk_init.a 00:48:27.529 SO libspdk_init.so.5.0 00:48:27.529 CC lib/nvme/nvme_tcp.o 00:48:27.529 SYMLINK libspdk_init.so 00:48:27.529 CC lib/nvme/nvme_opal.o 00:48:27.529 CC lib/nvme/nvme_io_msg.o 00:48:27.787 CC lib/nvme/nvme_poll_group.o 00:48:27.787 CC lib/nvme/nvme_zns.o 00:48:27.787 CC lib/nvme/nvme_stubs.o 00:48:27.787 CC lib/nvme/nvme_auth.o 00:48:27.787 LIB libspdk_virtio.a 00:48:28.045 LIB libspdk_accel.a 00:48:28.045 SO libspdk_virtio.so.7.0 00:48:28.045 SO libspdk_accel.so.15.0 00:48:28.045 SYMLINK libspdk_virtio.so 00:48:28.045 SYMLINK libspdk_accel.so 00:48:28.045 CC lib/nvme/nvme_cuse.o 00:48:28.303 CC lib/event/app.o 00:48:28.303 CC lib/event/log_rpc.o 00:48:28.303 CC lib/event/reactor.o 00:48:28.303 CC lib/event/app_rpc.o 00:48:28.303 CC lib/bdev/bdev.o 00:48:28.562 CC lib/event/scheduler_static.o 00:48:28.562 CC lib/bdev/bdev_rpc.o 00:48:28.562 CC lib/bdev/bdev_zone.o 00:48:28.562 CC lib/bdev/part.o 00:48:28.562 CC lib/bdev/scsi_nvme.o 00:48:28.821 LIB libspdk_event.a 00:48:28.821 SO libspdk_event.so.13.1 00:48:28.821 SYMLINK libspdk_event.so 00:48:29.079 LIB libspdk_nvme.a 00:48:29.079 SO libspdk_nvme.so.13.0 00:48:29.079 SYMLINK libspdk_nvme.so 00:48:29.645 LIB libspdk_blob.a 00:48:29.902 SO libspdk_blob.so.11.0 00:48:29.902 SYMLINK libspdk_blob.so 00:48:30.160 CC lib/blobfs/tree.o 00:48:30.160 CC lib/blobfs/blobfs.o 00:48:30.160 CC lib/lvol/lvol.o 00:48:31.096 LIB libspdk_bdev.a 00:48:31.096 LIB libspdk_blobfs.a 00:48:31.096 SO libspdk_bdev.so.15.0 00:48:31.096 LIB libspdk_lvol.a 00:48:31.096 SO libspdk_blobfs.so.10.0 00:48:31.096 SO libspdk_lvol.so.10.0 00:48:31.096 SYMLINK libspdk_bdev.so 00:48:31.096 SYMLINK libspdk_blobfs.so 00:48:31.096 SYMLINK libspdk_lvol.so 00:48:31.354 CC lib/nbd/nbd.o 00:48:31.354 CC lib/nbd/nbd_rpc.o 00:48:31.354 CC lib/scsi/dev.o 00:48:31.354 CC lib/scsi/lun.o 00:48:31.354 CC lib/scsi/port.o 00:48:31.354 CC lib/scsi/scsi.o 00:48:31.354 CC lib/scsi/scsi_pr.o 00:48:31.354 CC lib/scsi/scsi_bdev.o 00:48:31.354 CC lib/ftl/ftl_core.o 00:48:31.354 CC lib/nvmf/ctrlr.o 00:48:31.612 CC lib/nvmf/ctrlr_discovery.o 00:48:31.613 CC lib/nvmf/ctrlr_bdev.o 00:48:31.613 CC 
lib/ftl/ftl_init.o 00:48:31.613 CC lib/ftl/ftl_layout.o 00:48:31.871 CC lib/ftl/ftl_debug.o 00:48:31.871 CC lib/nvmf/subsystem.o 00:48:31.871 CC lib/nvmf/nvmf.o 00:48:31.871 LIB libspdk_nbd.a 00:48:31.871 CC lib/nvmf/nvmf_rpc.o 00:48:31.871 SO libspdk_nbd.so.7.0 00:48:31.871 CC lib/nvmf/transport.o 00:48:31.871 SYMLINK libspdk_nbd.so 00:48:31.871 CC lib/nvmf/tcp.o 00:48:31.871 CC lib/scsi/scsi_rpc.o 00:48:31.871 CC lib/scsi/task.o 00:48:32.129 CC lib/ftl/ftl_io.o 00:48:32.129 CC lib/ftl/ftl_sb.o 00:48:32.129 CC lib/ftl/ftl_l2p.o 00:48:32.129 LIB libspdk_scsi.a 00:48:32.129 SO libspdk_scsi.so.9.0 00:48:32.388 CC lib/nvmf/stubs.o 00:48:32.388 CC lib/nvmf/mdns_server.o 00:48:32.388 SYMLINK libspdk_scsi.so 00:48:32.388 CC lib/nvmf/auth.o 00:48:32.388 CC lib/ftl/ftl_l2p_flat.o 00:48:32.388 CC lib/ftl/ftl_nv_cache.o 00:48:32.388 CC lib/ftl/ftl_band.o 00:48:32.646 CC lib/iscsi/conn.o 00:48:32.646 CC lib/iscsi/init_grp.o 00:48:32.646 CC lib/iscsi/iscsi.o 00:48:32.646 CC lib/vhost/vhost.o 00:48:32.905 CC lib/vhost/vhost_rpc.o 00:48:32.905 CC lib/vhost/vhost_scsi.o 00:48:32.905 CC lib/iscsi/md5.o 00:48:32.905 CC lib/vhost/vhost_blk.o 00:48:33.163 CC lib/vhost/rte_vhost_user.o 00:48:33.163 CC lib/ftl/ftl_band_ops.o 00:48:33.163 CC lib/ftl/ftl_writer.o 00:48:33.163 LIB libspdk_nvmf.a 00:48:33.163 CC lib/iscsi/param.o 00:48:33.163 SO libspdk_nvmf.so.18.1 00:48:33.421 SYMLINK libspdk_nvmf.so 00:48:33.421 CC lib/ftl/ftl_rq.o 00:48:33.421 CC lib/ftl/ftl_reloc.o 00:48:33.421 CC lib/ftl/ftl_l2p_cache.o 00:48:33.421 CC lib/iscsi/portal_grp.o 00:48:33.421 CC lib/iscsi/tgt_node.o 00:48:33.678 CC lib/iscsi/iscsi_subsystem.o 00:48:33.678 CC lib/iscsi/iscsi_rpc.o 00:48:33.678 CC lib/iscsi/task.o 00:48:33.678 CC lib/ftl/ftl_p2l.o 00:48:33.678 CC lib/ftl/mngt/ftl_mngt.o 00:48:33.935 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:48:33.936 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:48:33.936 CC lib/ftl/mngt/ftl_mngt_startup.o 00:48:33.936 CC lib/ftl/mngt/ftl_mngt_md.o 00:48:33.936 CC lib/ftl/mngt/ftl_mngt_misc.o 00:48:33.936 LIB libspdk_vhost.a 00:48:34.191 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:48:34.191 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:48:34.191 SO libspdk_vhost.so.8.0 00:48:34.191 CC lib/ftl/mngt/ftl_mngt_band.o 00:48:34.191 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:48:34.191 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:48:34.191 SYMLINK libspdk_vhost.so 00:48:34.191 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:48:34.191 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:48:34.191 LIB libspdk_iscsi.a 00:48:34.191 SO libspdk_iscsi.so.8.0 00:48:34.191 CC lib/ftl/utils/ftl_conf.o 00:48:34.191 CC lib/ftl/utils/ftl_md.o 00:48:34.191 CC lib/ftl/utils/ftl_mempool.o 00:48:34.191 CC lib/ftl/utils/ftl_bitmap.o 00:48:34.448 CC lib/ftl/utils/ftl_property.o 00:48:34.448 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:48:34.448 SYMLINK libspdk_iscsi.so 00:48:34.448 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:48:34.448 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:48:34.448 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:48:34.448 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:48:34.448 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:48:34.448 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:48:34.706 CC lib/ftl/upgrade/ftl_sb_v3.o 00:48:34.706 CC lib/ftl/upgrade/ftl_sb_v5.o 00:48:34.706 CC lib/ftl/nvc/ftl_nvc_dev.o 00:48:34.706 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:48:34.706 CC lib/ftl/base/ftl_base_dev.o 00:48:34.706 CC lib/ftl/base/ftl_base_bdev.o 00:48:34.964 LIB libspdk_ftl.a 00:48:34.964 SO libspdk_ftl.so.9.0 00:48:35.220 SYMLINK libspdk_ftl.so 00:48:35.478 CC module/env_dpdk/env_dpdk_rpc.o 00:48:35.478 CC 
module/keyring/linux/keyring.o 00:48:35.478 CC module/sock/posix/posix.o 00:48:35.735 CC module/scheduler/gscheduler/gscheduler.o 00:48:35.736 CC module/keyring/file/keyring.o 00:48:35.736 CC module/accel/ioat/accel_ioat.o 00:48:35.736 CC module/scheduler/dynamic/scheduler_dynamic.o 00:48:35.736 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:48:35.736 CC module/accel/error/accel_error.o 00:48:35.736 CC module/blob/bdev/blob_bdev.o 00:48:35.736 LIB libspdk_env_dpdk_rpc.a 00:48:35.736 SO libspdk_env_dpdk_rpc.so.6.0 00:48:35.736 LIB libspdk_scheduler_gscheduler.a 00:48:35.736 CC module/keyring/linux/keyring_rpc.o 00:48:35.736 CC module/keyring/file/keyring_rpc.o 00:48:35.736 SO libspdk_scheduler_gscheduler.so.4.0 00:48:35.736 LIB libspdk_scheduler_dpdk_governor.a 00:48:35.736 CC module/accel/ioat/accel_ioat_rpc.o 00:48:35.736 SYMLINK libspdk_env_dpdk_rpc.so 00:48:35.736 CC module/accel/error/accel_error_rpc.o 00:48:35.736 SO libspdk_scheduler_dpdk_governor.so.4.0 00:48:35.736 SYMLINK libspdk_scheduler_gscheduler.so 00:48:35.736 LIB libspdk_scheduler_dynamic.a 00:48:35.994 SO libspdk_scheduler_dynamic.so.4.0 00:48:35.994 SYMLINK libspdk_scheduler_dpdk_governor.so 00:48:35.994 SYMLINK libspdk_scheduler_dynamic.so 00:48:35.994 LIB libspdk_keyring_linux.a 00:48:35.994 SO libspdk_keyring_linux.so.1.0 00:48:35.994 LIB libspdk_accel_ioat.a 00:48:35.994 LIB libspdk_blob_bdev.a 00:48:35.994 LIB libspdk_keyring_file.a 00:48:35.994 LIB libspdk_accel_error.a 00:48:35.994 SO libspdk_blob_bdev.so.11.0 00:48:35.994 SO libspdk_accel_ioat.so.6.0 00:48:35.994 SYMLINK libspdk_keyring_linux.so 00:48:35.994 SO libspdk_keyring_file.so.1.0 00:48:35.994 SO libspdk_accel_error.so.2.0 00:48:35.994 SYMLINK libspdk_blob_bdev.so 00:48:35.994 SYMLINK libspdk_accel_ioat.so 00:48:35.994 SYMLINK libspdk_keyring_file.so 00:48:35.994 SYMLINK libspdk_accel_error.so 00:48:36.251 CC module/bdev/lvol/vbdev_lvol.o 00:48:36.251 CC module/bdev/gpt/gpt.o 00:48:36.251 CC module/bdev/passthru/vbdev_passthru.o 00:48:36.251 CC module/blobfs/bdev/blobfs_bdev.o 00:48:36.251 CC module/bdev/error/vbdev_error.o 00:48:36.252 CC module/bdev/null/bdev_null.o 00:48:36.252 CC module/bdev/malloc/bdev_malloc.o 00:48:36.252 CC module/bdev/nvme/bdev_nvme.o 00:48:36.252 CC module/bdev/delay/vbdev_delay.o 00:48:36.252 LIB libspdk_sock_posix.a 00:48:36.510 SO libspdk_sock_posix.so.6.0 00:48:36.511 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:48:36.511 SYMLINK libspdk_sock_posix.so 00:48:36.511 CC module/bdev/nvme/bdev_nvme_rpc.o 00:48:36.511 CC module/bdev/gpt/vbdev_gpt.o 00:48:36.768 CC module/bdev/null/bdev_null_rpc.o 00:48:36.768 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:48:36.768 CC module/bdev/error/vbdev_error_rpc.o 00:48:36.768 CC module/bdev/malloc/bdev_malloc_rpc.o 00:48:36.768 LIB libspdk_blobfs_bdev.a 00:48:36.768 SO libspdk_blobfs_bdev.so.6.0 00:48:36.768 CC module/bdev/delay/vbdev_delay_rpc.o 00:48:36.768 SYMLINK libspdk_blobfs_bdev.so 00:48:36.768 CC module/bdev/nvme/nvme_rpc.o 00:48:36.768 LIB libspdk_bdev_null.a 00:48:36.768 LIB libspdk_bdev_malloc.a 00:48:36.768 SO libspdk_bdev_null.so.6.0 00:48:36.768 LIB libspdk_bdev_gpt.a 00:48:36.768 LIB libspdk_bdev_error.a 00:48:36.768 SO libspdk_bdev_malloc.so.6.0 00:48:37.026 SO libspdk_bdev_error.so.6.0 00:48:37.026 SO libspdk_bdev_gpt.so.6.0 00:48:37.026 SYMLINK libspdk_bdev_null.so 00:48:37.026 LIB libspdk_bdev_passthru.a 00:48:37.026 SYMLINK libspdk_bdev_malloc.so 00:48:37.026 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:48:37.026 LIB libspdk_bdev_delay.a 00:48:37.026 SYMLINK 
libspdk_bdev_gpt.so 00:48:37.026 SYMLINK libspdk_bdev_error.so 00:48:37.026 CC module/bdev/nvme/bdev_mdns_client.o 00:48:37.026 SO libspdk_bdev_passthru.so.6.0 00:48:37.026 CC module/bdev/nvme/vbdev_opal.o 00:48:37.026 SO libspdk_bdev_delay.so.6.0 00:48:37.026 SYMLINK libspdk_bdev_passthru.so 00:48:37.026 CC module/bdev/nvme/vbdev_opal_rpc.o 00:48:37.026 SYMLINK libspdk_bdev_delay.so 00:48:37.026 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:48:37.284 CC module/bdev/raid/bdev_raid.o 00:48:37.284 CC module/bdev/split/vbdev_split.o 00:48:37.284 CC module/bdev/split/vbdev_split_rpc.o 00:48:37.284 CC module/bdev/zone_block/vbdev_zone_block.o 00:48:37.284 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:48:37.284 CC module/bdev/raid/bdev_raid_rpc.o 00:48:37.284 CC module/bdev/aio/bdev_aio.o 00:48:37.284 CC module/bdev/aio/bdev_aio_rpc.o 00:48:37.543 LIB libspdk_bdev_lvol.a 00:48:37.543 SO libspdk_bdev_lvol.so.6.0 00:48:37.543 LIB libspdk_bdev_split.a 00:48:37.543 CC module/bdev/ftl/bdev_ftl.o 00:48:37.543 CC module/bdev/raid/bdev_raid_sb.o 00:48:37.543 SO libspdk_bdev_split.so.6.0 00:48:37.543 SYMLINK libspdk_bdev_lvol.so 00:48:37.543 CC module/bdev/ftl/bdev_ftl_rpc.o 00:48:37.543 SYMLINK libspdk_bdev_split.so 00:48:37.543 CC module/bdev/raid/raid0.o 00:48:37.543 CC module/bdev/raid/raid1.o 00:48:37.543 LIB libspdk_bdev_zone_block.a 00:48:37.543 SO libspdk_bdev_zone_block.so.6.0 00:48:37.801 LIB libspdk_bdev_aio.a 00:48:37.801 SO libspdk_bdev_aio.so.6.0 00:48:37.801 SYMLINK libspdk_bdev_zone_block.so 00:48:37.801 CC module/bdev/raid/concat.o 00:48:37.801 CC module/bdev/virtio/bdev_virtio_scsi.o 00:48:37.801 SYMLINK libspdk_bdev_aio.so 00:48:37.801 CC module/bdev/virtio/bdev_virtio_blk.o 00:48:37.801 CC module/bdev/virtio/bdev_virtio_rpc.o 00:48:37.801 LIB libspdk_bdev_ftl.a 00:48:37.801 SO libspdk_bdev_ftl.so.6.0 00:48:38.059 SYMLINK libspdk_bdev_ftl.so 00:48:38.059 LIB libspdk_bdev_raid.a 00:48:38.317 SO libspdk_bdev_raid.so.6.0 00:48:38.317 SYMLINK libspdk_bdev_raid.so 00:48:38.317 LIB libspdk_bdev_virtio.a 00:48:38.317 SO libspdk_bdev_virtio.so.6.0 00:48:38.317 SYMLINK libspdk_bdev_virtio.so 00:48:38.575 LIB libspdk_bdev_nvme.a 00:48:38.575 SO libspdk_bdev_nvme.so.7.0 00:48:38.833 SYMLINK libspdk_bdev_nvme.so 00:48:39.437 CC module/event/subsystems/scheduler/scheduler.o 00:48:39.437 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:48:39.437 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:48:39.437 CC module/event/subsystems/iobuf/iobuf.o 00:48:39.437 CC module/event/subsystems/keyring/keyring.o 00:48:39.437 CC module/event/subsystems/sock/sock.o 00:48:39.437 CC module/event/subsystems/vmd/vmd.o 00:48:39.437 CC module/event/subsystems/vmd/vmd_rpc.o 00:48:39.437 LIB libspdk_event_keyring.a 00:48:39.437 LIB libspdk_event_vhost_blk.a 00:48:39.437 LIB libspdk_event_scheduler.a 00:48:39.437 SO libspdk_event_vhost_blk.so.3.0 00:48:39.437 SO libspdk_event_scheduler.so.4.0 00:48:39.437 SO libspdk_event_keyring.so.1.0 00:48:39.437 LIB libspdk_event_vmd.a 00:48:39.437 SYMLINK libspdk_event_scheduler.so 00:48:39.437 SYMLINK libspdk_event_vhost_blk.so 00:48:39.437 LIB libspdk_event_sock.a 00:48:39.437 SYMLINK libspdk_event_keyring.so 00:48:39.437 LIB libspdk_event_iobuf.a 00:48:39.437 SO libspdk_event_vmd.so.6.0 00:48:39.437 SO libspdk_event_sock.so.5.0 00:48:39.437 SO libspdk_event_iobuf.so.3.0 00:48:39.696 SYMLINK libspdk_event_vmd.so 00:48:39.696 SYMLINK libspdk_event_sock.so 00:48:39.696 SYMLINK libspdk_event_iobuf.so 00:48:39.954 CC module/event/subsystems/accel/accel.o 00:48:40.211 LIB 
libspdk_event_accel.a 00:48:40.211 SO libspdk_event_accel.so.6.0 00:48:40.211 SYMLINK libspdk_event_accel.so 00:48:40.469 CC module/event/subsystems/bdev/bdev.o 00:48:40.727 LIB libspdk_event_bdev.a 00:48:40.727 SO libspdk_event_bdev.so.6.0 00:48:40.727 SYMLINK libspdk_event_bdev.so 00:48:40.985 CC module/event/subsystems/scsi/scsi.o 00:48:40.985 CC module/event/subsystems/nbd/nbd.o 00:48:40.985 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:48:40.985 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:48:41.243 LIB libspdk_event_nbd.a 00:48:41.243 SO libspdk_event_nbd.so.6.0 00:48:41.243 LIB libspdk_event_scsi.a 00:48:41.243 SYMLINK libspdk_event_nbd.so 00:48:41.244 SO libspdk_event_scsi.so.6.0 00:48:41.244 SYMLINK libspdk_event_scsi.so 00:48:41.501 LIB libspdk_event_nvmf.a 00:48:41.501 SO libspdk_event_nvmf.so.6.0 00:48:41.501 SYMLINK libspdk_event_nvmf.so 00:48:41.759 CC module/event/subsystems/iscsi/iscsi.o 00:48:41.759 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:48:41.759 LIB libspdk_event_vhost_scsi.a 00:48:41.759 SO libspdk_event_vhost_scsi.so.3.0 00:48:41.759 LIB libspdk_event_iscsi.a 00:48:41.759 SO libspdk_event_iscsi.so.6.0 00:48:41.759 SYMLINK libspdk_event_vhost_scsi.so 00:48:42.017 SYMLINK libspdk_event_iscsi.so 00:48:42.017 SO libspdk.so.6.0 00:48:42.017 SYMLINK libspdk.so 00:48:42.274 make[1]: Nothing to be done for 'all'. 00:48:42.274 CXX app/trace/trace.o 00:48:42.532 CC examples/nvme/hello_world/hello_world.o 00:48:42.532 CC examples/ioat/perf/perf.o 00:48:42.532 CC examples/accel/perf/accel_perf.o 00:48:42.532 CC examples/vmd/lsvmd/lsvmd.o 00:48:42.532 CC examples/util/zipf/zipf.o 00:48:42.532 CC examples/bdev/hello_world/hello_bdev.o 00:48:42.532 CC examples/blob/hello_world/hello_blob.o 00:48:42.532 CC examples/nvmf/nvmf/nvmf.o 00:48:42.532 CC examples/sock/hello_world/hello_sock.o 00:48:42.791 LINK lsvmd 00:48:42.791 LINK zipf 00:48:42.791 LINK hello_world 00:48:42.792 LINK ioat_perf 00:48:42.792 LINK hello_bdev 00:48:42.792 LINK hello_blob 00:48:42.792 LINK hello_sock 00:48:42.792 LINK spdk_trace 00:48:42.792 LINK nvmf 00:48:42.792 CC examples/vmd/led/led.o 00:48:43.051 CC examples/ioat/verify/verify.o 00:48:43.051 CC examples/blob/cli/blobcli.o 00:48:43.051 CC examples/nvme/reconnect/reconnect.o 00:48:43.051 CC app/trace_record/trace_record.o 00:48:43.051 LINK accel_perf 00:48:43.051 CC examples/bdev/bdevperf/bdevperf.o 00:48:43.051 LINK led 00:48:43.051 CC app/nvmf_tgt/nvmf_main.o 00:48:43.309 LINK verify 00:48:43.309 CC examples/interrupt_tgt/interrupt_tgt.o 00:48:43.309 LINK nvmf_tgt 00:48:43.309 CC app/iscsi_tgt/iscsi_tgt.o 00:48:43.309 LINK spdk_trace_record 00:48:43.309 CC examples/thread/thread/thread_ex.o 00:48:43.309 LINK reconnect 00:48:43.569 CC app/spdk_lspci/spdk_lspci.o 00:48:43.569 CC app/spdk_tgt/spdk_tgt.o 00:48:43.569 LINK interrupt_tgt 00:48:43.569 LINK iscsi_tgt 00:48:43.569 LINK blobcli 00:48:43.569 LINK spdk_lspci 00:48:43.569 CC examples/nvme/nvme_manage/nvme_manage.o 00:48:43.569 CC app/spdk_nvme_perf/perf.o 00:48:43.569 CC examples/nvme/arbitration/arbitration.o 00:48:43.828 LINK spdk_tgt 00:48:43.828 LINK thread 00:48:43.828 CC app/spdk_nvme_identify/identify.o 00:48:43.828 CC examples/nvme/hotplug/hotplug.o 00:48:43.828 CC app/spdk_nvme_discover/discovery_aer.o 00:48:43.828 CC examples/nvme/cmb_copy/cmb_copy.o 00:48:44.087 CC app/spdk_top/spdk_top.o 00:48:44.087 LINK bdevperf 00:48:44.087 CC examples/nvme/abort/abort.o 00:48:44.087 LINK arbitration 00:48:44.087 LINK cmb_copy 00:48:44.087 LINK spdk_nvme_discover 00:48:44.087 LINK 
hotplug 00:48:44.346 CC app/vhost/vhost.o 00:48:44.346 CC app/spdk_dd/spdk_dd.o 00:48:44.346 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:48:44.346 LINK nvme_manage 00:48:44.605 LINK abort 00:48:44.605 LINK vhost 00:48:44.605 LINK pmr_persistence 00:48:44.886 LINK spdk_dd 00:48:44.886 LINK spdk_nvme_perf 00:48:45.166 LINK spdk_top 00:48:45.424 LINK spdk_nvme_identify 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/assert.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/accel.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/barrier.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/accel_module.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/bdev.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/base64.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/bdev_module.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/bdev_zone.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/bit_array.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/blob.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/bit_pool.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/blobfs_bdev.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/blobfs.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/blob_bdev.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/conf.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/config.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/cpuset.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/crc16.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/crc64.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/crc32.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/dif.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/dma.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/env.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/endian.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/env_dpdk.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/event.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/fd_group.h 00:48:46.013 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/file.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/fd.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/ftl.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/gpt_spec.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/hexlify.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/histogram_data.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/idxd.h 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/idxd_spec.h 00:48:46.013 cp /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/scripts/rpc.py /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/build/bin/spdk_rpc 00:48:46.013 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/ioat.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/init.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/iscsi_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/ioat_spec.h 00:48:46.014 chmod +x /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/build/bin/spdk_rpc 00:48:46.014 cp /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/scripts/spdkcli.py /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/build/bin/spdk_cli 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/json.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_rpc 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/keyring.h 00:48:46.014 chmod +x /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/build/bin/spdk_cli 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/jsonrpc.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/likely.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/keyring_module.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/log.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_cli 00:48:46.014 patchelf: not an ELF executable 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/mmio.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/lvol.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nbd.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/memory.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvme.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvme_intel.h 00:48:46.014 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/notify.h 00:48:46.014 patchelf: not an ELF executable 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvme_ocssd.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvme_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvme_ocssd_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvmf_cmd.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvmf.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvme_zns.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvmf_fc_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvmf_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/nvmf_transport.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/opal.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/opal_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/pipe.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/pci_ids.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/queue.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/queue_extras.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/rpc.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/reduce.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/scheduler.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/scsi.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/sock.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/stdinc.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/scsi_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/thread.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/string.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/trace.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/trace_parser.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/ublk.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/tree.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/util.h 00:48:46.014 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/uuid.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/version.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/vfio_user_pci.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/vfio_user_spec.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/vfu_target.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/vhost.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/xor.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/vmd.h 00:48:46.014 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/spdk/zipf.h 00:48:46.014 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 00:48:46.271 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 00:48:46.271 Processing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/python 00:48:46.271 DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default. 00:48:46.271 pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555. 00:48:46.839 Using legacy 'setup.py install' for spdk, since package 'wheel' is not installed. 00:48:47.097 Installing collected packages: spdk 00:48:47.097 Running setup.py install for spdk: started 00:48:47.356 Running setup.py install for spdk: finished with status 'done' 00:48:47.356 Successfully installed spdk-24.9rc0 00:48:47.614 rm -rf /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/python/spdk.egg-info 00:48:48.993 The Meson build system 00:48:48.993 Version: 1.4.0 00:48:48.993 Source dir: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk 00:48:48.993 Build dir: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp 00:48:48.993 Build type: native build 00:48:48.993 Program cat found: YES (/bin/cat) 00:48:48.993 Project name: DPDK 00:48:48.993 Project version: 24.03.0 00:48:48.993 C compiler for the host machine: cc (gcc 11.4.1 "cc (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)") 00:48:48.993 C linker for the host machine: cc ld.bfd 2.35.2-42 00:48:48.993 Host machine cpu family: x86_64 00:48:48.993 Host machine cpu: x86_64 00:48:48.993 Message: ## Building in Developer Mode ## 00:48:48.993 Program pkg-config found: YES (/bin/pkg-config) 00:48:48.993 Program check-symbols.sh found: YES (/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/buildtools/check-symbols.sh) 00:48:48.993 Program options-ibverbs-static.sh found: YES (/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/buildtools/options-ibverbs-static.sh) 00:48:48.993 Program python3 found: YES (/usr/bin/python3) 00:48:48.993 Program cat found: YES (/bin/cat) 00:48:48.993 Compiler for C supports arguments -march=native: YES (cached) 00:48:48.994 Checking for size of "void *" : 8 (cached) 00:48:48.994 Checking for size of "void *" : 8 (cached) 00:48:48.994 Compiler for C supports link arguments -Wl,--undefined-version: NO (cached) 00:48:48.994 Library m found: YES 00:48:48.994 
Library numa found: YES 00:48:48.994 Has header "numaif.h" : YES (cached) 00:48:48.994 Library fdt found: NO 00:48:48.994 Library execinfo found: NO 00:48:48.994 Has header "execinfo.h" : YES (cached) 00:48:48.994 Found pkg-config: YES (/bin/pkg-config) 1.7.3 00:48:48.994 Run-time dependency libarchive found: NO (tried pkgconfig) 00:48:48.994 Run-time dependency libbsd found: NO (tried pkgconfig) 00:48:48.994 Run-time dependency jansson found: NO (tried pkgconfig) 00:48:48.994 Dependency openssl found: YES 3.0.7 (cached) 00:48:48.994 Run-time dependency libpcap found: NO (tried pkgconfig) 00:48:48.994 Library pcap found: NO 00:48:48.994 Compiler for C supports arguments -Wcast-qual: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wdeprecated: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wformat: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wformat-nonliteral: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wformat-security: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wmissing-declarations: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wmissing-prototypes: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wnested-externs: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wold-style-definition: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wpointer-arith: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wsign-compare: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wstrict-prototypes: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wundef: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wwrite-strings: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wno-address-of-packed-member: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wno-packed-not-aligned: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wno-missing-field-initializers: YES (cached) 00:48:48.994 Compiler for C supports arguments -Wno-zero-length-bounds: YES (cached) 00:48:48.994 Program objdump found: YES (/bin/objdump) 00:48:48.994 Compiler for C supports arguments -mavx512f: YES (cached) 00:48:48.994 Checking if "AVX512 checking" compiles: YES (cached) 00:48:48.994 Fetching value of define "__SSE4_2__" : 1 (cached) 00:48:48.994 Fetching value of define "__AES__" : 1 (cached) 00:48:48.994 Fetching value of define "__AVX__" : 1 (cached) 00:48:48.994 Fetching value of define "__AVX2__" : 1 (cached) 00:48:48.994 Fetching value of define "__AVX512BW__" : 1 (cached) 00:48:48.994 Fetching value of define "__AVX512CD__" : 1 (cached) 00:48:48.994 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:48:48.994 Fetching value of define "__AVX512F__" : 1 (cached) 00:48:48.994 Fetching value of define "__AVX512VL__" : 1 (cached) 00:48:48.994 Fetching value of define "__PCLMUL__" : 1 (cached) 00:48:48.994 Fetching value of define "__RDRND__" : 1 (cached) 00:48:48.994 Fetching value of define "__RDSEED__" : 1 (cached) 00:48:48.994 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:48:48.994 Fetching value of define "__znver1__" : (undefined) (cached) 00:48:48.994 Fetching value of define "__znver2__" : (undefined) (cached) 00:48:48.994 Fetching value of define "__znver3__" : (undefined) (cached) 00:48:48.994 Fetching value of define "__znver4__" : (undefined) (cached) 00:48:48.994 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:48:48.994 Message: lib/log: Defining dependency "log" 
00:48:48.994 Message: lib/kvargs: Defining dependency "kvargs"
00:48:48.994 Message: lib/telemetry: Defining dependency "telemetry"
00:48:48.994 Checking for function "getentropy" : NO (cached)
00:48:48.994 Message: lib/eal: Defining dependency "eal"
00:48:48.994 Message: lib/ring: Defining dependency "ring"
00:48:48.994 Message: lib/rcu: Defining dependency "rcu"
00:48:48.994 Message: lib/mempool: Defining dependency "mempool"
00:48:48.994 Message: lib/mbuf: Defining dependency "mbuf"
00:48:48.994 Fetching value of define "__PCLMUL__" : 1 (cached)
00:48:48.994 Fetching value of define "__AVX512F__" : 1 (cached)
00:48:48.994 Fetching value of define "__AVX512BW__" : 1 (cached)
00:48:48.994 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:48:48.994 Fetching value of define "__AVX512VL__" : 1 (cached)
00:48:48.994 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:48:48.994 Compiler for C supports arguments -mpclmul: YES (cached)
00:48:48.994 Compiler for C supports arguments -maes: YES (cached)
00:48:48.994 Compiler for C supports arguments -mavx512f: YES (cached)
00:48:48.994 Compiler for C supports arguments -mavx512bw: YES (cached)
00:48:48.994 Compiler for C supports arguments -mavx512dq: YES (cached)
00:48:48.994 Compiler for C supports arguments -mavx512vl: YES (cached)
00:48:48.994 Compiler for C supports arguments -mvpclmulqdq: YES (cached)
00:48:48.994 Compiler for C supports arguments -mavx2: YES (cached)
00:48:48.994 Compiler for C supports arguments -mavx: YES (cached)
00:48:48.994 Message: lib/net: Defining dependency "net"
00:48:48.994 Message: lib/meter: Defining dependency "meter"
00:48:48.994 Message: lib/ethdev: Defining dependency "ethdev"
00:48:48.994 Message: lib/pci: Defining dependency "pci"
00:48:48.994 Message: lib/cmdline: Defining dependency "cmdline"
00:48:48.994 Message: lib/hash: Defining dependency "hash"
00:48:48.994 Message: lib/timer: Defining dependency "timer"
00:48:48.994 Message: lib/compressdev: Defining dependency "compressdev"
00:48:48.994 Message: lib/cryptodev: Defining dependency "cryptodev"
00:48:48.994 Message: lib/dmadev: Defining dependency "dmadev"
00:48:48.994 Compiler for C supports arguments -Wno-cast-qual: YES (cached)
00:48:48.994 Message: lib/power: Defining dependency "power"
00:48:48.994 Message: lib/reorder: Defining dependency "reorder"
00:48:48.994 Message: lib/security: Defining dependency "security"
00:48:48.994 Has header "linux/userfaultfd.h" : YES (cached)
00:48:48.994 Has header "linux/vduse.h" : NO (cached)
00:48:48.994 Message: lib/vhost: Defining dependency "vhost"
00:48:48.994 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:48:48.994 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:48:48.994 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:48:48.994 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:48:48.994 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:48:48.994 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:48:48.994 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:48:48.994 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:48:48.994 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:48:48.994 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:48:48.994 Program doxygen found: YES (/bin/doxygen)
00:48:48.994 Configuring doxy-api-html.conf using configuration
00:48:48.994 Configuring doxy-api-man.conf using configuration
00:48:48.994 Program mandb found: YES (/bin/mandb)
00:48:48.994 Program sphinx-build found: NO
00:48:48.994 Configuring rte_build_config.h using configuration
00:48:48.994 Message:
00:48:48.994 =================
00:48:48.994 Applications Enabled
00:48:48.994 =================
00:48:48.994
00:48:48.994 apps:
00:48:48.994
00:48:48.994
00:48:48.994 Message:
00:48:48.994 =================
00:48:48.994 Libraries Enabled
00:48:48.994 =================
00:48:48.994
00:48:48.994 libs:
00:48:48.994 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:48:48.994 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:48:48.994 cryptodev, dmadev, power, reorder, security, vhost,
00:48:48.994
00:48:48.994 Message:
00:48:48.994 ===============
00:48:48.994 Drivers Enabled
00:48:48.994 ===============
00:48:48.994
00:48:48.994 common:
00:48:48.994
00:48:48.994 bus:
00:48:48.994 pci, vdev,
00:48:48.994 mempool:
00:48:48.994 ring,
00:48:48.994 dma:
00:48:48.994
00:48:48.994 net:
00:48:48.994
00:48:48.994 crypto:
00:48:48.994
00:48:48.994 compress:
00:48:48.994
00:48:48.994 vdpa:
00:48:48.994
00:48:48.994
00:48:48.994 Message:
00:48:48.994 =================
00:48:48.994 Content Skipped
00:48:48.994 =================
00:48:48.994
00:48:48.994 apps:
00:48:48.994 dumpcap: explicitly disabled via build config
00:48:48.994 graph: explicitly disabled via build config
00:48:48.994 pdump: explicitly disabled via build config
00:48:48.994 proc-info: explicitly disabled via build config
00:48:48.994 test-acl: explicitly disabled via build config
00:48:48.994 test-bbdev: explicitly disabled via build config
00:48:48.994 test-cmdline: explicitly disabled via build config
00:48:48.994 test-compress-perf: explicitly disabled via build config
00:48:48.994 test-crypto-perf: explicitly disabled via build config
00:48:48.994 test-dma-perf: explicitly disabled via build config
00:48:48.994 test-eventdev: explicitly disabled via build config
00:48:48.994 test-fib: explicitly disabled via build config
00:48:48.994 test-flow-perf: explicitly disabled via build config
00:48:48.994 test-gpudev: explicitly disabled via build config
00:48:48.994 test-mldev: explicitly disabled via build config
00:48:48.994 test-pipeline: explicitly disabled via build config
00:48:48.994 test-pmd: explicitly disabled via build config
00:48:48.994 test-regex: explicitly disabled via build config
00:48:48.994 test-sad: explicitly disabled via build config
00:48:48.994 test-security-perf: explicitly disabled via build config
00:48:48.994
00:48:48.994 libs:
00:48:48.994 argparse: explicitly disabled via build config
00:48:48.994 metrics: explicitly disabled via build config
00:48:48.994 acl: explicitly disabled via build config
00:48:48.994 bbdev: explicitly disabled via build config
00:48:48.994 bitratestats: explicitly disabled via build config
00:48:48.994 bpf: explicitly disabled via build config
00:48:48.994 cfgfile: explicitly disabled via build config
00:48:48.994 distributor: explicitly disabled via build config
00:48:48.994 efd: explicitly disabled via build config
00:48:48.994 eventdev: explicitly disabled via build config
00:48:48.994 dispatcher: explicitly disabled via build config
00:48:48.994 gpudev: explicitly disabled via build config
00:48:48.994 gro: explicitly disabled via build config
00:48:48.994 gso: explicitly disabled via build config
00:48:48.995 ip_frag: explicitly disabled via build config
00:48:48.995 jobstats: explicitly disabled via build config
00:48:48.995 latencystats: explicitly disabled via build config
00:48:48.995 lpm: explicitly disabled via build config
00:48:48.995 member: explicitly disabled via build config
00:48:48.995 pcapng: explicitly disabled via build config
00:48:48.995 rawdev: explicitly disabled via build config
00:48:48.995 regexdev: explicitly disabled via build config
00:48:48.995 mldev: explicitly disabled via build config
00:48:48.995 rib: explicitly disabled via build config
00:48:48.995 sched: explicitly disabled via build config
00:48:48.995 stack: explicitly disabled via build config
00:48:48.995 ipsec: explicitly disabled via build config
00:48:48.995 pdcp: explicitly disabled via build config
00:48:48.995 fib: explicitly disabled via build config
00:48:48.995 port: explicitly disabled via build config
00:48:48.995 pdump: explicitly disabled via build config
00:48:48.995 table: explicitly disabled via build config
00:48:48.995 pipeline: explicitly disabled via build config
00:48:48.995 graph: explicitly disabled via build config
00:48:48.995 node: explicitly disabled via build config
00:48:48.995
00:48:48.995 drivers:
00:48:48.995 common/cpt: not in enabled drivers build config
00:48:48.995 common/dpaax: not in enabled drivers build config
00:48:48.995 common/iavf: not in enabled drivers build config
00:48:48.995 common/idpf: not in enabled drivers build config
00:48:48.995 common/ionic: not in enabled drivers build config
00:48:48.995 common/mvep: not in enabled drivers build config
00:48:48.995 common/octeontx: not in enabled drivers build config
00:48:48.995 bus/auxiliary: not in enabled drivers build config
00:48:48.995 bus/cdx: not in enabled drivers build config
00:48:48.995 bus/dpaa: not in enabled drivers build config
00:48:48.995 bus/fslmc: not in enabled drivers build config
00:48:48.995 bus/ifpga: not in enabled drivers build config
00:48:48.995 bus/platform: not in enabled drivers build config
00:48:48.995 bus/uacce: not in enabled drivers build config
00:48:48.995 bus/vmbus: not in enabled drivers build config
00:48:48.995 common/cnxk: not in enabled drivers build config
00:48:48.995 common/mlx5: not in enabled drivers build config
00:48:48.995 common/nfp: not in enabled drivers build config
00:48:48.995 common/nitrox: not in enabled drivers build config
00:48:48.995 common/qat: not in enabled drivers build config
00:48:48.995 common/sfc_efx: not in enabled drivers build config
00:48:48.995 mempool/bucket: not in enabled drivers build config
00:48:48.995 mempool/cnxk: not in enabled drivers build config
00:48:48.995 mempool/dpaa: not in enabled drivers build config
00:48:48.995 mempool/dpaa2: not in enabled drivers build config
00:48:48.995 mempool/octeontx: not in enabled drivers build config
00:48:48.995 mempool/stack: not in enabled drivers build config
00:48:48.995 dma/cnxk: not in enabled drivers build config
00:48:48.995 dma/dpaa: not in enabled drivers build config
00:48:48.995 dma/dpaa2: not in enabled drivers build config
00:48:48.995 dma/hisilicon: not in enabled drivers build config
00:48:48.995 dma/idxd: not in enabled drivers build config
00:48:48.995 dma/ioat: not in enabled drivers build config
00:48:48.995 dma/skeleton: not in enabled drivers build config
00:48:48.995 net/af_packet: not in enabled drivers build config
00:48:48.995 net/af_xdp: not in enabled drivers build config
00:48:48.995 net/ark: not in enabled drivers build config
00:48:48.995 net/atlantic: not in enabled drivers build config
00:48:48.995 net/avp: not in enabled drivers build config
00:48:48.995 net/axgbe: not in enabled drivers build config
00:48:48.995 net/bnx2x: not in enabled drivers build config
00:48:48.995 net/bnxt: not in enabled drivers build config
00:48:48.995 net/bonding: not in enabled drivers build config
00:48:48.995 net/cnxk: not in enabled drivers build config
00:48:48.995 net/cpfl: not in enabled drivers build config
00:48:48.995 net/cxgbe: not in enabled drivers build config
00:48:48.995 net/dpaa: not in enabled drivers build config
00:48:48.995 net/dpaa2: not in enabled drivers build config
00:48:48.995 net/e1000: not in enabled drivers build config
00:48:48.995 net/ena: not in enabled drivers build config
00:48:48.995 net/enetc: not in enabled drivers build config
00:48:48.995 net/enetfec: not in enabled drivers build config
00:48:48.995 net/enic: not in enabled drivers build config
00:48:48.995 net/failsafe: not in enabled drivers build config
00:48:48.995 net/fm10k: not in enabled drivers build config
00:48:48.995 net/gve: not in enabled drivers build config
00:48:48.995 net/hinic: not in enabled drivers build config
00:48:48.995 net/hns3: not in enabled drivers build config
00:48:48.995 net/i40e: not in enabled drivers build config
00:48:48.995 net/iavf: not in enabled drivers build config
00:48:48.995 net/ice: not in enabled drivers build config
00:48:48.995 net/idpf: not in enabled drivers build config
00:48:48.995 net/igc: not in enabled drivers build config
00:48:48.995 net/ionic: not in enabled drivers build config
00:48:48.995 net/ipn3ke: not in enabled drivers build config
00:48:48.995 net/ixgbe: not in enabled drivers build config
00:48:48.995 net/mana: not in enabled drivers build config
00:48:48.995 net/memif: not in enabled drivers build config
00:48:48.995 net/mlx4: not in enabled drivers build config
00:48:48.995 net/mlx5: not in enabled drivers build config
00:48:48.995 net/mvneta: not in enabled drivers build config
00:48:48.995 net/mvpp2: not in enabled drivers build config
00:48:48.995 net/netvsc: not in enabled drivers build config
00:48:48.995 net/nfb: not in enabled drivers build config
00:48:48.995 net/nfp: not in enabled drivers build config
00:48:48.995 net/ngbe: not in enabled drivers build config
00:48:48.995 net/null: not in enabled drivers build config
00:48:48.995 net/octeontx: not in enabled drivers build config
00:48:48.995 net/octeon_ep: not in enabled drivers build config
00:48:48.995 net/pcap: not in enabled drivers build config
00:48:48.995 net/pfe: not in enabled drivers build config
00:48:48.995 net/qede: not in enabled drivers build config
00:48:48.995 net/ring: not in enabled drivers build config
00:48:48.995 net/sfc: not in enabled drivers build config
00:48:48.995 net/softnic: not in enabled drivers build config
00:48:48.995 net/tap: not in enabled drivers build config
00:48:48.995 net/thunderx: not in enabled drivers build config
00:48:48.995 net/txgbe: not in enabled drivers build config
00:48:48.995 net/vdev_netvsc: not in enabled drivers build config
00:48:48.995 net/vhost: not in enabled drivers build config
00:48:48.995 net/virtio: not in enabled drivers build config
00:48:48.995 net/vmxnet3: not in enabled drivers build config
00:48:48.995 raw/*: missing internal dependency, "rawdev"
00:48:48.995 crypto/armv8: not in enabled drivers build config
00:48:48.995 crypto/bcmfs: not in enabled drivers build config
00:48:48.995 crypto/caam_jr: not in enabled drivers build config
00:48:48.995 crypto/ccp: not in enabled drivers build config
00:48:48.995 crypto/cnxk: not in enabled drivers build config
00:48:48.995 crypto/dpaa_sec: not in enabled drivers build config
00:48:48.995 crypto/dpaa2_sec: not in enabled drivers build config
00:48:48.995 crypto/ipsec_mb: not in enabled drivers build config
00:48:48.995 crypto/mlx5: not in enabled drivers build config
00:48:48.995 crypto/mvsam: not in enabled drivers build config
00:48:48.995 crypto/nitrox: not in enabled drivers build config
00:48:48.995 crypto/null: not in enabled drivers build config
00:48:48.995 crypto/octeontx: not in enabled drivers build config
00:48:48.995 crypto/openssl: not in enabled drivers build config
00:48:48.995 crypto/scheduler: not in enabled drivers build config
00:48:48.995 crypto/uadk: not in enabled drivers build config
00:48:48.995 crypto/virtio: not in enabled drivers build config
00:48:48.995 compress/isal: not in enabled drivers build config
00:48:48.995 compress/mlx5: not in enabled drivers build config
00:48:48.995 compress/nitrox: not in enabled drivers build config
00:48:48.995 compress/octeontx: not in enabled drivers build config
00:48:48.995 compress/zlib: not in enabled drivers build config
00:48:48.995 regex/*: missing internal dependency, "regexdev"
00:48:48.995 ml/*: missing internal dependency, "mldev"
00:48:48.995 vdpa/ifc: not in enabled drivers build config
00:48:48.995 vdpa/mlx5: not in enabled drivers build config
00:48:48.995 vdpa/nfp: not in enabled drivers build config
00:48:48.995 vdpa/sfc: not in enabled drivers build config
00:48:48.995 event/*: missing internal dependency, "eventdev"
00:48:48.995 baseband/*: missing internal dependency, "bbdev"
00:48:48.995 gpu/*: missing internal dependency, "gpudev"
00:48:48.995
00:48:48.995
00:48:49.563 Cleaning... 0 files.
00:48:49.563 Build targets in project: 85
00:48:49.563
00:48:49.563 DPDK 24.03.0
00:48:49.563
00:48:49.563 User defined options
00:48:49.563 default_library : shared
00:48:49.563 libdir : lib
00:48:49.563 prefix : /usr/local
00:48:49.563 c_args : -Wno-stringop-overflow -fcommon -fPIC -Wno-error
00:48:49.563 c_link_args :
00:48:49.563 cpu_instruction_set: native
00:48:49.563 disable_apps : dumpcap,test-eventdev,test,graph,test-fib,pdump,test-flow-perf,proc-info,test-gpudev,test-acl,test-mldev,test-bbdev,test-pipeline,test-cmdline,test-pmd,test-compress-perf,test-regex,test-crypto-perf,test-sad,test-dma-perf,test-security-perf
00:48:49.563 disable_libs : bitratestats,efd,ipsec,metrics,table,bpf,jobstats,mldev,rawdev,cfgfile,eventdev,fib,latencystats,node,regexdev,gpudev,pcapng,graph,lpm,rib,dispatcher,gro,pdcp,acl,distributor,gso,member,pdump,sched,argparse,pipeline,bbdev,ip_frag,port,stack
00:48:49.563 enable_docs : false
00:48:49.563 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring
00:48:49.563 enable_kmods : false
00:48:49.563 tests : false
00:48:49.563
00:48:49.563 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:48:50.139 Installing subdir /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples
00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/cmdline/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/cmdline
00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/cmdline/commands.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/cmdline
00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/cmdline/commands.h to
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/cmdline/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/cmdline/parse_obj_list.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/cmdline/parse_obj_list.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/cmdline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/action.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/action.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/cli.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/cli.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/common.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/conn.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/conn.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/cryptodev.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/cryptodev.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/link.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/link.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing 
/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/mempool.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/mempool.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/parser.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/parser.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/pipeline.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/pipeline.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/swq.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/swq.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/tap.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/tap.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/thread.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/thread.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/tmgr.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/tmgr.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline 00:48:50.139 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/firewall.cli to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/flow.cli to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/route.cli to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/rss.cli to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_pipeline/examples/tap.cli to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_pipeline/examples 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l2fwd/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l2fwd 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l2fwd/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l2fwd 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_meter/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_meter/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_meter/main.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_meter/rte_policer.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_meter/rte_policer.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_meter 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vhost_crypto/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vhost_crypto 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vhost_crypto/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vhost_crypto 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/common/pkt_group.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/common 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/common/altivec/port_group.h to 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/common/altivec 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/common/neon/port_group.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/common/neon 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/common/sse/port_group.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/common/sse 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_reassembly/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_reassembly 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ip_reassembly/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ip_reassembly 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l3fwd-graph/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l3fwd-graph 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l3fwd-graph/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l3fwd-graph 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/app_thread.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/args.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/cfg_file.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/cfg_file.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/cmdline.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/init.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/main.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/profile.cfg to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing 
/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/profile_ov.cfg to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/profile_pie.cfg to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/profile_red.cfg to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/qos_sched/stats.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/qos_sched 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_manager.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_manager.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_monitor.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/channel_monitor.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/oob_monitor.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/parse.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/parse.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/power_manager.c to 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/power_manager.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/vm_power_cli.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/vm_power_cli.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/parse.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/parse.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vm_power_manager/guest_cli 00:48:50.140 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/distributor/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/distributor 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/distributor/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/distributor 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ep0.cfg to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ep1.cfg to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/esp.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 
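The "User defined options" summary above records the complete recipe for this DPDK configuration. A hedged reconstruction of the equivalent meson invocation (option names follow standard Meson/DPDK usage; the disable_apps/disable_libs values are abbreviated with '...' here because they would repeat the summary verbatim, and c_link_args is empty in the summary so it is omitted):

    meson setup build-tmp \
        --prefix=/usr/local --libdir=lib --default-library=shared \
        -Dc_args='-Wno-stringop-overflow -fcommon -fPIC -Wno-error' \
        -Dcpu_instruction_set=native \
        -Ddisable_apps='dumpcap,test-eventdev,...' \
        -Ddisable_libs='bitratestats,efd,ipsec,...' \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring \
        -Denable_docs=false -Denable_kmods=false -Dtests=false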
00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/esp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/event_helper.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/event_helper.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/flow.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/flow.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipip.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_neon.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_process.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_worker.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/ipsec_worker.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/parser.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing 
/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/parser.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/rt.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sa.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sad.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sad.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sp4.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/sp6.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/linux_test.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/load_env.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/pkttest.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/pkttest.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/run_test.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing 
/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipsec-secgw/test 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l3fwd-power/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l3fwd-power/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l3fwd-power/main.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l3fwd-power/perf_core.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/l3fwd-power/perf_core.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/l3fwd-power 00:48:50.141 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/rxtx_callbacks/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/rxtx_callbacks 00:48:50.142 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/rxtx_callbacks/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/rxtx_callbacks 00:48:50.142 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vmdq/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vmdq 00:48:50.142 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/vmdq/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/vmdq 00:48:50.142 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/dma/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/dma 00:48:50.142 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/dma/dmafwd.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/dma 00:48:50.142 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipv4_multicast/Makefile to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipv4_multicast 00:48:50.142 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples/ipv4_multicast/main.c to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples/ipv4_multicast 
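Each example tree being copied here ships with a Makefile that builds against an installed DPDK through pkg-config rather than the in-tree build system. A minimal sketch of compiling one of them once this install lands, assuming libdpdk.pc ends up under the /usr/local prefix and libdir recorded in the options above (paths and the sample invocation are illustrative):

    export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig
    cd /usr/local/share/dpdk/examples/l2fwd
    make                              # shipped Makefile resolves flags via 'pkg-config --cflags --libs libdpdk'
    sudo ./build/l2fwd -l 0-1 -- -p 0x1 -q 1   # typical l2fwd run; NIC/hugepage setup not shown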
00:48:50.142 Installing example sources from /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/examples to the matching directories under /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/share/dpdk/examples:
00:48:50.142   l3fwd: Makefile, em_default_v4.cfg, em_default_v6.cfg, em_route_parse.c, l3fwd.h, l3fwd_acl.c, l3fwd_acl.h, l3fwd_acl_scalar.h, l3fwd_altivec.h, l3fwd_common.h, l3fwd_em.c, l3fwd_em.h, l3fwd_em_hlm.h, l3fwd_em_hlm_neon.h, l3fwd_em_hlm_sse.h, l3fwd_em_sequential.h, l3fwd_event.c, l3fwd_event.h, l3fwd_event_generic.c, l3fwd_event_internal_port.c, l3fwd_fib.c, l3fwd_lpm.c, l3fwd_lpm.h, l3fwd_lpm_altivec.h, l3fwd_lpm_neon.h, l3fwd_lpm_sse.h, l3fwd_neon.h, l3fwd_route.h, l3fwd_sse.h, lpm_default_v4.cfg, lpm_default_v6.cfg, lpm_route_parse.c, main.c
00:48:50.142   server_node_efd: Makefile; efd_node: Makefile, node.c; efd_server: Makefile, args.c, args.h, init.c, init.h, main.c; shared: common.h
00:48:50.142   vmdq_dcb: Makefile, main.c
00:48:50.142   ethtool: Makefile; ethtool-app: Makefile, ethapp.c, ethapp.h, main.c; lib: Makefile, rte_ethtool.c, rte_ethtool.h
00:48:50.142   l2fwd-cat: Makefile, cat.c, cat.h, l2fwd-cat.c
00:48:50.143   link_status_interrupt: Makefile, main.c
00:48:50.143   service_cores: Makefile, main.c
00:48:50.143   eventdev_pipeline: Makefile, main.c, pipeline_common.h, pipeline_worker_generic.c, pipeline_worker_tx.c
00:48:50.143   l2fwd-crypto: Makefile, main.c
00:48:50.143   multi_process: Makefile; client_server_mp: Makefile; client_server_mp/mp_client: Makefile, client.c; client_server_mp/mp_server: Makefile, args.c, args.h, init.c, init.h, main.c; client_server_mp/shared: common.h; hotplug_mp: Makefile, commands.c, commands.list, main.c; simple_mp: Makefile, commands.list, main.c, mp_commands.c, mp_commands.h; symmetric_mp: Makefile, main.c
00:48:50.143   skeleton: Makefile, basicfwd.c
00:48:50.143   fips_validation: Makefile, fips_dev_self_test.c, fips_dev_self_test.h, fips_validation.c, fips_validation.h, fips_validation_aes.c, fips_validation_ccm.c, fips_validation_cmac.c, fips_validation_ecdsa.c, fips_validation_gcm.c, fips_validation_hmac.c, fips_validation_rsa.c, fips_validation_sha.c, fips_validation_tdes.c, fips_validation_xts.c, main.c
00:48:50.143   l2fwd-event: Makefile, l2fwd_common.c, l2fwd_common.h, l2fwd_event.c, l2fwd_event.h, l2fwd_event_generic.c, l2fwd_event_internal_port.c, l2fwd_poll.c, l2fwd_poll.h, main.c
00:48:50.144   ntb: Makefile, commands.list, ntb_fwd.c
00:48:50.144   timer: Makefile, main.c
00:48:50.144   bbdev_app: Makefile, main.c
00:48:50.144   flow_filtering: Makefile, flow_blocks.c, main.c
00:48:50.144   l2fwd-jobstats: Makefile, main.c
00:48:50.144   packet_ordering: Makefile, main.c
00:48:50.144   vdpa: Makefile, commands.list, main.c, vdpa_blk_compact.h
00:48:50.144   bond: Makefile, commands.list, main.c
00:48:50.144   helloworld: Makefile, main.c
00:48:50.144   l2fwd-keepalive: Makefile, main.c, shm.c, shm.h; ka-agent: Makefile, main.c
00:48:50.144   pipeline: Makefile, cli.c, cli.h, conn.c, conn.h, main.c, obj.c, obj.h, thread.c, thread.h
00:48:50.144   pipeline/examples: ethdev.io, fib.cli, fib.spec, fib_nexthop_group_table.txt, fib_nexthop_table.txt, fib_routing_table.txt, hash_func.cli, hash_func.spec, ipsec.cli, ipsec.io, ipsec.spec, ipsec_sa.txt, ipv6_addr_swap.cli, ipv6_addr_swap.spec, l2fwd.cli, l2fwd.spec, l2fwd_macswp.cli, l2fwd_macswp.spec, l2fwd_macswp_pcap.cli, l2fwd_pcap.cli, learner.cli, learner.spec, meter.cli, meter.spec, mirroring.cli, mirroring.spec, packet.txt, pcap.io, recirculation.cli, recirculation.spec, registers.cli, registers.spec, rss.cli, rss.spec, selector.cli, selector.spec, selector.txt, varbit.cli, varbit.spec, vxlan.cli, vxlan.spec, vxlan_pcap.cli, vxlan_table.py, vxlan_table.txt
00:48:50.145   vhost: Makefile, main.c, main.h, virtio_net.c
00:48:50.145   bpf: README, dummy.c, t1.c, t2.c, t3.c
00:48:50.145   ip_fragmentation: Makefile, main.c
00:48:50.145   l2fwd-macsec: Makefile, main.c
00:48:50.145   ptpclient: Makefile, ptpclient.c
00:48:50.145   vhost_blk: Makefile, blk.c, blk_spec.h, vhost_blk.c, vhost_blk.h, vhost_blk_compat.c
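Note that the examples above are installed as buildable sources: each directory carries a standalone Makefile that locates the installed DPDK through pkg-config. A minimal sketch of rebuilding and running two of them after this package lands at its final /usr/local prefix (the pipeline app's -s script flag follows the DPDK sample-application guide; the PKG_CONFIG_PATH value is an assumption about where libdpdk.pc ends up):

  export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig   # assumed location of libdpdk.pc
  cd /usr/local/share/dpdk/examples/helloworld
  make                                              # builds ./build/helloworld against the installed libs
  sudo ./build/helloworld -l 0-1                    # prints a hello line per lcore; may need hugepages or --no-huge

  cd /usr/local/share/dpdk/examples/pipeline
  make
  sudo ./build/pipeline -l 0-1 -- -s examples/l2fwd.cli   # -s: startup CLI script

The .cli scripts installed above configure and start the pipelines; the matching .spec files they reference are the pipeline programs themselves, and the .io/.txt files supply port bindings and table contents.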
00:48:50.145 Installing libraries (each as static .a and shared .so.24.1) to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib: librte_log, librte_kvargs, librte_telemetry, librte_eal, librte_ring, librte_rcu, librte_mempool, librte_mbuf, librte_net, librte_meter, librte_ethdev, librte_pci, librte_cmdline, librte_hash, librte_timer, librte_compressdev, librte_cryptodev, librte_dmadev, librte_power, librte_reorder, librte_security, librte_vhost
00:48:50.146 Installing driver libraries librte_bus_pci, librte_bus_vdev, librte_mempool_ring: .a to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib, .so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/dpdk/pmds-24.1
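One detail worth calling out in the driver lines above: the static archives sit in usr/local/lib next to everything else, while the PMD shared objects are additionally placed under lib/dpdk/pmds-24.1, the plugin directory a shared-library DPDK build typically scans at EAL startup. A quick way to inspect the installed result (a sketch, assuming the final /usr/local prefix and libdpdk.pc visible to pkg-config):

  pkg-config --cflags --libs libdpdk       # compile/link flags for the installed DPDK
  ls /usr/local/lib/dpdk/pmds-24.1         # PMDs available for runtime loading
  # EAL's -d option loads a driver explicitly; some_dpdk_app is a placeholder:
  #   some_dpdk_app -d /usr/local/lib/dpdk/pmds-24.1/librte_mempool_ring.so.24.1 ...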
00:48:50.146 Installing headers from /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include:
00:48:50.146   config: rte_config.h
00:48:50.146   lib/log: rte_log.h
00:48:50.146   lib/kvargs: rte_kvargs.h
00:48:50.146   lib/telemetry: rte_telemetry.h
00:48:50.146   lib/eal/include/generic (to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include/generic): rte_atomic.h, rte_byteorder.h, rte_cpuflags.h, rte_cycles.h, rte_io.h, rte_memcpy.h, rte_pause.h, rte_power_intrinsics.h, rte_prefetch.h, rte_rwlock.h, rte_spinlock.h, rte_vect.h
00:48:50.146   lib/eal/x86/include: rte_atomic.h, rte_byteorder.h, rte_cpuflags.h, rte_cycles.h, rte_io.h, rte_memcpy.h, rte_pause.h, rte_power_intrinsics.h, rte_prefetch.h, rte_rtm.h, rte_rwlock.h, rte_spinlock.h, rte_vect.h, rte_atomic_32.h, rte_atomic_64.h, rte_byteorder_32.h, rte_byteorder_64.h
00:48:50.147   lib/eal/include: rte_alarm.h, rte_bitmap.h, rte_bitops.h, rte_branch_prediction.h, rte_bus.h, rte_class.h, rte_common.h, rte_compat.h, rte_debug.h, rte_dev.h, rte_devargs.h, rte_eal.h, rte_eal_memconfig.h, rte_eal_trace.h, rte_errno.h, rte_epoll.h, rte_fbarray.h, rte_hexdump.h, rte_hypervisor.h, rte_interrupts.h, rte_keepalive.h, rte_launch.h, rte_lcore.h, rte_lock_annotations.h, rte_malloc.h, rte_mcslock.h, rte_memory.h, rte_memzone.h, rte_pci_dev_feature_defs.h, rte_pci_dev_features.h, rte_per_lcore.h, rte_pflock.h, rte_random.h, rte_reciprocal.h, rte_seqcount.h, rte_seqlock.h, rte_service.h, rte_service_component.h, rte_stdatomic.h, rte_string_fns.h, rte_tailq.h, rte_thread.h, rte_ticketlock.h, rte_time.h, rte_trace.h, rte_trace_point.h, rte_trace_point_register.h, rte_uuid.h, rte_version.h, rte_vfio.h
00:48:50.147   lib/eal/linux/include: rte_os.h
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/eal/include/rte_trace_point.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/eal/include/rte_trace_point_register.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/eal/include/rte_uuid.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/eal/include/rte_version.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/eal/include/rte_vfio.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/eal/linux/include/rte_os.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_core.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_elem.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_elem_pvt.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_c11_pvt.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_generic_pvt.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_hts.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_peek.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_peek_zc.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_rts.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing 
/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/rcu/rte_rcu_qsbr.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/mempool/rte_mempool.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/mempool/rte_mempool_trace_fp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_core.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_ptype.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/mbuf/rte_mbuf_dyn.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.147 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_ip.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_tcp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_udp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_tls.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_dtls.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_esp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_sctp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_icmp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_arp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_ether.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_macsec.h to 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_vxlan.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_gre.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_gtp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_net.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_net_crc.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_mpls.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_higig.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_ecpri.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_pdcp_hdr.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_geneve.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_l2tpv2.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_ppp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/net/rte_ib.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/meter/rte_meter.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_cman.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_dev_info.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_flow.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing 
/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_flow_driver.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_mtr.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_mtr_driver.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_tm.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_tm_driver.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_ethdev_core.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/ethdev/rte_eth_ctrl.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/pci/rte_pci.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_num.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_string.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_rdline.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_vt100.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_socket.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_cirbuf.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cmdline/cmdline_parse_portlist.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 
Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_fbk_hash.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_hash_crc.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_hash.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_jhash.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_thash.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_thash_gfni.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_crc_arm64.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_crc_generic.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_crc_sw.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_crc_x86.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/hash/rte_thash_x86_gfni.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/timer/rte_timer.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/compressdev/rte_compressdev.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/compressdev/rte_comp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.148 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cryptodev/rte_cryptodev.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cryptodev/rte_crypto.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cryptodev/rte_crypto_sym.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cryptodev/rte_crypto_asym.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing 
/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/cryptodev/rte_cryptodev_core.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev_core.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/dmadev/rte_dmadev_trace_fp.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/power/rte_power.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/power/rte_power_guest_channel.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/power/rte_power_pmd_mgmt.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/power/rte_power_uncore.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/reorder/rte_reorder.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/security/rte_security.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/security/rte_security_driver.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/vhost/rte_vdpa.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/vhost/rte_vhost.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/vhost/rte_vhost_async.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/lib/vhost/rte_vhost_crypto.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/drivers/bus/pci/rte_bus_pci.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/buildtools/dpdk-cmdline-gen.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/usertools/dpdk-devbind.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin 00:48:50.149 
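The usertools staged into usr/local/bin here are standalone scripts that run on the target system once the package is installed. A minimal sketch of the usual device-binding flow with the just-installed dpdk-devbind.py; the PCI address and the choice of vfio-pci are illustrative assumptions, not values taken from this build:

    # Show NICs and the kernel drivers currently bound to them.
    /usr/local/bin/dpdk-devbind.py --status
    # Rebind a (hypothetical) device to vfio-pci so a DPDK PMD can claim it.
    /usr/local/bin/dpdk-devbind.py -b vfio-pci 0000:00:04.0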
Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/usertools/dpdk-pmdinfo.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/usertools/dpdk-telemetry.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/usertools/dpdk-hugepages.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/usertools/dpdk-rss-flows.py to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp/rte_build_config.h to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/include 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig 00:48:50.149 Installing /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp/meson-private/libdpdk.pc to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig 00:48:50.149 Installing symlink pointing to librte_log.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_log.so.24 00:48:50.149 Installing symlink pointing to librte_log.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_log.so 00:48:50.149 Installing symlink pointing to librte_kvargs.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_kvargs.so.24 00:48:50.149 Installing symlink pointing to librte_kvargs.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_kvargs.so 00:48:50.149 Installing symlink pointing to librte_telemetry.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_telemetry.so.24 00:48:50.149 Installing symlink pointing to librte_telemetry.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_telemetry.so 00:48:50.149 Installing symlink pointing to librte_eal.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_eal.so.24 00:48:50.149 Installing symlink pointing to librte_eal.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_eal.so 00:48:50.149 './librte_bus_pci.so' -> 'dpdk/pmds-24.1/librte_bus_pci.so' 00:48:50.149 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.1/librte_bus_pci.so.24' 00:48:50.149 './librte_bus_pci.so.24.1' -> 'dpdk/pmds-24.1/librte_bus_pci.so.24.1' 00:48:50.149 './librte_bus_vdev.so' -> 'dpdk/pmds-24.1/librte_bus_vdev.so' 00:48:50.149 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.1/librte_bus_vdev.so.24' 00:48:50.149 './librte_bus_vdev.so.24.1' -> 'dpdk/pmds-24.1/librte_bus_vdev.so.24.1' 00:48:50.149 './librte_mempool_ring.so' -> 'dpdk/pmds-24.1/librte_mempool_ring.so' 00:48:50.149 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.1/librte_mempool_ring.so.24' 00:48:50.149 './librte_mempool_ring.so.24.1' -> 'dpdk/pmds-24.1/librte_mempool_ring.so.24.1' 00:48:50.149 Installing symlink pointing to librte_ring.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_ring.so.24 00:48:50.149 Installing symlink pointing to librte_ring.so.24 
to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_ring.so 00:48:50.149 Installing symlink pointing to librte_rcu.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_rcu.so.24 00:48:50.149 Installing symlink pointing to librte_rcu.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_rcu.so 00:48:50.149 Installing symlink pointing to librte_mempool.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_mempool.so.24 00:48:50.149 Installing symlink pointing to librte_mempool.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_mempool.so 00:48:50.149 Installing symlink pointing to librte_mbuf.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_mbuf.so.24 00:48:50.149 Installing symlink pointing to librte_mbuf.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_mbuf.so 00:48:50.149 Installing symlink pointing to librte_net.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_net.so.24 00:48:50.149 Installing symlink pointing to librte_net.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_net.so 00:48:50.149 Installing symlink pointing to librte_meter.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_meter.so.24 00:48:50.149 Installing symlink pointing to librte_meter.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_meter.so 00:48:50.149 Installing symlink pointing to librte_ethdev.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_ethdev.so.24 00:48:50.149 Installing symlink pointing to librte_ethdev.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_ethdev.so 00:48:50.149 Installing symlink pointing to librte_pci.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_pci.so.24 00:48:50.149 Installing symlink pointing to librte_pci.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_pci.so 00:48:50.149 Installing symlink pointing to librte_cmdline.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_cmdline.so.24 00:48:50.149 Installing symlink pointing to librte_cmdline.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_cmdline.so 00:48:50.149 Installing symlink pointing to librte_hash.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_hash.so.24 00:48:50.149 Installing symlink pointing to librte_hash.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_hash.so 00:48:50.149 Installing symlink pointing to librte_timer.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_timer.so.24 00:48:50.149 Installing symlink pointing to librte_timer.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_timer.so 00:48:50.149 Installing symlink pointing to librte_compressdev.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_compressdev.so.24 00:48:50.149 Installing symlink pointing to librte_compressdev.so.24 to 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_compressdev.so 00:48:50.149 Installing symlink pointing to librte_cryptodev.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_cryptodev.so.24 00:48:50.149 Installing symlink pointing to librte_cryptodev.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_cryptodev.so 00:48:50.149 Installing symlink pointing to librte_dmadev.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_dmadev.so.24 00:48:50.149 Installing symlink pointing to librte_dmadev.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_dmadev.so 00:48:50.149 Installing symlink pointing to librte_power.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_power.so.24 00:48:50.149 Installing symlink pointing to librte_power.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_power.so 00:48:50.149 Installing symlink pointing to librte_reorder.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_reorder.so.24 00:48:50.149 Installing symlink pointing to librte_reorder.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_reorder.so 00:48:50.149 Installing symlink pointing to librte_security.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_security.so.24 00:48:50.149 Installing symlink pointing to librte_security.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_security.so 00:48:50.150 Installing symlink pointing to librte_vhost.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_vhost.so.24 00:48:50.150 Installing symlink pointing to librte_vhost.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/librte_vhost.so 00:48:50.150 Installing symlink pointing to librte_bus_pci.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_pci.so.24 00:48:50.150 Installing symlink pointing to librte_bus_pci.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_pci.so 00:48:50.150 Installing symlink pointing to librte_bus_vdev.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_vdev.so.24 00:48:50.150 Installing symlink pointing to librte_bus_vdev.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_bus_vdev.so 00:48:50.150 Installing symlink pointing to librte_mempool_ring.so.24.1 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_mempool_ring.so.24 00:48:50.150 Installing symlink pointing to librte_mempool_ring.so.24 to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/dpdk/pmds-24.1/librte_mempool_ring.so 00:48:50.150 Running custom install script '/bin/sh /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.1' 00:48:53.436 The Meson build system 00:48:53.436 Version: 1.4.0 00:48:53.436 Source dir: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk 00:48:53.436 Build dir: /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build-tmp 
00:48:53.436 Build type: native build 00:48:53.436 Program cat found: YES (/bin/cat) 00:48:53.436 Project name: DPDK 00:48:53.436 Project version: 24.03.0 00:48:53.436 C compiler for the host machine: cc (gcc 11.4.1 "cc (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)") 00:48:53.436 C linker for the host machine: cc ld.bfd 2.35.2-42 00:48:53.436 Host machine cpu family: x86_64 00:48:53.436 Host machine cpu: x86_64 00:48:53.436 Message: ## Building in Developer Mode ## 00:48:53.436 Program pkg-config found: YES (/bin/pkg-config) 00:48:53.436 Program check-symbols.sh found: YES (/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/buildtools/check-symbols.sh) 00:48:53.436 Program options-ibverbs-static.sh found: YES (/home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/buildtools/options-ibverbs-static.sh) 00:48:53.436 Program python3 found: YES (/usr/bin/python3) 00:48:53.436 Program cat found: YES (/bin/cat) 00:48:53.436 Compiler for C supports arguments -march=native: YES (cached) 00:48:53.436 Checking for size of "void *" : 8 (cached) 00:48:53.436 Checking for size of "void *" : 8 (cached) 00:48:53.436 Compiler for C supports link arguments -Wl,--undefined-version: NO (cached) 00:48:53.436 Library m found: YES 00:48:53.436 Library numa found: YES 00:48:53.436 Has header "numaif.h" : YES (cached) 00:48:53.436 Library fdt found: NO 00:48:53.436 Library execinfo found: NO 00:48:53.436 Has header "execinfo.h" : YES (cached) 00:48:53.436 Found pkg-config: YES (/bin/pkg-config) 1.7.3 00:48:53.436 Run-time dependency libarchive found: NO (tried pkgconfig) 00:48:53.436 Run-time dependency libbsd found: NO (tried pkgconfig) 00:48:53.436 Run-time dependency jansson found: NO (tried pkgconfig) 00:48:53.436 Dependency openssl found: YES 3.0.7 (cached) 00:48:53.436 Run-time dependency libpcap found: NO (tried pkgconfig) 00:48:53.436 Library pcap found: NO 00:48:53.436 Compiler for C supports arguments -Wcast-qual: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wdeprecated: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wformat: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wformat-nonliteral: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wformat-security: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wmissing-declarations: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wmissing-prototypes: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wnested-externs: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wold-style-definition: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wpointer-arith: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wsign-compare: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wstrict-prototypes: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wundef: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wwrite-strings: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wno-address-of-packed-member: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wno-packed-not-aligned: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wno-missing-field-initializers: YES (cached) 00:48:53.436 Compiler for C supports arguments -Wno-zero-length-bounds: YES (cached) 00:48:53.436 Program objdump found: YES (/bin/objdump) 00:48:53.436 Compiler for C supports arguments -mavx512f: YES (cached) 00:48:53.436 Checking if "AVX512 checking" compiles: YES (cached) 00:48:53.436 Fetching value of define "__SSE4_2__" : 1 
(cached) 00:48:53.436 Fetching value of define "__AES__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX2__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512BW__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512CD__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512F__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512VL__" : 1 (cached) 00:48:53.436 Fetching value of define "__PCLMUL__" : 1 (cached) 00:48:53.436 Fetching value of define "__RDRND__" : 1 (cached) 00:48:53.436 Fetching value of define "__RDSEED__" : 1 (cached) 00:48:53.436 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:48:53.436 Fetching value of define "__znver1__" : (undefined) (cached) 00:48:53.436 Fetching value of define "__znver2__" : (undefined) (cached) 00:48:53.436 Fetching value of define "__znver3__" : (undefined) (cached) 00:48:53.436 Fetching value of define "__znver4__" : (undefined) (cached) 00:48:53.436 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:48:53.436 Message: lib/log: Defining dependency "log" 00:48:53.436 Message: lib/kvargs: Defining dependency "kvargs" 00:48:53.436 Message: lib/telemetry: Defining dependency "telemetry" 00:48:53.436 Checking for function "getentropy" : NO (cached) 00:48:53.436 Message: lib/eal: Defining dependency "eal" 00:48:53.436 Message: lib/ring: Defining dependency "ring" 00:48:53.436 Message: lib/rcu: Defining dependency "rcu" 00:48:53.436 Message: lib/mempool: Defining dependency "mempool" 00:48:53.436 Message: lib/mbuf: Defining dependency "mbuf" 00:48:53.436 Fetching value of define "__PCLMUL__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512F__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512BW__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:48:53.436 Fetching value of define "__AVX512VL__" : 1 (cached) 00:48:53.436 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:48:53.436 Compiler for C supports arguments -mpclmul: YES (cached) 00:48:53.436 Compiler for C supports arguments -maes: YES (cached) 00:48:53.436 Compiler for C supports arguments -mavx512f: YES (cached) 00:48:53.436 Compiler for C supports arguments -mavx512bw: YES (cached) 00:48:53.436 Compiler for C supports arguments -mavx512dq: YES (cached) 00:48:53.436 Compiler for C supports arguments -mavx512vl: YES (cached) 00:48:53.437 Compiler for C supports arguments -mvpclmulqdq: YES (cached) 00:48:53.437 Compiler for C supports arguments -mavx2: YES (cached) 00:48:53.437 Compiler for C supports arguments -mavx: YES (cached) 00:48:53.437 Message: lib/net: Defining dependency "net" 00:48:53.437 Message: lib/meter: Defining dependency "meter" 00:48:53.437 Message: lib/ethdev: Defining dependency "ethdev" 00:48:53.437 Message: lib/pci: Defining dependency "pci" 00:48:53.437 Message: lib/cmdline: Defining dependency "cmdline" 00:48:53.437 Message: lib/hash: Defining dependency "hash" 00:48:53.437 Message: lib/timer: Defining dependency "timer" 00:48:53.437 Message: lib/compressdev: Defining dependency "compressdev" 00:48:53.437 Message: lib/cryptodev: Defining dependency "cryptodev" 00:48:53.437 Message: lib/dmadev: Defining dependency "dmadev" 00:48:53.437 Compiler for C supports arguments -Wno-cast-qual: YES (cached) 00:48:53.437 Message: lib/power: Defining dependency "power" 00:48:53.437 Message: 
lib/reorder: Defining dependency "reorder" 00:48:53.437 Message: lib/security: Defining dependency "security" 00:48:53.437 Has header "linux/userfaultfd.h" : YES (cached) 00:48:53.437 Has header "linux/vduse.h" : NO (cached) 00:48:53.437 Message: lib/vhost: Defining dependency "vhost" 00:48:53.437 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:48:53.437 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:48:53.437 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:48:53.437 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:48:53.437 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:48:53.437 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:48:53.437 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:48:53.437 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:48:53.437 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:48:53.437 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:48:53.437 Program doxygen found: YES (/bin/doxygen) 00:48:53.437 Configuring doxy-api-html.conf using configuration 00:48:53.437 Configuring doxy-api-man.conf using configuration 00:48:53.437 Program mandb found: YES (/bin/mandb) 00:48:53.437 Program sphinx-build found: NO 00:48:53.437 Configuring rte_build_config.h using configuration 00:48:53.437 Message: 00:48:53.437 ================= 00:48:53.437 Applications Enabled 00:48:53.437 ================= 00:48:53.437 00:48:53.437 apps: 00:48:53.437 00:48:53.437 00:48:53.437 Message: 00:48:53.437 ================= 00:48:53.437 Libraries Enabled 00:48:53.437 ================= 00:48:53.437 00:48:53.437 libs: 00:48:53.437 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:48:53.437 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:48:53.437 cryptodev, dmadev, power, reorder, security, vhost, 00:48:53.437 00:48:53.437 Message: 00:48:53.437 =============== 00:48:53.437 Drivers Enabled 00:48:53.437 =============== 00:48:53.437 00:48:53.437 common: 00:48:53.437 00:48:53.437 bus: 00:48:53.437 pci, vdev, 00:48:53.437 mempool: 00:48:53.437 ring, 00:48:53.437 dma: 00:48:53.437 00:48:53.437 net: 00:48:53.437 00:48:53.437 crypto: 00:48:53.437 00:48:53.437 compress: 00:48:53.437 00:48:53.437 vdpa: 00:48:53.437 00:48:53.437 00:48:53.437 Message: 00:48:53.437 ================= 00:48:53.437 Content Skipped 00:48:53.437 ================= 00:48:53.437 00:48:53.437 apps: 00:48:53.437 dumpcap: explicitly disabled via build config 00:48:53.437 graph: explicitly disabled via build config 00:48:53.437 pdump: explicitly disabled via build config 00:48:53.437 proc-info: explicitly disabled via build config 00:48:53.437 test-acl: explicitly disabled via build config 00:48:53.437 test-bbdev: explicitly disabled via build config 00:48:53.437 test-cmdline: explicitly disabled via build config 00:48:53.437 test-compress-perf: explicitly disabled via build config 00:48:53.437 test-crypto-perf: explicitly disabled via build config 00:48:53.437 test-dma-perf: explicitly disabled via build config 00:48:53.437 test-eventdev: explicitly disabled via build config 00:48:53.437 test-fib: explicitly disabled via build config 00:48:53.437 test-flow-perf: explicitly disabled via build config 00:48:53.437 test-gpudev: explicitly disabled via build config 00:48:53.437 test-mldev: explicitly disabled via build config 00:48:53.437 test-pipeline: explicitly disabled via build config 
00:48:53.437 test-pmd: explicitly disabled via build config 00:48:53.437 test-regex: explicitly disabled via build config 00:48:53.437 test-sad: explicitly disabled via build config 00:48:53.437 test-security-perf: explicitly disabled via build config 00:48:53.437 00:48:53.437 libs: 00:48:53.437 argparse: explicitly disabled via build config 00:48:53.437 metrics: explicitly disabled via build config 00:48:53.437 acl: explicitly disabled via build config 00:48:53.437 bbdev: explicitly disabled via build config 00:48:53.437 bitratestats: explicitly disabled via build config 00:48:53.437 bpf: explicitly disabled via build config 00:48:53.437 cfgfile: explicitly disabled via build config 00:48:53.437 distributor: explicitly disabled via build config 00:48:53.437 efd: explicitly disabled via build config 00:48:53.437 eventdev: explicitly disabled via build config 00:48:53.437 dispatcher: explicitly disabled via build config 00:48:53.437 gpudev: explicitly disabled via build config 00:48:53.437 gro: explicitly disabled via build config 00:48:53.437 gso: explicitly disabled via build config 00:48:53.437 ip_frag: explicitly disabled via build config 00:48:53.437 jobstats: explicitly disabled via build config 00:48:53.437 latencystats: explicitly disabled via build config 00:48:53.437 lpm: explicitly disabled via build config 00:48:53.437 member: explicitly disabled via build config 00:48:53.437 pcapng: explicitly disabled via build config 00:48:53.437 rawdev: explicitly disabled via build config 00:48:53.437 regexdev: explicitly disabled via build config 00:48:53.437 mldev: explicitly disabled via build config 00:48:53.437 rib: explicitly disabled via build config 00:48:53.437 sched: explicitly disabled via build config 00:48:53.437 stack: explicitly disabled via build config 00:48:53.437 ipsec: explicitly disabled via build config 00:48:53.437 pdcp: explicitly disabled via build config 00:48:53.437 fib: explicitly disabled via build config 00:48:53.437 port: explicitly disabled via build config 00:48:53.437 pdump: explicitly disabled via build config 00:48:53.437 table: explicitly disabled via build config 00:48:53.437 pipeline: explicitly disabled via build config 00:48:53.437 graph: explicitly disabled via build config 00:48:53.437 node: explicitly disabled via build config 00:48:53.437 00:48:53.437 drivers: 00:48:53.437 common/cpt: not in enabled drivers build config 00:48:53.437 common/dpaax: not in enabled drivers build config 00:48:53.437 common/iavf: not in enabled drivers build config 00:48:53.437 common/idpf: not in enabled drivers build config 00:48:53.437 common/ionic: not in enabled drivers build config 00:48:53.437 common/mvep: not in enabled drivers build config 00:48:53.437 common/octeontx: not in enabled drivers build config 00:48:53.437 bus/auxiliary: not in enabled drivers build config 00:48:53.437 bus/cdx: not in enabled drivers build config 00:48:53.437 bus/dpaa: not in enabled drivers build config 00:48:53.437 bus/fslmc: not in enabled drivers build config 00:48:53.437 bus/ifpga: not in enabled drivers build config 00:48:53.437 bus/platform: not in enabled drivers build config 00:48:53.437 bus/uacce: not in enabled drivers build config 00:48:53.437 bus/vmbus: not in enabled drivers build config 00:48:53.437 common/cnxk: not in enabled drivers build config 00:48:53.437 common/mlx5: not in enabled drivers build config 00:48:53.437 common/nfp: not in enabled drivers build config 00:48:53.437 common/nitrox: not in enabled drivers build config 00:48:53.437 common/qat: not in 
enabled drivers build config 00:48:53.437 common/sfc_efx: not in enabled drivers build config 00:48:53.437 mempool/bucket: not in enabled drivers build config 00:48:53.437 mempool/cnxk: not in enabled drivers build config 00:48:53.437 mempool/dpaa: not in enabled drivers build config 00:48:53.437 mempool/dpaa2: not in enabled drivers build config 00:48:53.437 mempool/octeontx: not in enabled drivers build config 00:48:53.437 mempool/stack: not in enabled drivers build config 00:48:53.437 dma/cnxk: not in enabled drivers build config 00:48:53.437 dma/dpaa: not in enabled drivers build config 00:48:53.437 dma/dpaa2: not in enabled drivers build config 00:48:53.437 dma/hisilicon: not in enabled drivers build config 00:48:53.437 dma/idxd: not in enabled drivers build config 00:48:53.437 dma/ioat: not in enabled drivers build config 00:48:53.437 dma/skeleton: not in enabled drivers build config 00:48:53.437 net/af_packet: not in enabled drivers build config 00:48:53.437 net/af_xdp: not in enabled drivers build config 00:48:53.437 net/ark: not in enabled drivers build config 00:48:53.437 net/atlantic: not in enabled drivers build config 00:48:53.437 net/avp: not in enabled drivers build config 00:48:53.437 net/axgbe: not in enabled drivers build config 00:48:53.437 net/bnx2x: not in enabled drivers build config 00:48:53.437 net/bnxt: not in enabled drivers build config 00:48:53.437 net/bonding: not in enabled drivers build config 00:48:53.437 net/cnxk: not in enabled drivers build config 00:48:53.437 net/cpfl: not in enabled drivers build config 00:48:53.437 net/cxgbe: not in enabled drivers build config 00:48:53.437 net/dpaa: not in enabled drivers build config 00:48:53.437 net/dpaa2: not in enabled drivers build config 00:48:53.437 net/e1000: not in enabled drivers build config 00:48:53.437 net/ena: not in enabled drivers build config 00:48:53.437 net/enetc: not in enabled drivers build config 00:48:53.437 net/enetfec: not in enabled drivers build config 00:48:53.437 net/enic: not in enabled drivers build config 00:48:53.437 net/failsafe: not in enabled drivers build config 00:48:53.437 net/fm10k: not in enabled drivers build config 00:48:53.437 net/gve: not in enabled drivers build config 00:48:53.437 net/hinic: not in enabled drivers build config 00:48:53.437 net/hns3: not in enabled drivers build config 00:48:53.437 net/i40e: not in enabled drivers build config 00:48:53.437 net/iavf: not in enabled drivers build config 00:48:53.437 net/ice: not in enabled drivers build config 00:48:53.437 net/idpf: not in enabled drivers build config 00:48:53.438 net/igc: not in enabled drivers build config 00:48:53.438 net/ionic: not in enabled drivers build config 00:48:53.438 net/ipn3ke: not in enabled drivers build config 00:48:53.438 net/ixgbe: not in enabled drivers build config 00:48:53.438 net/mana: not in enabled drivers build config 00:48:53.438 net/memif: not in enabled drivers build config 00:48:53.438 net/mlx4: not in enabled drivers build config 00:48:53.438 net/mlx5: not in enabled drivers build config 00:48:53.438 net/mvneta: not in enabled drivers build config 00:48:53.438 net/mvpp2: not in enabled drivers build config 00:48:53.438 net/netvsc: not in enabled drivers build config 00:48:53.438 net/nfb: not in enabled drivers build config 00:48:53.438 net/nfp: not in enabled drivers build config 00:48:53.438 net/ngbe: not in enabled drivers build config 00:48:53.438 net/null: not in enabled drivers build config 00:48:53.438 net/octeontx: not in enabled drivers build config 00:48:53.438 
net/octeon_ep: not in enabled drivers build config 00:48:53.438 net/pcap: not in enabled drivers build config 00:48:53.438 net/pfe: not in enabled drivers build config 00:48:53.438 net/qede: not in enabled drivers build config 00:48:53.438 net/ring: not in enabled drivers build config 00:48:53.438 net/sfc: not in enabled drivers build config 00:48:53.438 net/softnic: not in enabled drivers build config 00:48:53.438 net/tap: not in enabled drivers build config 00:48:53.438 net/thunderx: not in enabled drivers build config 00:48:53.438 net/txgbe: not in enabled drivers build config 00:48:53.438 net/vdev_netvsc: not in enabled drivers build config 00:48:53.438 net/vhost: not in enabled drivers build config 00:48:53.438 net/virtio: not in enabled drivers build config 00:48:53.438 net/vmxnet3: not in enabled drivers build config 00:48:53.438 raw/*: missing internal dependency, "rawdev" 00:48:53.438 crypto/armv8: not in enabled drivers build config 00:48:53.438 crypto/bcmfs: not in enabled drivers build config 00:48:53.438 crypto/caam_jr: not in enabled drivers build config 00:48:53.438 crypto/ccp: not in enabled drivers build config 00:48:53.438 crypto/cnxk: not in enabled drivers build config 00:48:53.438 crypto/dpaa_sec: not in enabled drivers build config 00:48:53.438 crypto/dpaa2_sec: not in enabled drivers build config 00:48:53.438 crypto/ipsec_mb: not in enabled drivers build config 00:48:53.438 crypto/mlx5: not in enabled drivers build config 00:48:53.438 crypto/mvsam: not in enabled drivers build config 00:48:53.438 crypto/nitrox: not in enabled drivers build config 00:48:53.438 crypto/null: not in enabled drivers build config 00:48:53.438 crypto/octeontx: not in enabled drivers build config 00:48:53.438 crypto/openssl: not in enabled drivers build config 00:48:53.438 crypto/scheduler: not in enabled drivers build config 00:48:53.438 crypto/uadk: not in enabled drivers build config 00:48:53.438 crypto/virtio: not in enabled drivers build config 00:48:53.438 compress/isal: not in enabled drivers build config 00:48:53.438 compress/mlx5: not in enabled drivers build config 00:48:53.438 compress/nitrox: not in enabled drivers build config 00:48:53.438 compress/octeontx: not in enabled drivers build config 00:48:53.438 compress/zlib: not in enabled drivers build config 00:48:53.438 regex/*: missing internal dependency, "regexdev" 00:48:53.438 ml/*: missing internal dependency, "mldev" 00:48:53.438 vdpa/ifc: not in enabled drivers build config 00:48:53.438 vdpa/mlx5: not in enabled drivers build config 00:48:53.438 vdpa/nfp: not in enabled drivers build config 00:48:53.438 vdpa/sfc: not in enabled drivers build config 00:48:53.438 event/*: missing internal dependency, "eventdev" 00:48:53.438 baseband/*: missing internal dependency, "bbdev" 00:48:53.438 gpu/*: missing internal dependency, "gpudev" 00:48:53.438 00:48:53.438 00:48:53.438 Cleaning... 0 files. 
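The "Content Skipped" lists above come from the build configuration rather than the environment: everything marked "explicitly disabled via build config" was pruned through meson options, while only the "missing internal dependency" entries reflect components that could not be built here. A minimal sketch of a configure invocation that prunes a DPDK tree the same way; the option values are illustrative, and the option names match the user-defined options this build reports next (default_library, disable_apps, disable_libs, enable_drivers):

    meson setup build-tmp \
      -Ddefault_library=shared \
      -Ddisable_apps=dumpcap,pdump \
      -Ddisable_libs=bbdev,ipsec \
      -Denable_drivers=bus/pci,bus/vdev,mempool/ring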
00:48:53.438 Build targets in project: 85 00:48:53.438 00:48:53.438 DPDK 24.03.0 00:48:53.438 00:48:53.438 User defined options 00:48:53.438 default_library : shared 00:48:53.438 libdir : lib 00:48:53.438 prefix : /home/vagrant/rpmbuild/BUILD/spdk-test_gen_spec/dpdk/build 00:48:53.438 c_args : -Wno-stringop-overflow -fcommon -fPIC -Wno-error 00:48:53.438 c_link_args : 00:48:53.438 cpu_instruction_set: native 00:48:53.438 disable_apps : dumpcap,test-eventdev,test,graph,test-fib,pdump,test-flow-perf,proc-info,test-gpudev,test-acl,test-mldev,test-bbdev,test-pipeline,test-cmdline,test-pmd,test-compress-perf,test-regex,test-crypto-perf,test-sad,test-dma-perf,test-security-perf 00:48:53.438 disable_libs : bitratestats,efd,ipsec,metrics,table,bpf,jobstats,mldev,rawdev,cfgfile,eventdev,fib,latencystats,node,regexdev,gpudev,pcapng,graph,lpm,rib,dispatcher,gro,pdcp,acl,distributor,gso,member,pdump,sched,argparse,pipeline,bbdev,ip_frag,port,stack 00:48:53.438 enable_docs : false 00:48:53.438 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:48:53.438 enable_kmods : false 00:48:53.438 tests : false 00:48:53.438 00:48:53.438 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:48:54.004 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_ut_mock.a 00:48:54.004 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_log.a 00:48:54.004 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_ut_mock.pc 00:48:54.004 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_log.pc 00:48:54.004 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_ut_mock.so 00:48:54.004 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_log.so 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_dma.a 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_dma.pc 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_util.a 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_trace_parser.a 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_util.pc 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_dma.so 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_trace_parser.pc 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_ioat.a 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_util.so 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_trace_parser.so 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_ioat.pc 00:48:54.263 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_ioat.so 00:48:54.520 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_vfio_user.a 00:48:54.520 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_vfio_user.pc 00:48:54.520 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_vfio_user.so 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_vmd.a 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_conf.a 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_vmd.pc 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_json.a 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_conf.pc 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_json.pc 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_vmd.so 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_conf.so 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_json.so 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_dpdklibs.pc 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_env_dpdk.a 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk.pc 00:48:54.521 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_env_dpdk.so 00:48:55.085 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_jsonrpc.a 00:48:55.085 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_jsonrpc.pc 00:48:55.085 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_jsonrpc.so 00:48:55.343 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_rpc.a 00:48:55.343 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_rpc.pc 00:48:55.343 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_rpc.so 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_trace.a 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_notify.a 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_notify.pc 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_trace.pc 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_keyring.a 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring.pc 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_notify.so 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_trace.so 00:48:55.635 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_keyring.so 00:48:55.894 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_sock.a 00:48:55.894 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_sock.pc 00:48:55.894 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_thread.a 00:48:55.894 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_thread.pc 00:48:55.894 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_sock.so 00:48:55.894 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_thread.so 00:48:56.152 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_nvme.a 00:48:56.152 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_nvme.pc 00:48:56.152 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blob.a 00:48:56.152 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_init.a 00:48:56.152 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_virtio.a 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_blob.pc 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_accel.a 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_nvme.so 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_virtio.pc 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_accel.pc 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_init.pc 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blob.so 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_virtio.so 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_init.so 00:48:56.153 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_accel.so 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blobfs.a 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs.pc 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev.a 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_lvol.a 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blobfs.so 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev.pc 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event.a 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_lvol.pc 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event.pc 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev.so 00:48:56.411 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_lvol.so 00:48:56.411 
INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event.so 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_ftl.a 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_ftl.pc 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_nbd.a 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_nvmf.a 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scsi.a 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_nvmf.pc 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_nbd.pc 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_scsi.pc 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_ftl.so 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_nvmf.so 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_nbd.so 00:48:56.670 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scsi.so 00:48:56.928 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_iscsi.a 00:48:56.928 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_vhost.a 00:48:56.928 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_iscsi.pc 00:48:56.928 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_vhost.pc 00:48:56.928 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_iscsi.so 00:48:56.928 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_vhost.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk_rpc.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_keyring_file.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scheduler_dpdk_governor.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_file.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blob_bdev.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scheduler_gscheduler.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_dpdk_governor.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_blob_bdev.pc 00:48:57.495 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_sock_posix.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_accel_error.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_gscheduler.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_accel_ioat.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_dynamic.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_keyring_linux.a 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_keyring_file.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scheduler_dpdk_governor.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_error.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_ioat.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_posix.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_linux.pc 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blob_bdev.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scheduler_gscheduler.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_accel_error.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_sock_posix.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_accel_ioat.so 00:48:57.495 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_keyring_linux.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_error.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_error.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_delay.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_delay.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_malloc.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_raid.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_lvol.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_malloc.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_error.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_raid.pc 00:48:58.063 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_lvol.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_delay.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_passthru.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_malloc.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs_bdev.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_passthru.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_null.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_lvol.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_raid.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_nvme.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_gpt.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_null.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_nvme.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_passthru.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_gpt.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_gpt.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_null.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_nvme.so 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_split.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_split.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_zone_block.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_aio.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_virtio.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_aio.pc 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_ftl.a 00:48:58.063 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_split.so 00:48:58.063 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.so 00:48:58.322 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_virtio.pc 00:48:58.322 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_ftl.pc 00:48:58.322 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_aio.so 00:48:58.322 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_virtio.so 00:48:58.322 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_bdev_ftl.so 00:48:58.580 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_sock.a 00:48:58.580 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_sock.pc 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_iobuf.a 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_sock.so 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_scheduler.a 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_keyring.a 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_vmd.a 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iobuf.pc 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.a 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scheduler.pc 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_keyring.pc 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_blk.pc 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vmd.pc 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_iobuf.so 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_scheduler.so 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_keyring.so 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.so 00:48:58.839 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_vmd.so 00:48:59.098 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_accel.a 00:48:59.098 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_accel.pc 00:48:59.098 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_accel.so 00:48:59.356 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_bdev.a 00:48:59.356 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_bdev.pc 00:48:59.356 INSTALL 
/home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_bdev.so 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_nvmf.a 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_nbd.a 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nvmf.pc 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nbd.pc 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_nvmf.so 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_scsi.a 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_nbd.so 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scsi.pc 00:48:59.628 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_scsi.so 00:48:59.890 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.a 00:48:59.890 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_scsi.pc 00:48:59.890 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_iscsi.a 00:48:59.890 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.so 00:48:59.890 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iscsi.pc 00:48:59.890 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk_event_iscsi.so 00:48:59.890 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_modules.pc 00:49:00.148 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_modules.pc 00:49:00.148 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_modules.pc 00:49:00.148 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_syslibs.pc 00:49:00.148 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_modules.pc 00:49:00.148 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_modules.pc 00:49:00.148 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/lib/libspdk.so 00:49:00.406 make[1]: Nothing to be done for 'install'. 
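Each SPDK component staged above is installed in three forms: a static archive (.a), a shared object (.so), and a pkg-config file (.pc) under usr/local/lib/pkgconfig, plus aggregate .pc files such as spdk_bdev_modules.pc and spdk_syslibs.pc. The .pc files are how downstream code is meant to resolve compile and link flags once the packages are installed. A minimal sketch, assuming the resulting RPMs have been installed so the files sit under /usr/local; the hello_nvme program name and source file are hypothetical:

    # Point pkg-config at the SPDK .pc files and link a small NVMe
    # application against the shared libraries; a real application may
    # need further modules (e.g. spdk_syslibs when linking statically).
    export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig
    cc -o hello_nvme hello_nvme.c \
        $(pkg-config --cflags --libs spdk_nvme spdk_env_dpdk)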
00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_trace_record 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_nvme_perf 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_trace 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/iscsi_tgt 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/nvmf_tgt 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_lspci 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_tgt 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_nvme_identify 00:49:00.664 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_nvme_discover 00:49:00.922 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_top 00:49:00.922 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/vhost 00:49:00.922 INSTALL /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local/bin/spdk_dd 00:49:01.180 Installed to /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64/usr/local 00:49:02.117 Processing files: spdk-test_gen_spec-1.x86_64 00:49:02.376 Provides: spdk = test_gen_spec-1 spdk(x86-64) = test_gen_spec-1 00:49:02.376 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:49:02.376 Requires: /usr/bin/env libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.3.3)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.7)(64bit) libcrypto.so.3()(64bit) libfuse3.so.3()(64bit) libgcc_s.so.1()(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libm.so.6()(64bit) libmenu.so.6()(64bit) libncurses.so.6()(64bit) libpanel.so.6()(64bit) librte_bus_pci.so.24()(64bit) librte_cryptodev.so.24()(64bit) librte_dmadev.so.24()(64bit) librte_eal.so.24()(64bit) librte_ethdev.so.24()(64bit) librte_hash.so.24()(64bit) librte_kvargs.so.24()(64bit) librte_log.so.24()(64bit) librte_mbuf.so.24()(64bit) librte_mempool.so.24()(64bit) librte_mempool_ring.so.24()(64bit) librte_net.so.24()(64bit) librte_pci.so.24()(64bit) librte_power.so.24()(64bit) librte_rcu.so.24()(64bit) librte_ring.so.24()(64bit) librte_telemetry.so.24()(64bit) librte_vhost.so.24()(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) 
libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libstdc++.so.6()(64bit) libtinfo.so.6()(64bit) libuuid.so.1()(64bit) rtld(GNU_HASH) 00:49:02.376 Processing files: spdk-devel-test_gen_spec-1.x86_64 00:49:05.713 Provides: libisal_crypto.so.2()(64bit) librte_bus_pci.so.24()(64bit) librte_bus_pci.so.24(DPDK_24)(64bit) librte_bus_pci.so.24(EXPERIMENTAL)(64bit) librte_bus_pci.so.24(INTERNAL)(64bit) librte_bus_vdev.so.24()(64bit) librte_bus_vdev.so.24(DPDK_24)(64bit) librte_bus_vdev.so.24(INTERNAL)(64bit) librte_cmdline.so.24()(64bit) librte_cmdline.so.24(DPDK_24)(64bit) librte_compressdev.so.24()(64bit) librte_compressdev.so.24(DPDK_24)(64bit) librte_cryptodev.so.24()(64bit) librte_cryptodev.so.24(DPDK_24)(64bit) librte_cryptodev.so.24(EXPERIMENTAL)(64bit) librte_cryptodev.so.24(INTERNAL)(64bit) librte_dmadev.so.24()(64bit) librte_dmadev.so.24(DPDK_24)(64bit) librte_dmadev.so.24(EXPERIMENTAL)(64bit) librte_dmadev.so.24(INTERNAL)(64bit) librte_eal.so.24()(64bit) librte_eal.so.24(DPDK_24)(64bit) librte_eal.so.24(EXPERIMENTAL)(64bit) librte_eal.so.24(INTERNAL)(64bit) librte_ethdev.so.24()(64bit) librte_ethdev.so.24(DPDK_24)(64bit) librte_ethdev.so.24(EXPERIMENTAL)(64bit) librte_ethdev.so.24(INTERNAL)(64bit) librte_hash.so.24()(64bit) librte_hash.so.24(DPDK_24)(64bit) librte_hash.so.24(INTERNAL)(64bit) librte_kvargs.so.24()(64bit) librte_kvargs.so.24(DPDK_24)(64bit) librte_log.so.24()(64bit) librte_log.so.24(DPDK_24)(64bit) librte_log.so.24(INTERNAL)(64bit) librte_mbuf.so.24()(64bit) librte_mbuf.so.24(DPDK_24)(64bit) librte_mempool.so.24()(64bit) librte_mempool.so.24(DPDK_24)(64bit) librte_mempool.so.24(EXPERIMENTAL)(64bit) librte_mempool.so.24(INTERNAL)(64bit) librte_mempool_ring.so.24()(64bit) librte_mempool_ring.so.24(DPDK_24)(64bit) librte_meter.so.24()(64bit) librte_meter.so.24(DPDK_24)(64bit) librte_net.so.24()(64bit) librte_net.so.24(DPDK_24)(64bit) librte_pci.so.24()(64bit) librte_pci.so.24(DPDK_24)(64bit) librte_power.so.24()(64bit) librte_power.so.24(DPDK_24)(64bit) 
librte_power.so.24(EXPERIMENTAL)(64bit) librte_rcu.so.24()(64bit) librte_rcu.so.24(DPDK_24)(64bit) librte_reorder.so.24()(64bit) librte_reorder.so.24(DPDK_24)(64bit) librte_reorder.so.24(EXPERIMENTAL)(64bit) librte_ring.so.24()(64bit) librte_ring.so.24(DPDK_24)(64bit) librte_security.so.24()(64bit) librte_security.so.24(DPDK_24)(64bit) librte_security.so.24(EXPERIMENTAL)(64bit) librte_security.so.24(INTERNAL)(64bit) librte_telemetry.so.24()(64bit) librte_telemetry.so.24(DPDK_24)(64bit) librte_telemetry.so.24(EXPERIMENTAL)(64bit) librte_telemetry.so.24(INTERNAL)(64bit) librte_timer.so.24()(64bit) librte_timer.so.24(DPDK_24)(64bit) librte_vhost.so.24()(64bit) librte_vhost.so.24(DPDK_24)(64bit) librte_vhost.so.24(EXPERIMENTAL)(64bit) librte_vhost.so.24(INTERNAL)(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) spdk-devel = test_gen_spec-1 spdk-devel(x86-64) = test_gen_spec-1 00:49:05.713 Requires(interp): /bin/sh 00:49:05.713 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:49:05.713 Requires(post): /bin/sh 00:49:05.714 Requires: ld-linux-x86-64.so.2()(64bit) ld-linux-x86-64.so.2(GLIBC_2.3)(64bit) 
libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.10)(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.16)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.22)(64bit) libc.so.6(GLIBC_2.27)(64bit) libc.so.6(GLIBC_2.28)(64bit) libc.so.6(GLIBC_2.3)(64bit) libc.so.6(GLIBC_2.3.2)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.32)(64bit) libc.so.6(GLIBC_2.33)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.6)(64bit) libc.so.6(GLIBC_2.7)(64bit) libc.so.6(GLIBC_2.8)(64bit) libc.so.6(GLIBC_2.9)(64bit) libcrypto.so.3()(64bit) libcrypto.so.3(OPENSSL_3.0.0)(64bit) libfuse3.so.3()(64bit) libfuse3.so.3(FUSE_3.0)(64bit) libfuse3.so.3(FUSE_3.7)(64bit) libgcc_s.so.1()(64bit) libgcc_s.so.1(GCC_3.0)(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libkeyutils.so.1(KEYUTILS_0.3)(64bit) libm.so.6()(64bit) libm.so.6(GLIBC_2.2.5)(64bit) libm.so.6(GLIBC_2.29)(64bit) libnuma.so.1()(64bit) libnuma.so.1(libnuma_1.1)(64bit) libnuma.so.1(libnuma_1.2)(64bit) librte_bus_pci.so.24()(64bit) librte_bus_vdev.so.24()(64bit) librte_cmdline.so.24()(64bit) librte_compressdev.so.24()(64bit) librte_cryptodev.so.24()(64bit) librte_cryptodev.so.24(DPDK_24)(64bit) librte_dmadev.so.24()(64bit) librte_dmadev.so.24(DPDK_24)(64bit) librte_dmadev.so.24(INTERNAL)(64bit) librte_eal.so.24()(64bit) librte_eal.so.24(DPDK_24)(64bit) librte_eal.so.24(EXPERIMENTAL)(64bit) librte_eal.so.24(INTERNAL)(64bit) librte_ethdev.so.24()(64bit) librte_ethdev.so.24(DPDK_24)(64bit) librte_ethdev.so.24(EXPERIMENTAL)(64bit) librte_hash.so.24()(64bit) librte_hash.so.24(DPDK_24)(64bit) librte_kvargs.so.24()(64bit) librte_kvargs.so.24(DPDK_24)(64bit) librte_log.so.24()(64bit) librte_log.so.24(DPDK_24)(64bit) librte_log.so.24(INTERNAL)(64bit) librte_mbuf.so.24()(64bit) librte_mbuf.so.24(DPDK_24)(64bit) librte_mempool.so.24()(64bit) librte_mempool.so.24(DPDK_24)(64bit) librte_mempool_ring.so.24()(64bit) librte_meter.so.24()(64bit) librte_net.so.24()(64bit) librte_net.so.24(DPDK_24)(64bit) librte_pci.so.24()(64bit) librte_pci.so.24(DPDK_24)(64bit) librte_power.so.24()(64bit) librte_rcu.so.24()(64bit) librte_rcu.so.24(DPDK_24)(64bit) librte_reorder.so.24()(64bit) librte_ring.so.24()(64bit) librte_ring.so.24(DPDK_24)(64bit) librte_security.so.24()(64bit) librte_telemetry.so.24()(64bit) librte_telemetry.so.24(DPDK_24)(64bit) librte_telemetry.so.24(EXPERIMENTAL)(64bit) librte_telemetry.so.24(INTERNAL)(64bit) librte_timer.so.24()(64bit) librte_vhost.so.24()(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) 
libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libssl.so.3(OPENSSL_3.0.0)(64bit) libstdc++.so.6()(64bit) libstdc++.so.6(CXXABI_1.3)(64bit) libstdc++.so.6(GLIBCXX_3.4)(64bit) libuuid.so.1()(64bit) libuuid.so.1(UUID_1.0)(64bit) libuuid.so.1(UUID_2.31)(64bit) rtld(GNU_HASH) 00:49:05.714 Processing files: spdk-scripts-test_gen_spec-1.x86_64 00:49:05.714 warning: absolute symlink: /etc/bash_completion.d/spdk -> /usr/libexec/spdk/scripts/bash-completion/spdk 00:49:05.714 warning: absolute symlink: /usr/libexec/spdk/include -> /usr/local/include 00:49:06.646 Provides: spdk-scripts = test_gen_spec-1 spdk-scripts(x86-64) = test_gen_spec-1 00:49:06.646 Requires(interp): /bin/sh 00:49:06.646 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:49:06.646 Requires(post): /bin/sh 00:49:06.647 Requires: /bin/bash /usr/bin/env 00:49:06.647 Checking for unpackaged file(s): /usr/lib/rpm/check-files /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64 00:49:16.670 Wrote: /home/vagrant/rpmbuild/SRPMS/spdk-test_gen_spec-1.src.rpm 00:49:16.929 Wrote: /home/vagrant/rpmbuild/RPMS/x86_64/spdk-scripts-test_gen_spec-1.x86_64.rpm 00:49:18.831 Wrote: /home/vagrant/rpmbuild/RPMS/x86_64/spdk-test_gen_spec-1.x86_64.rpm 00:49:28.890 Wrote: /home/vagrant/rpmbuild/RPMS/x86_64/spdk-devel-test_gen_spec-1.x86_64.rpm 00:49:28.890 Executing(%clean): /bin/sh -e /var/tmp/rpm-tmp.DvA3ik 00:49:28.890 + umask 022 00:49:28.890 + cd /home/vagrant/rpmbuild/BUILD 00:49:28.890 + cd spdk-test_gen_spec 00:49:28.890 + /usr/bin/rm -rf /home/vagrant/rpmbuild/BUILDROOT/spdk-test_gen_spec-1.x86_64 00:49:28.890 + RPM_EC=0 00:49:28.890 ++ jobs -p 00:49:28.890 + exit 0 00:49:28.890 12:51:51 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@180 -- $ rm -rf /home/vagrant/rpmbuild/BUILD 00:49:29.148 12:51:52 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@181 -- $ install_uninstall_rpms /home/vagrant/rpmbuild/RPMS 00:49:29.148 12:51:52 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@98 -- $ local rpms 00:49:29.148 12:51:52 
packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@100 -- $ rpms=("${1:-$builddir/rpm/}/$arch/"*.rpm) 00:49:29.148 12:51:52 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@103 -- $ make -C /home/vagrant/spdk_repo/spdk clean -j10 00:49:29.148 make: Entering directory '/home/vagrant/spdk_repo/spdk' 00:49:29.406 make[1]: Nothing to be done for 'clean'. 00:49:35.999 make[1]: Nothing to be done for 'clean'. 00:49:36.258 make: Leaving directory '/home/vagrant/spdk_repo/spdk' 00:49:36.258 12:51:59 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@105 -- $ sudo rpm -i /home/vagrant/rpmbuild/RPMS/x86_64/spdk-devel-test_gen_spec-1.x86_64.rpm /home/vagrant/rpmbuild/RPMS/x86_64/spdk-scripts-test_gen_spec-1.x86_64.rpm /home/vagrant/rpmbuild/RPMS/x86_64/spdk-test_gen_spec-1.x86_64.rpm 00:49:36.258 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:49:37.633 12:52:01 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@108 -- $ LIST_LIBS=yes 00:49:37.633 12:52:01 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@108 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm-deps.sh spdk_tgt 00:49:37.633 /usr/local/bin/spdk_tgt 00:49:40.164 /usr/lib64/libaio.so.1:libaio-0.3.111-13.el9.x86_64 00:49:40.164 /usr/lib64/libc.so.6:glibc-2.34-83.el9.12.x86_64 00:49:40.164 /usr/lib64/libcrypto.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:49:40.164 /usr/lib64/libfuse3.so.3:fuse3-libs-3.10.2-6.el9.x86_64 00:49:40.164 /usr/lib64/libgcc_s.so.1:libgcc-11.4.1-2.1.el9.x86_64 00:49:40.164 /usr/lib64/libkeyutils.so.1:keyutils-libs-1.6.3-1.el9.x86_64 00:49:40.164 /usr/lib64/libm.so.6:glibc-2.34-83.el9.12.x86_64 00:49:40.164 /usr/lib64/libnuma.so.1:numactl-libs-2.0.16-1.el9.x86_64 00:49:40.164 /usr/lib64/libssl.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:49:40.164 /usr/lib64/libuuid.so.1:libuuid-2.37.4-15.el9.x86_64 00:49:40.164 /usr/lib64/libz.so.1:zlib-1.2.11-40.el9.x86_64 00:49:40.164 /usr/local/lib/libisal_crypto.so.2:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_bus_pci.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_cryptodev.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_dmadev.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_eal.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_ethdev.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_hash.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_kvargs.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_log.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_mbuf.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_mempool.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_mempool_ring.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_meter.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_net.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_pci.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_power.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_rcu.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_ring.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_telemetry.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 
/usr/local/lib/librte_timer.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/librte_vhost.so.24:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_accel.so.15.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_accel_error.so.2.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_accel_ioat.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev.so.15.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_aio.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_delay.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_error.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_ftl.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_gpt.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_lvol.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_malloc.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_null.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_nvme.so.7.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_passthru.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_raid.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_split.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_virtio.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_bdev_zone_block.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_blob.so.11.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_blob_bdev.so.11.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_blobfs.so.10.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_blobfs_bdev.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_conf.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_dma.so.4.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_env_dpdk.so.14.1:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_env_dpdk_rpc.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event.so.13.1:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_accel.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_bdev.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_iobuf.so.3.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_iscsi.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_keyring.so.1.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_nbd.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_nvmf.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_scheduler.so.4.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_scsi.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_sock.so.5.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_vhost_blk.so.3.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 
/usr/local/lib/libspdk_event_vhost_scsi.so.3.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_event_vmd.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_ftl.so.9.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_init.so.5.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_ioat.so.7.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_iscsi.so.8.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_json.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_jsonrpc.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_keyring.so.1.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_keyring_file.so.1.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_keyring_linux.so.1.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_log.so.7.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_lvol.so.10.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_nbd.so.7.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_notify.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_nvme.so.13.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_nvmf.so.18.1:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_rpc.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_scheduler_dpdk_governor.so.4.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_scheduler_dynamic.so.4.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_scheduler_gscheduler.so.4.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_scsi.so.9.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_sock.so.9.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_sock_posix.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_thread.so.10.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_trace.so.10.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_util.so.9.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_vfio_user.so.5.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_vhost.so.8.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_virtio.so.7.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 /usr/local/lib/libspdk_vmd.so.6.0:spdk-devel-test_gen_spec-1.x86_64 00:49:40.164 12:52:03 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@109 -- $ rm /home/vagrant/rpmbuild/RPMS/x86_64/spdk-devel-test_gen_spec-1.x86_64.rpm /home/vagrant/rpmbuild/RPMS/x86_64/spdk-scripts-test_gen_spec-1.x86_64.rpm /home/vagrant/rpmbuild/RPMS/x86_64/spdk-test_gen_spec-1.x86_64.rpm 00:49:40.164 12:52:03 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]##*/}") 00:49:40.164 12:52:03 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]%.rpm}") 00:49:40.164 12:52:03 packaging.rpm_packaging.build_rpm_from_gen_spec -- rpm/rpm.sh@111 -- $ sudo rpm -e spdk-devel-test_gen_spec-1.x86_64 spdk-scripts-test_gen_spec-1.x86_64 spdk-test_gen_spec-1.x86_64 00:49:40.164 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:49:40.164 00:49:40.164 real 3m1.203s 00:49:40.164 user 8m26.816s 
00:49:40.164 sys 3m40.581s 00:49:40.165 12:52:03 packaging.rpm_packaging.build_rpm_from_gen_spec -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:49:40.165 12:52:03 packaging.rpm_packaging.build_rpm_from_gen_spec -- common/autotest_common.sh@10 -- $ set +x 00:49:40.165 ************************************ 00:49:40.165 END TEST build_rpm_from_gen_spec 00:49:40.165 ************************************ 00:49:40.165 12:52:03 packaging.rpm_packaging -- rpm/rpm.sh@198 -- $ (( RUN_NIGHTLY == 1 )) 00:49:40.165 12:52:03 packaging.rpm_packaging -- rpm/rpm.sh@199 -- $ run_test build_shared_rpm_with_rpmed_dpdk build_rpm_with_rpmed_dpdk 00:49:40.165 12:52:03 packaging.rpm_packaging -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:49:40.165 12:52:03 packaging.rpm_packaging -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:49:40.165 12:52:03 packaging.rpm_packaging -- common/autotest_common.sh@10 -- $ set +x 00:49:40.165 ************************************ 00:49:40.165 START TEST build_shared_rpm_with_rpmed_dpdk 00:49:40.165 ************************************ 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- common/autotest_common.sh@1124 -- $ build_rpm_with_rpmed_dpdk 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@134 -- $ dpdk_rpms=() 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@134 -- $ local es=0 dpdk_rpms 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@136 -- $ dpdk_rpms=(/var/spdk/dependencies/autotest/dpdk/dpdk?(-devel).rpm) 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@137 -- $ (( 2 == 2 )) 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@138 -- $ rpm -q '--queryformat=%{VERSION}' /var/spdk/dependencies/autotest/dpdk/dpdk-devel.rpm 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@138 -- $ echo 'INFO: Installing DPDK from local package: 22.11' 00:49:40.165 INFO: Installing DPDK from local package: 22.11 00:49:40.165 12:52:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@139 -- $ sudo rpm -i /var/spdk/dependencies/autotest/dpdk/dpdk-devel.rpm /var/spdk/dependencies/autotest/dpdk/dpdk.rpm 00:49:40.165 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:49:41.105 12:52:04 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@144 -- $ build_rpm --with-shared --with-dpdk 00:49:41.105 12:52:04 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@116 -- $ GEN_SPEC=yes 00:49:41.105 12:52:04 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@116 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm.sh --with-shared --with-dpdk 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 Name: spdk 00:49:41.363 Version: v24.09 00:49:41.363 Release: 1 00:49:41.363 Summary: Storage Performance Development Kit 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 00:49:41.363 Requires: glibc 00:49:41.363 Requires: libaio 00:49:41.363 Requires: libgcc 00:49:41.363 Requires: libstdc++ 00:49:41.363 Requires: libuuid 00:49:41.363 Requires: ncurses-libs 00:49:41.363 Requires: numactl-libs 00:49:41.363 Requires: openssl-libs 00:49:41.363 
Requires: zlib 00:49:41.363 00:49:41.363 00:49:41.363 Requires: dpdk-devel >= 22.11 00:49:41.363 00:49:41.363 00:49:41.363 BuildRequires: python3-devel 00:49:41.363 00:49:41.363 00:49:41.363 BuildRequires: dpdk-devel >= 22.11 00:49:41.363 00:49:41.363 00:49:41.363 License: BSD 00:49:41.363 URL: https://spdk.io 00:49:41.363 Source: spdk-v24.09.tar.gz 00:49:41.363 00:49:41.363 %description 00:49:41.363 00:49:41.363 The Storage Performance Development Kit (SPDK) provides a set of tools and libraries for 00:49:41.363 writing high performance, scalable, user-mode storage applications. It achieves high 00:49:41.363 performance by moving all of the necessary drivers into userspace and operating in a 00:49:41.363 polled mode instead of relying on interrupts, which avoids kernel context switches and 00:49:41.363 eliminates interrupt handling overhead. 00:49:41.363 00:49:41.363 %prep 00:49:41.363 make clean -j10 &>/dev/null || : 00:49:41.363 %setup 00:49:41.363 00:49:41.363 %build 00:49:41.363 set +x 00:49:41.363 00:49:41.363 cfs() { 00:49:41.363 (($# > 1)) || return 0 00:49:41.363 00:49:41.363 local dst=$1 f 00:49:41.363 00:49:41.363 mkdir -p "$dst" 00:49:41.363 shift; for f; do [[ -e $f ]] && cp -a "$f" "$dst"; done 00:49:41.363 } 00:49:41.363 00:49:41.363 cl() { 00:49:41.363 [[ -e $2 ]] || return 0 00:49:41.363 00:49:41.363 cfs "$1" $(find "$2" -name '*.so*' -type f -o -type l | grep -v .symbols) 00:49:41.363 } 00:49:41.363 00:49:41.363 00:49:41.363 # Rely mainly on CONFIG 00:49:41.363 git submodule update --init 00:49:41.363 ./configure --disable-unit-tests --disable-tests --with-shared --with-dpdk 00:49:41.363 make -j10 00:49:41.363 make DESTDIR=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 install -j10 00:49:41.363 # DPDK always builds both static and shared, so we need to remove one or the other 00:49:41.363 # SPDK always builds static, so remove it if we want shared. 
00:49:41.363 rm -f /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/lib*.a 00:49:41.363 00:49:41.363 # The ISA-L install may have installed some binaries that we do not want to package 00:49:41.363 rm -f /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/igzip 00:49:41.363 rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/man 00:49:41.363 00:49:41.363 # Include libvfio-user libs in case --with-vfio-user is in use together with --with-shared 00:49:41.363 00:49:41.363 # And some useful setup scripts SPDK uses 00:49:41.363 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk 00:49:41.363 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/bash_completion.d 00:49:41.363 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/profile.d 00:49:41.363 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/ld.so.conf.d 00:49:41.363 00:49:41.363 cat <<-EOF > /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/ld.so.conf.d/spdk.conf 00:49:41.363 /usr/local/lib 00:49:41.363 /usr/local/lib/dpdk 00:49:41.363 /usr/local/lib/libvfio-user 00:49:41.363 EOF 00:49:41.363 00:49:41.363 cat <<-'EOF' > /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/profile.d/spdk_path.sh 00:49:41.363 PATH=$PATH:/usr/libexec/spdk/scripts 00:49:41.364 PATH=$PATH:/usr/libexec/spdk/scripts/vagrant 00:49:41.364 PATH=$PATH:/usr/libexec/spdk/test/common/config 00:49:41.364 export PATH 00:49:41.364 EOF 00:49:41.364 00:49:41.364 cfs /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk scripts 00:49:41.364 ln -s /usr/libexec/spdk/scripts/bash-completion/spdk /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/bash_completion.d/ 00:49:41.364 00:49:41.364 # We need to take into account that most of the scripts depend on being 00:49:41.364 # run directly from the repo. To work around this, create a common root space under a dir 00:49:41.364 # like /usr/libexec/spdk and link all potential relative paths the scripts may try 00:49:41.364 # to reference.
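# [Editorial sketch, not emitted by rpm.sh] The comment above refers to the symlink trick finished just below:
# with /usr/libexec/spdk/include pointing at /usr/local/include (where the devel package drops the SPDK
# headers), a packaged script such as /usr/libexec/spdk/scripts/setup.sh can still resolve its usual
# repo-relative path after installation, e.g.:
#   readlink -f /usr/libexec/spdk/include          # -> /usr/local/include
#   ls /usr/libexec/spdk/include/spdk/pci_ids.h    # reachable post-install
# Paths here are assumptions based on the ln -s and %files entries in this spec, not output from this run.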
00:49:41.364 00:49:41.364 # setup.sh uses pci_ids.h 00:49:41.364 ln -s /usr/local/include /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk 00:49:41.364 00:49:41.364 %files 00:49:41.364 /usr/local/bin/* 00:49:41.364 /usr/local/lib/python3.9/site-packages/spdk*/* 00:49:41.364 00:49:41.364 %package devel 00:49:41.364 00:49:41.364 Summary: SPDK development libraries and headers 00:49:41.364 00:49:41.364 00:49:41.364 00:49:41.364 00:49:41.364 00:49:41.364 %description devel 00:49:41.364 00:49:41.364 SPDK development libraries and headers 00:49:41.364 00:49:41.364 00:49:41.364 00:49:41.364 00:49:41.364 00:49:41.364 %files devel 00:49:41.364 /usr/local/include/* 00:49:41.364 /usr/local/lib/pkgconfig/*.pc 00:49:41.364 /usr/local/lib/*.la 00:49:41.364 /usr/local/lib/*.so* 00:49:41.364 /etc/ld.so.conf.d/spdk.conf 00:49:41.364 00:49:41.364 %post devel 00:49:41.364 ldconfig 00:49:41.364 00:49:41.364 %package scripts 00:49:41.364 Summary: SPDK scripts and utilities 00:49:41.364 00:49:41.364 %description scripts 00:49:41.364 SPDK scripts and utilities 00:49:41.364 00:49:41.364 %files scripts 00:49:41.364 /usr/libexec/spdk/* 00:49:41.364 /etc/profile.d/* 00:49:41.364 /etc/bash_completion.d/* 00:49:41.364 00:49:41.364 %post scripts 00:49:41.364 ldconfig 00:49:41.364 00:49:41.364 %changelog 00:49:41.364 * Tue Feb 16 2021 Michal Berger 00:49:41.364 - Initial RPM .spec for the SPDK 00:49:41.364 12:52:04 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@118 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm.sh --with-shared --with-dpdk 00:49:41.622 * Starting rpmbuild... 00:49:41.622 setting SOURCE_DATE_EPOCH=1613433600 00:49:41.622 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.u6vZ3T 00:49:41.622 + umask 022 00:49:41.622 + cd /home/vagrant/spdk_repo/spdk 00:49:41.622 + make clean -j10 00:49:49.808 + RPM_EC=0 00:49:49.808 ++ jobs -p 00:49:49.808 + exit 0 00:49:49.808 Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.aqMsw6 00:49:49.808 + umask 022 00:49:49.808 + cd /home/vagrant/spdk_repo/spdk 00:49:49.808 + set +x 00:49:49.808 Using DPDK lib dir /usr/lib64 00:49:49.808 Using default SPDK env in /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:50:04.756 Configuring ISA-L (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal.log)...done. 00:50:16.952 Configuring ISA-L-crypto (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal-crypto.log)...done. 00:50:16.952 Creating mk/config.mk...done. 00:50:16.952 Creating mk/cc.flags.mk...done. 00:50:16.952 Type 'make' to build. 00:50:16.952 make[1]: Nothing to be done for 'all'.
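# [Editorial sketch] The spec above pairs /etc/ld.so.conf.d/spdk.conf with an ldconfig run in %post devel
# so that consumers find the installed libspdk_*.so without LD_LIBRARY_PATH. A minimal way to check that
# wiring once the RPMs are built; the package file names are assumptions derived from the spec header
# (Name: spdk, Version: v24.09, Release: 1) and a default ~/rpmbuild output dir, not taken from this log:
#   sudo rpm -i ~/rpmbuild/RPMS/x86_64/spdk-v24.09-1.x86_64.rpm \
#               ~/rpmbuild/RPMS/x86_64/spdk-devel-v24.09-1.x86_64.rpm
#   # %post devel ran ldconfig, so the linker cache should list the SPDK
#   # shared objects picked up from /usr/local/lib:
#   ldconfig -p | grep libspdk_log.so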
00:50:55.675 CC lib/ut_mock/mock.o 00:50:55.675 CC lib/log/log.o 00:50:55.675 CC lib/log/log_flags.o 00:50:55.675 CC lib/log/log_deprecated.o 00:50:55.675 LIB libspdk_ut_mock.a 00:50:55.675 SO libspdk_ut_mock.so.6.0 00:50:55.675 LIB libspdk_log.a 00:50:55.675 SO libspdk_log.so.7.0 00:50:55.675 SYMLINK libspdk_ut_mock.so 00:50:55.675 SYMLINK libspdk_log.so 00:50:55.675 CC lib/util/base64.o 00:50:55.675 CC lib/util/bit_array.o 00:50:55.675 CC lib/util/crc32.o 00:50:55.675 CC lib/util/crc16.o 00:50:55.675 CC lib/util/cpuset.o 00:50:55.675 CC lib/util/crc32c.o 00:50:55.675 CC lib/ioat/ioat.o 00:50:55.675 CXX lib/trace_parser/trace.o 00:50:55.675 CC lib/dma/dma.o 00:50:55.675 CC lib/util/crc32_ieee.o 00:50:55.675 CC lib/util/crc64.o 00:50:55.675 CC lib/vfio_user/host/vfio_user_pci.o 00:50:55.675 CC lib/vfio_user/host/vfio_user.o 00:50:55.675 CC lib/util/dif.o 00:50:55.675 LIB libspdk_dma.a 00:50:55.675 CC lib/util/fd.o 00:50:55.675 SO libspdk_dma.so.4.0 00:50:55.675 CC lib/util/file.o 00:50:55.675 CC lib/util/hexlify.o 00:50:55.675 CC lib/util/iov.o 00:50:55.675 SYMLINK libspdk_dma.so 00:50:55.675 CC lib/util/math.o 00:50:55.675 LIB libspdk_ioat.a 00:50:55.675 SO libspdk_ioat.so.7.0 00:50:55.675 CC lib/util/pipe.o 00:50:55.675 SYMLINK libspdk_ioat.so 00:50:55.675 CC lib/util/strerror_tls.o 00:50:55.675 LIB libspdk_vfio_user.a 00:50:55.675 CC lib/util/uuid.o 00:50:55.675 CC lib/util/string.o 00:50:55.675 CC lib/util/fd_group.o 00:50:55.675 CC lib/util/xor.o 00:50:55.675 SO libspdk_vfio_user.so.5.0 00:50:55.675 CC lib/util/zipf.o 00:50:55.675 SYMLINK libspdk_vfio_user.so 00:50:55.675 LIB libspdk_trace_parser.a 00:50:55.932 SO libspdk_trace_parser.so.5.0 00:50:55.932 SYMLINK libspdk_trace_parser.so 00:50:56.191 LIB libspdk_util.a 00:50:56.191 SO libspdk_util.so.9.0 00:50:56.191 SYMLINK libspdk_util.so 00:50:56.449 CC lib/env_dpdk/memory.o 00:50:56.449 CC lib/env_dpdk/env.o 00:50:56.449 CC lib/env_dpdk/pci.o 00:50:56.449 CC lib/env_dpdk/pci_ioat.o 00:50:56.449 CC lib/env_dpdk/init.o 00:50:56.449 CC lib/json/json_parse.o 00:50:56.449 CC lib/env_dpdk/threads.o 00:50:56.449 CC lib/conf/conf.o 00:50:56.449 CC lib/env_dpdk/pci_virtio.o 00:50:56.449 CC lib/vmd/vmd.o 00:50:56.706 CC lib/json/json_util.o 00:50:56.706 CC lib/json/json_write.o 00:50:56.706 CC lib/env_dpdk/pci_vmd.o 00:50:56.706 LIB libspdk_conf.a 00:50:56.706 CC lib/env_dpdk/pci_idxd.o 00:50:56.706 SO libspdk_conf.so.6.0 00:50:56.706 CC lib/vmd/led.o 00:50:56.706 CC lib/env_dpdk/pci_event.o 00:50:56.706 CC lib/env_dpdk/sigbus_handler.o 00:50:56.706 SYMLINK libspdk_conf.so 00:50:56.706 CC lib/env_dpdk/pci_dpdk.o 00:50:56.965 CC lib/env_dpdk/pci_dpdk_2207.o 00:50:56.965 CC lib/env_dpdk/pci_dpdk_2211.o 00:50:56.965 LIB libspdk_vmd.a 00:50:57.224 SO libspdk_vmd.so.6.0 00:50:57.224 SYMLINK libspdk_vmd.so 00:50:57.224 LIB libspdk_json.a 00:50:57.224 SO libspdk_json.so.6.0 00:50:57.224 SYMLINK libspdk_json.so 00:50:57.483 LIB libspdk_env_dpdk.a 00:50:57.483 SO libspdk_env_dpdk.so.14.1 00:50:57.483 CC lib/jsonrpc/jsonrpc_server.o 00:50:57.483 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:50:57.483 CC lib/jsonrpc/jsonrpc_client.o 00:50:57.483 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:50:57.741 SYMLINK libspdk_env_dpdk.so 00:50:57.741 LIB libspdk_jsonrpc.a 00:50:57.741 SO libspdk_jsonrpc.so.6.0 00:50:57.741 SYMLINK libspdk_jsonrpc.so 00:50:58.308 CC lib/rpc/rpc.o 00:50:58.567 LIB libspdk_rpc.a 00:50:58.567 SO libspdk_rpc.so.6.0 00:50:58.567 SYMLINK libspdk_rpc.so 00:50:58.825 CC lib/notify/notify.o 00:50:58.825 CC lib/notify/notify_rpc.o 00:50:58.825 CC 
lib/trace/trace.o 00:50:58.825 CC lib/trace/trace_rpc.o 00:50:58.825 CC lib/trace/trace_flags.o 00:50:58.825 CC lib/keyring/keyring.o 00:50:58.825 CC lib/keyring/keyring_rpc.o 00:50:59.083 LIB libspdk_notify.a 00:50:59.083 SO libspdk_notify.so.6.0 00:50:59.083 LIB libspdk_keyring.a 00:50:59.083 SYMLINK libspdk_notify.so 00:50:59.083 SO libspdk_keyring.so.1.0 00:50:59.083 LIB libspdk_trace.a 00:50:59.083 SYMLINK libspdk_keyring.so 00:50:59.083 SO libspdk_trace.so.10.0 00:50:59.340 SYMLINK libspdk_trace.so 00:50:59.607 CC lib/thread/thread.o 00:50:59.607 CC lib/thread/iobuf.o 00:50:59.607 CC lib/sock/sock.o 00:50:59.607 CC lib/sock/sock_rpc.o 00:51:00.177 LIB libspdk_sock.a 00:51:00.177 SO libspdk_sock.so.9.0 00:51:00.177 SYMLINK libspdk_sock.so 00:51:00.434 CC lib/nvme/nvme_ctrlr_cmd.o 00:51:00.434 CC lib/nvme/nvme_ctrlr.o 00:51:00.434 CC lib/nvme/nvme_ns.o 00:51:00.434 CC lib/nvme/nvme_fabric.o 00:51:00.434 CC lib/nvme/nvme_pcie_common.o 00:51:00.434 CC lib/nvme/nvme_pcie.o 00:51:00.434 CC lib/nvme/nvme_qpair.o 00:51:00.434 CC lib/nvme/nvme_ns_cmd.o 00:51:00.434 CC lib/nvme/nvme.o 00:51:00.693 LIB libspdk_thread.a 00:51:00.693 SO libspdk_thread.so.10.0 00:51:00.693 SYMLINK libspdk_thread.so 00:51:00.693 CC lib/nvme/nvme_quirks.o 00:51:00.951 CC lib/nvme/nvme_transport.o 00:51:01.210 CC lib/nvme/nvme_discovery.o 00:51:01.210 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:51:01.469 CC lib/accel/accel.o 00:51:01.469 CC lib/accel/accel_rpc.o 00:51:01.469 CC lib/blob/blobstore.o 00:51:01.469 CC lib/accel/accel_sw.o 00:51:01.469 CC lib/blob/request.o 00:51:01.729 CC lib/init/json_config.o 00:51:01.729 CC lib/blob/zeroes.o 00:51:01.729 CC lib/blob/blob_bs_dev.o 00:51:01.729 CC lib/init/subsystem.o 00:51:01.729 CC lib/init/subsystem_rpc.o 00:51:01.729 CC lib/init/rpc.o 00:51:01.729 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:51:01.729 CC lib/nvme/nvme_tcp.o 00:51:01.988 CC lib/nvme/nvme_opal.o 00:51:01.988 CC lib/nvme/nvme_io_msg.o 00:51:01.988 CC lib/nvme/nvme_poll_group.o 00:51:01.988 LIB libspdk_init.a 00:51:01.988 SO libspdk_init.so.5.0 00:51:01.988 SYMLINK libspdk_init.so 00:51:01.988 CC lib/nvme/nvme_zns.o 00:51:02.247 CC lib/virtio/virtio.o 00:51:02.504 CC lib/nvme/nvme_stubs.o 00:51:02.504 CC lib/nvme/nvme_auth.o 00:51:02.504 CC lib/virtio/virtio_vhost_user.o 00:51:02.504 CC lib/nvme/nvme_cuse.o 00:51:02.504 LIB libspdk_accel.a 00:51:02.504 SO libspdk_accel.so.15.0 00:51:02.504 CC lib/event/app.o 00:51:02.504 SYMLINK libspdk_accel.so 00:51:02.504 CC lib/event/reactor.o 00:51:02.763 CC lib/event/log_rpc.o 00:51:02.763 CC lib/virtio/virtio_vfio_user.o 00:51:02.763 CC lib/bdev/bdev.o 00:51:02.763 CC lib/bdev/bdev_rpc.o 00:51:03.021 CC lib/bdev/bdev_zone.o 00:51:03.021 CC lib/bdev/part.o 00:51:03.021 CC lib/event/app_rpc.o 00:51:03.021 CC lib/virtio/virtio_pci.o 00:51:03.279 CC lib/bdev/scsi_nvme.o 00:51:03.279 CC lib/event/scheduler_static.o 00:51:03.279 LIB libspdk_nvme.a 00:51:03.279 LIB libspdk_virtio.a 00:51:03.279 SO libspdk_nvme.so.13.0 00:51:03.279 LIB libspdk_event.a 00:51:03.279 SO libspdk_virtio.so.7.0 00:51:03.279 SO libspdk_event.so.13.1 00:51:03.538 SYMLINK libspdk_event.so 00:51:03.538 SYMLINK libspdk_virtio.so 00:51:03.538 SYMLINK libspdk_nvme.so 00:51:04.475 LIB libspdk_blob.a 00:51:04.475 SO libspdk_blob.so.11.0 00:51:04.734 SYMLINK libspdk_blob.so 00:51:04.992 CC lib/lvol/lvol.o 00:51:04.992 CC lib/blobfs/blobfs.o 00:51:04.992 CC lib/blobfs/tree.o 00:51:05.559 LIB libspdk_bdev.a 00:51:05.559 SO libspdk_bdev.so.15.0 00:51:05.559 SYMLINK libspdk_bdev.so 00:51:05.559 LIB libspdk_blobfs.a 
00:51:05.559 SO libspdk_blobfs.so.10.0 00:51:05.817 CC lib/nvmf/ctrlr.o 00:51:05.817 SYMLINK libspdk_blobfs.so 00:51:05.817 CC lib/nvmf/ctrlr_discovery.o 00:51:05.817 CC lib/scsi/dev.o 00:51:05.817 CC lib/nvmf/subsystem.o 00:51:05.817 CC lib/scsi/lun.o 00:51:05.817 CC lib/nvmf/ctrlr_bdev.o 00:51:05.817 CC lib/scsi/port.o 00:51:05.817 CC lib/ftl/ftl_core.o 00:51:05.817 CC lib/nbd/nbd.o 00:51:05.817 LIB libspdk_lvol.a 00:51:05.817 SO libspdk_lvol.so.10.0 00:51:05.817 SYMLINK libspdk_lvol.so 00:51:05.817 CC lib/nbd/nbd_rpc.o 00:51:06.077 CC lib/ftl/ftl_init.o 00:51:06.077 CC lib/ftl/ftl_layout.o 00:51:06.077 CC lib/ftl/ftl_debug.o 00:51:06.077 CC lib/scsi/scsi.o 00:51:06.077 CC lib/scsi/scsi_bdev.o 00:51:06.077 CC lib/scsi/scsi_pr.o 00:51:06.077 LIB libspdk_nbd.a 00:51:06.335 CC lib/scsi/scsi_rpc.o 00:51:06.335 SO libspdk_nbd.so.7.0 00:51:06.335 CC lib/scsi/task.o 00:51:06.335 CC lib/ftl/ftl_io.o 00:51:06.335 SYMLINK libspdk_nbd.so 00:51:06.335 CC lib/ftl/ftl_sb.o 00:51:06.335 CC lib/ftl/ftl_l2p.o 00:51:06.335 CC lib/nvmf/nvmf.o 00:51:06.335 CC lib/nvmf/nvmf_rpc.o 00:51:06.335 CC lib/ftl/ftl_l2p_flat.o 00:51:06.335 CC lib/nvmf/transport.o 00:51:06.594 CC lib/nvmf/tcp.o 00:51:06.594 CC lib/nvmf/stubs.o 00:51:06.594 CC lib/ftl/ftl_nv_cache.o 00:51:06.594 CC lib/nvmf/mdns_server.o 00:51:06.852 CC lib/nvmf/auth.o 00:51:06.852 LIB libspdk_scsi.a 00:51:06.852 SO libspdk_scsi.so.9.0 00:51:06.852 CC lib/ftl/ftl_band.o 00:51:06.852 CC lib/ftl/ftl_band_ops.o 00:51:06.852 SYMLINK libspdk_scsi.so 00:51:06.852 CC lib/ftl/ftl_writer.o 00:51:06.852 CC lib/ftl/ftl_rq.o 00:51:07.110 CC lib/ftl/ftl_reloc.o 00:51:07.110 CC lib/ftl/ftl_l2p_cache.o 00:51:07.110 CC lib/ftl/ftl_p2l.o 00:51:07.110 CC lib/ftl/mngt/ftl_mngt.o 00:51:07.110 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:51:07.368 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:51:07.368 CC lib/ftl/mngt/ftl_mngt_startup.o 00:51:07.368 CC lib/iscsi/conn.o 00:51:07.368 CC lib/iscsi/init_grp.o 00:51:07.368 CC lib/ftl/mngt/ftl_mngt_md.o 00:51:07.368 CC lib/vhost/vhost.o 00:51:07.368 CC lib/vhost/vhost_rpc.o 00:51:07.368 CC lib/iscsi/iscsi.o 00:51:07.368 CC lib/iscsi/md5.o 00:51:07.626 CC lib/iscsi/param.o 00:51:07.626 CC lib/iscsi/portal_grp.o 00:51:07.626 CC lib/vhost/vhost_scsi.o 00:51:07.626 CC lib/vhost/vhost_blk.o 00:51:07.626 CC lib/ftl/mngt/ftl_mngt_misc.o 00:51:07.884 LIB libspdk_nvmf.a 00:51:07.884 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:51:07.884 SO libspdk_nvmf.so.18.1 00:51:07.884 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:51:07.884 SYMLINK libspdk_nvmf.so 00:51:07.884 CC lib/ftl/mngt/ftl_mngt_band.o 00:51:07.884 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:51:07.884 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:51:07.884 CC lib/iscsi/tgt_node.o 00:51:07.884 CC lib/iscsi/iscsi_subsystem.o 00:51:08.143 CC lib/iscsi/iscsi_rpc.o 00:51:08.143 CC lib/iscsi/task.o 00:51:08.143 CC lib/vhost/rte_vhost_user.o 00:51:08.143 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:51:08.143 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:51:08.400 CC lib/ftl/utils/ftl_conf.o 00:51:08.400 CC lib/ftl/utils/ftl_md.o 00:51:08.400 CC lib/ftl/utils/ftl_mempool.o 00:51:08.400 CC lib/ftl/utils/ftl_bitmap.o 00:51:08.400 CC lib/ftl/utils/ftl_property.o 00:51:08.658 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:51:08.658 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:51:08.658 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:51:08.658 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:51:08.658 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:51:08.658 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:51:08.658 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:51:08.921 CC 
lib/ftl/upgrade/ftl_sb_v3.o 00:51:08.921 CC lib/ftl/upgrade/ftl_sb_v5.o 00:51:08.921 CC lib/ftl/nvc/ftl_nvc_dev.o 00:51:08.921 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:51:08.921 CC lib/ftl/base/ftl_base_dev.o 00:51:08.921 CC lib/ftl/base/ftl_base_bdev.o 00:51:08.921 LIB libspdk_iscsi.a 00:51:08.921 SO libspdk_iscsi.so.8.0 00:51:09.179 LIB libspdk_vhost.a 00:51:09.179 SYMLINK libspdk_iscsi.so 00:51:09.179 SO libspdk_vhost.so.8.0 00:51:09.179 LIB libspdk_ftl.a 00:51:09.179 SYMLINK libspdk_vhost.so 00:51:09.179 SO libspdk_ftl.so.9.0 00:51:09.179 SYMLINK libspdk_ftl.so 00:51:09.745 CC module/env_dpdk/env_dpdk_rpc.o 00:51:09.745 CC module/accel/ioat/accel_ioat.o 00:51:09.745 CC module/accel/ioat/accel_ioat_rpc.o 00:51:09.745 CC module/scheduler/dynamic/scheduler_dynamic.o 00:51:09.745 CC module/sock/posix/posix.o 00:51:09.745 CC module/blob/bdev/blob_bdev.o 00:51:09.745 CC module/keyring/linux/keyring.o 00:51:09.745 CC module/keyring/linux/keyring_rpc.o 00:51:09.745 CC module/keyring/file/keyring.o 00:51:09.745 CC module/accel/error/accel_error.o 00:51:09.745 LIB libspdk_env_dpdk_rpc.a 00:51:09.745 SO libspdk_env_dpdk_rpc.so.6.0 00:51:09.745 CC module/accel/error/accel_error_rpc.o 00:51:09.745 CC module/keyring/file/keyring_rpc.o 00:51:09.745 SYMLINK libspdk_env_dpdk_rpc.so 00:51:10.003 LIB libspdk_keyring_linux.a 00:51:10.003 LIB libspdk_accel_ioat.a 00:51:10.003 SO libspdk_keyring_linux.so.1.0 00:51:10.003 LIB libspdk_scheduler_dynamic.a 00:51:10.003 SO libspdk_accel_ioat.so.6.0 00:51:10.003 SO libspdk_scheduler_dynamic.so.4.0 00:51:10.003 SYMLINK libspdk_keyring_linux.so 00:51:10.003 SYMLINK libspdk_accel_ioat.so 00:51:10.003 LIB libspdk_blob_bdev.a 00:51:10.003 LIB libspdk_keyring_file.a 00:51:10.003 SYMLINK libspdk_scheduler_dynamic.so 00:51:10.003 SO libspdk_keyring_file.so.1.0 00:51:10.003 SO libspdk_blob_bdev.so.11.0 00:51:10.003 LIB libspdk_accel_error.a 00:51:10.004 SO libspdk_accel_error.so.2.0 00:51:10.004 SYMLINK libspdk_keyring_file.so 00:51:10.004 SYMLINK libspdk_blob_bdev.so 00:51:10.004 SYMLINK libspdk_accel_error.so 00:51:10.262 CC module/bdev/passthru/vbdev_passthru.o 00:51:10.262 CC module/bdev/gpt/gpt.o 00:51:10.262 CC module/bdev/error/vbdev_error.o 00:51:10.262 CC module/blobfs/bdev/blobfs_bdev.o 00:51:10.262 CC module/bdev/delay/vbdev_delay.o 00:51:10.262 CC module/bdev/lvol/vbdev_lvol.o 00:51:10.262 CC module/bdev/malloc/bdev_malloc.o 00:51:10.520 CC module/bdev/null/bdev_null.o 00:51:10.520 CC module/bdev/nvme/bdev_nvme.o 00:51:10.520 LIB libspdk_sock_posix.a 00:51:10.520 SO libspdk_sock_posix.so.6.0 00:51:10.520 CC module/bdev/gpt/vbdev_gpt.o 00:51:10.520 SYMLINK libspdk_sock_posix.so 00:51:10.520 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:51:10.520 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:51:10.779 CC module/bdev/null/bdev_null_rpc.o 00:51:10.779 CC module/bdev/error/vbdev_error_rpc.o 00:51:10.779 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:51:10.779 CC module/bdev/malloc/bdev_malloc_rpc.o 00:51:10.779 CC module/bdev/delay/vbdev_delay_rpc.o 00:51:10.779 LIB libspdk_blobfs_bdev.a 00:51:10.779 SO libspdk_blobfs_bdev.so.6.0 00:51:10.779 LIB libspdk_bdev_gpt.a 00:51:10.779 LIB libspdk_bdev_passthru.a 00:51:10.779 SYMLINK libspdk_blobfs_bdev.so 00:51:10.779 CC module/bdev/nvme/bdev_nvme_rpc.o 00:51:10.779 LIB libspdk_bdev_null.a 00:51:10.779 LIB libspdk_bdev_error.a 00:51:10.779 SO libspdk_bdev_gpt.so.6.0 00:51:10.779 SO libspdk_bdev_passthru.so.6.0 00:51:10.779 SO libspdk_bdev_error.so.6.0 00:51:10.779 SO libspdk_bdev_null.so.6.0 00:51:11.038 LIB libspdk_bdev_malloc.a 
00:51:11.038 SO libspdk_bdev_malloc.so.6.0 00:51:11.038 SYMLINK libspdk_bdev_passthru.so 00:51:11.038 CC module/bdev/nvme/nvme_rpc.o 00:51:11.038 SYMLINK libspdk_bdev_gpt.so 00:51:11.038 CC module/bdev/nvme/bdev_mdns_client.o 00:51:11.038 SYMLINK libspdk_bdev_error.so 00:51:11.038 SYMLINK libspdk_bdev_null.so 00:51:11.038 LIB libspdk_bdev_delay.a 00:51:11.038 CC module/bdev/nvme/vbdev_opal.o 00:51:11.038 CC module/bdev/nvme/vbdev_opal_rpc.o 00:51:11.038 SO libspdk_bdev_delay.so.6.0 00:51:11.038 SYMLINK libspdk_bdev_malloc.so 00:51:11.038 LIB libspdk_bdev_lvol.a 00:51:11.038 SYMLINK libspdk_bdev_delay.so 00:51:11.038 SO libspdk_bdev_lvol.so.6.0 00:51:11.038 SYMLINK libspdk_bdev_lvol.so 00:51:11.038 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:51:11.297 CC module/bdev/raid/bdev_raid.o 00:51:11.297 CC module/bdev/split/vbdev_split.o 00:51:11.297 CC module/bdev/split/vbdev_split_rpc.o 00:51:11.297 CC module/bdev/zone_block/vbdev_zone_block.o 00:51:11.297 CC module/bdev/aio/bdev_aio.o 00:51:11.297 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:51:11.297 CC module/bdev/ftl/bdev_ftl.o 00:51:11.297 CC module/bdev/ftl/bdev_ftl_rpc.o 00:51:11.556 CC module/bdev/virtio/bdev_virtio_scsi.o 00:51:11.556 LIB libspdk_bdev_split.a 00:51:11.556 SO libspdk_bdev_split.so.6.0 00:51:11.556 CC module/bdev/virtio/bdev_virtio_blk.o 00:51:11.556 CC module/bdev/virtio/bdev_virtio_rpc.o 00:51:11.556 SYMLINK libspdk_bdev_split.so 00:51:11.556 CC module/bdev/aio/bdev_aio_rpc.o 00:51:11.556 LIB libspdk_bdev_zone_block.a 00:51:11.556 SO libspdk_bdev_zone_block.so.6.0 00:51:11.556 CC module/bdev/raid/bdev_raid_rpc.o 00:51:11.556 CC module/bdev/raid/bdev_raid_sb.o 00:51:11.814 SYMLINK libspdk_bdev_zone_block.so 00:51:11.815 CC module/bdev/raid/raid0.o 00:51:11.815 LIB libspdk_bdev_ftl.a 00:51:11.815 CC module/bdev/raid/raid1.o 00:51:11.815 LIB libspdk_bdev_aio.a 00:51:11.815 SO libspdk_bdev_ftl.so.6.0 00:51:11.815 SO libspdk_bdev_aio.so.6.0 00:51:11.815 SYMLINK libspdk_bdev_aio.so 00:51:11.815 CC module/bdev/raid/concat.o 00:51:11.815 SYMLINK libspdk_bdev_ftl.so 00:51:12.073 LIB libspdk_bdev_virtio.a 00:51:12.073 SO libspdk_bdev_virtio.so.6.0 00:51:12.073 SYMLINK libspdk_bdev_virtio.so 00:51:12.331 LIB libspdk_bdev_raid.a 00:51:12.331 SO libspdk_bdev_raid.so.6.0 00:51:12.331 SYMLINK libspdk_bdev_raid.so 00:51:12.589 LIB libspdk_bdev_nvme.a 00:51:12.589 SO libspdk_bdev_nvme.so.7.0 00:51:12.845 SYMLINK libspdk_bdev_nvme.so 00:51:13.411 CC module/event/subsystems/vmd/vmd.o 00:51:13.411 CC module/event/subsystems/vmd/vmd_rpc.o 00:51:13.411 CC module/event/subsystems/keyring/keyring.o 00:51:13.411 CC module/event/subsystems/sock/sock.o 00:51:13.411 CC module/event/subsystems/scheduler/scheduler.o 00:51:13.411 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:51:13.411 CC module/event/subsystems/iobuf/iobuf.o 00:51:13.411 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:51:13.411 LIB libspdk_event_keyring.a 00:51:13.411 LIB libspdk_event_vmd.a 00:51:13.411 LIB libspdk_event_sock.a 00:51:13.411 SO libspdk_event_vmd.so.6.0 00:51:13.411 SO libspdk_event_keyring.so.1.0 00:51:13.411 LIB libspdk_event_vhost_blk.a 00:51:13.411 LIB libspdk_event_scheduler.a 00:51:13.411 SO libspdk_event_sock.so.5.0 00:51:13.411 LIB libspdk_event_iobuf.a 00:51:13.411 SO libspdk_event_vhost_blk.so.3.0 00:51:13.411 SO libspdk_event_scheduler.so.4.0 00:51:13.411 SYMLINK libspdk_event_keyring.so 00:51:13.411 SO libspdk_event_iobuf.so.3.0 00:51:13.411 SYMLINK libspdk_event_vmd.so 00:51:13.411 SYMLINK libspdk_event_sock.so 00:51:13.411 SYMLINK 
libspdk_event_vhost_blk.so 00:51:13.411 SYMLINK libspdk_event_scheduler.so 00:51:13.411 SYMLINK libspdk_event_iobuf.so 00:51:13.977 CC module/event/subsystems/accel/accel.o 00:51:13.977 LIB libspdk_event_accel.a 00:51:13.977 SO libspdk_event_accel.so.6.0 00:51:13.977 SYMLINK libspdk_event_accel.so 00:51:14.543 CC module/event/subsystems/bdev/bdev.o 00:51:14.543 LIB libspdk_event_bdev.a 00:51:14.543 SO libspdk_event_bdev.so.6.0 00:51:14.543 SYMLINK libspdk_event_bdev.so 00:51:14.801 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:51:14.801 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:51:14.801 CC module/event/subsystems/scsi/scsi.o 00:51:14.801 CC module/event/subsystems/nbd/nbd.o 00:51:15.099 LIB libspdk_event_nbd.a 00:51:15.099 LIB libspdk_event_scsi.a 00:51:15.099 SO libspdk_event_nbd.so.6.0 00:51:15.099 SO libspdk_event_scsi.so.6.0 00:51:15.099 SYMLINK libspdk_event_nbd.so 00:51:15.099 SYMLINK libspdk_event_scsi.so 00:51:15.357 LIB libspdk_event_nvmf.a 00:51:15.357 SO libspdk_event_nvmf.so.6.0 00:51:15.357 SYMLINK libspdk_event_nvmf.so 00:51:15.357 CC module/event/subsystems/iscsi/iscsi.o 00:51:15.357 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:51:15.616 LIB libspdk_event_vhost_scsi.a 00:51:15.616 SO libspdk_event_vhost_scsi.so.3.0 00:51:15.616 LIB libspdk_event_iscsi.a 00:51:15.616 SO libspdk_event_iscsi.so.6.0 00:51:15.616 SYMLINK libspdk_event_vhost_scsi.so 00:51:15.616 SYMLINK libspdk_event_iscsi.so 00:51:15.874 SO libspdk.so.6.0 00:51:15.874 SYMLINK libspdk.so 00:51:15.874 make[1]: Nothing to be done for 'all'. 00:51:16.132 CXX app/trace/trace.o 00:51:16.132 CC examples/vmd/lsvmd/lsvmd.o 00:51:16.132 CC examples/util/zipf/zipf.o 00:51:16.132 CC examples/ioat/perf/perf.o 00:51:16.132 CC examples/nvme/hello_world/hello_world.o 00:51:16.389 CC examples/accel/perf/accel_perf.o 00:51:16.390 CC examples/blob/hello_world/hello_blob.o 00:51:16.390 CC examples/bdev/hello_world/hello_bdev.o 00:51:16.390 CC examples/sock/hello_world/hello_sock.o 00:51:16.390 CC examples/nvmf/nvmf/nvmf.o 00:51:16.390 LINK lsvmd 00:51:16.390 LINK hello_world 00:51:16.390 LINK zipf 00:51:16.648 LINK ioat_perf 00:51:16.648 LINK hello_blob 00:51:16.648 LINK hello_sock 00:51:16.648 LINK hello_bdev 00:51:16.648 LINK spdk_trace 00:51:16.648 CC examples/vmd/led/led.o 00:51:16.648 LINK nvmf 00:51:16.648 CC examples/nvme/reconnect/reconnect.o 00:51:16.905 CC examples/ioat/verify/verify.o 00:51:16.905 CC examples/blob/cli/blobcli.o 00:51:16.905 LINK accel_perf 00:51:16.905 LINK led 00:51:16.905 CC examples/nvme/nvme_manage/nvme_manage.o 00:51:16.905 CC examples/bdev/bdevperf/bdevperf.o 00:51:16.905 CC app/trace_record/trace_record.o 00:51:16.905 CC examples/interrupt_tgt/interrupt_tgt.o 00:51:16.905 LINK verify 00:51:17.164 CC examples/thread/thread/thread_ex.o 00:51:17.164 CC app/nvmf_tgt/nvmf_main.o 00:51:17.164 LINK reconnect 00:51:17.164 CC app/iscsi_tgt/iscsi_tgt.o 00:51:17.164 LINK interrupt_tgt 00:51:17.421 LINK spdk_trace_record 00:51:17.421 LINK thread 00:51:17.421 LINK nvmf_tgt 00:51:17.421 CC examples/nvme/arbitration/arbitration.o 00:51:17.421 LINK iscsi_tgt 00:51:17.421 CC app/spdk_tgt/spdk_tgt.o 00:51:17.421 LINK blobcli 00:51:17.679 CC app/spdk_lspci/spdk_lspci.o 00:51:17.679 CC app/spdk_nvme_perf/perf.o 00:51:17.679 CC app/spdk_nvme_identify/identify.o 00:51:17.679 LINK nvme_manage 00:51:17.679 CC app/spdk_nvme_discover/discovery_aer.o 00:51:17.679 LINK spdk_tgt 00:51:17.679 CC app/spdk_top/spdk_top.o 00:51:17.679 LINK spdk_lspci 00:51:17.937 CC app/vhost/vhost.o 00:51:17.937 LINK arbitration 
00:51:17.937 LINK spdk_nvme_discover 00:51:17.937 LINK bdevperf 00:51:17.937 CC examples/nvme/hotplug/hotplug.o 00:51:17.937 CC examples/nvme/cmb_copy/cmb_copy.o 00:51:17.937 LINK vhost 00:51:17.937 CC app/spdk_dd/spdk_dd.o 00:51:18.195 CC examples/nvme/abort/abort.o 00:51:18.195 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:51:18.195 LINK hotplug 00:51:18.195 LINK cmb_copy 00:51:18.453 LINK pmr_persistence 00:51:18.453 LINK abort 00:51:18.453 LINK spdk_dd 00:51:18.710 LINK spdk_nvme_perf 00:51:18.968 LINK spdk_top 00:51:19.226 LINK spdk_nvme_identify 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/accel.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/accel_module.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/assert.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/barrier.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/base64.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev_module.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev_zone.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bit_array.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bit_pool.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blob.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blobfs.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blobfs_bdev.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/conf.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blob_bdev.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/config.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc16.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/cpuset.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc64.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc32.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/dif.h 00:51:19.494 
INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/dma.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/endian.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/env.h 00:51:19.494 cp /home/vagrant/spdk_repo/spdk/scripts/rpc.py /home/vagrant/spdk_repo/spdk/build/bin/spdk_rpc 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/fd.h 00:51:19.494 cp /home/vagrant/spdk_repo/spdk/scripts/spdkcli.py /home/vagrant/spdk_repo/spdk/build/bin/spdk_cli 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/env_dpdk.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/event.h 00:51:19.494 chmod +x /home/vagrant/spdk_repo/spdk/build/bin/spdk_rpc 00:51:19.494 chmod +x /home/vagrant/spdk_repo/spdk/build/bin/spdk_cli 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/file.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/fd_group.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ftl.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/gpt_spec.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_rpc 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_cli 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/hexlify.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/histogram_data.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/idxd.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/idxd_spec.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/init.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ioat.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ioat_spec.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/iscsi_spec.h 00:51:19.494 patchelf: not an ELF executable 00:51:19.494 patchelf: not an ELF executable 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/json.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/jsonrpc.h 00:51:19.494 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/keyring.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/keyring_module.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/likely.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/log.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/lvol.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/mmio.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nbd.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/notify.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/memory.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_intel.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_ocssd.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_ocssd_spec.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_spec.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_cmd.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_fc_spec.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_spec.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_zns.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_transport.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/opal.h 00:51:19.494 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/pci_ids.h 00:51:19.495 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/queue.h 00:51:19.495 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/pipe.h 00:51:19.495 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/queue_extras.h 00:51:19.495 
INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/opal_spec.h 00:51:19.495 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/reduce.h 00:51:19.495 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/rpc.h 00:51:19.495 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scheduler.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scsi.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scsi_spec.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/stdinc.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/sock.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/string.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/thread.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/trace.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/trace_parser.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/tree.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ublk.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/util.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/uuid.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/version.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfio_user_pci.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfio_user_spec.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vmd.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfu_target.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vhost.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/zipf.h 00:51:19.770 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/xor.h 00:51:19.770 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 00:51:20.028 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 
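# [Editorial note] The two libtool warnings above are expected when installing under a DESTDIR/buildroot:
# 'libtool --finish <dir>' essentially re-runs the platform's library finalization (ldconfig on Linux) for a
# non-default libdir, and the spec's %post ldconfig already covers that on the target system. If finishing
# by hand (hypothetical, outside this automated run):
#   sudo libtool --finish /usr/local/lib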
00:51:20.028 Processing /home/vagrant/spdk_repo/spdk/python 00:51:20.028 DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default. 00:51:20.028 pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555. 00:51:20.286 Using legacy 'setup.py install' for spdk, since package 'wheel' is not installed. 00:51:20.544 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ut_mock.a 00:51:20.544 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ut_mock.pc 00:51:20.544 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_log.a 00:51:20.544 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ut_mock.so 00:51:20.544 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_log.pc 00:51:20.544 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_log.so 00:51:20.802 Installing collected packages: spdk 00:51:20.802 Running setup.py install for spdk: started 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_dma.a 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace_parser.a 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_dma.pc 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ioat.a 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_trace_parser.pc 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ioat.pc 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_util.a 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_dma.so 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace_parser.so 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_util.pc 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ioat.so 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_util.so 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vfio_user.a 00:51:20.802 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vfio_user.pc 00:51:21.060 
INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vfio_user.so 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vmd.a 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vmd.pc 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_conf.a 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_json.a 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_conf.pc 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vmd.so 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_json.pc 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_conf.so 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_dpdklibs.pc 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_json.so 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk.a 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk.pc 00:51:21.060 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk.so 00:51:21.060 Running setup.py install for spdk: finished with status 'done' 00:51:21.060 Successfully installed spdk-24.9rc0 00:51:21.318 rm -rf /home/vagrant/spdk_repo/spdk/python/spdk.egg-info 00:51:21.318 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_jsonrpc.a 00:51:21.318 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_jsonrpc.pc 00:51:21.318 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_jsonrpc.so 00:51:21.884 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_rpc.a 00:51:21.884 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_rpc.pc 00:51:21.884 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_rpc.so 00:51:21.884 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring.a 00:51:21.884 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_notify.a 00:51:21.884 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring.pc 00:51:21.884 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace.a 00:51:22.141 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_notify.pc 00:51:22.141 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_trace.pc 00:51:22.141 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring.so 00:51:22.141 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_notify.so 00:51:22.141 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace.so 00:51:22.399 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_thread.a 00:51:22.399 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock.a 00:51:22.399 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_thread.pc 00:51:22.399 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock.pc 00:51:22.399 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_thread.so 00:51:22.399 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock.so 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob.a 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel.a 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blob.pc 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_init.a 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel.pc 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvme.a 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_virtio.a 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nvme.pc 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_init.pc 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob.so 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_virtio.pc 00:51:22.656 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel.so 00:51:22.657 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvme.so 00:51:22.657 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_init.so 00:51:22.657 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_virtio.so 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs.a 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs.pc 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev.a 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_lvol.a 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev.pc 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs.so 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event.a 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_lvol.pc 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event.pc 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev.so 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_lvol.so 00:51:22.914 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event.so 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scsi.a 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nbd.a 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvmf.a 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nbd.pc 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scsi.pc 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nvmf.pc 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ftl.a 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nbd.so 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scsi.so 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ftl.pc 00:51:23.171 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvmf.so 00:51:23.171 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ftl.so 00:51:23.428 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_iscsi.a 00:51:23.429 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_iscsi.pc 00:51:23.429 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vhost.a 00:51:23.429 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vhost.pc 00:51:23.429 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_iscsi.so 00:51:23.429 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vhost.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk_rpc.pc 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob_bdev.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock_posix.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_posix.pc 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_dynamic.pc 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blob_bdev.pc 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_error.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_file.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_ioat.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_linux.a 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock_posix.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob_bdev.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_error.pc 00:51:24.003 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_file.pc 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_linux.pc 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_ioat.pc 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_error.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_linux.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_ioat.so 00:51:24.003 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_file.so 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_gpt.a 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_gpt.pc 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.a 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_gpt.so 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_passthru.a 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs_bdev.pc 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.so 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_passthru.pc 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_error.a 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_lvol.a 00:51:24.293 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_null.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_error.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_passthru.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_lvol.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_delay.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_nvme.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_raid.a 00:51:24.551 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_null.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_delay.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_lvol.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_raid.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_null.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_nvme.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_error.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_delay.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_raid.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_malloc.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_nvme.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_malloc.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_malloc.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_split.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_split.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_zone_block.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_split.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_aio.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_ftl.a 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_ftl.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_aio.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_virtio.a 00:51:24.551 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_virtio.pc 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_ftl.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_aio.so 00:51:24.551 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_virtio.so 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iobuf.a 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.a 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_sock.a 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iobuf.pc 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scheduler.a 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_blk.pc 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_keyring.a 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_sock.pc 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vmd.a 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scheduler.pc 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_keyring.pc 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iobuf.so 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vmd.pc 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.so 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_sock.so 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_keyring.so 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scheduler.so 00:51:25.117 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vmd.so 00:51:25.376 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_accel.a 00:51:25.376 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_accel.pc 
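The INSTALL records above and below come from SPDK's make install running against the rpm buildroot: every library, pkg-config file, and binary is staged under the buildroot prefix rather than the live filesystem. A minimal sketch of the same DESTDIR staging pattern, with hypothetical paths that are not taken from this log:
# Stage a Makefile-based project into an RPM buildroot (illustrative sketch;
# BUILDROOT and the project name are placeholders, not from this log).
BUILDROOT=$HOME/rpmbuild/BUILDROOT/myproj-1.0-1.x86_64
./configure --prefix=/usr/local
make -j"$(nproc)"
# DESTDIR prefixes every install path with the buildroot, so files land in
# $BUILDROOT/usr/local/lib, $BUILDROOT/usr/local/bin, and so on -- the same
# layout the INSTALL records in this log show.
make install DESTDIR="$BUILDROOT"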
00:51:25.635 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_accel.so 00:51:25.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_bdev.a 00:51:25.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_bdev.pc 00:51:25.894 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_bdev.so 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scsi.a 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scsi.pc 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nbd.a 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nvmf.a 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nbd.pc 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nvmf.pc 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scsi.so 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nbd.so 00:51:26.153 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nvmf.so 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iscsi.a 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iscsi.pc 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.a 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iscsi.so 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_scsi.pc 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.so 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_modules.pc 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_modules.pc 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_modules.pc 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_syslibs.pc 00:51:26.412 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_modules.pc 00:51:26.412 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_modules.pc 00:51:26.671 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk.so 00:51:26.930 make[1]: Nothing to be done for 'install'. 00:51:26.930 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_trace 00:51:27.188 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_trace_record 00:51:27.188 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/iscsi_tgt 00:51:27.188 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/nvmf_tgt 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_lspci 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_tgt 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_discover 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_perf 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_identify 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_top 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_dd 00:51:27.446 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/vhost 00:51:27.446 Installed to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local 00:51:27.446 Processing files: spdk-v24.09-1.x86_64 00:51:27.704 Provides: spdk = v24.09-1 spdk(x86-64) = v24.09-1 00:51:27.704 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:51:27.963 Requires: /usr/bin/env libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.3.3)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.7)(64bit) libcrypto.so.3()(64bit) libfuse3.so.3()(64bit) libgcc_s.so.1()(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libm.so.6()(64bit) libmenu.so.6()(64bit) libncurses.so.6()(64bit) libpanel.so.6()(64bit) librte_bus_pci.so.23()(64bit) librte_cryptodev.so.23()(64bit) librte_dmadev.so.23()(64bit) librte_eal.so.23()(64bit) librte_hash.so.23()(64bit) librte_kvargs.so.23()(64bit) librte_mbuf.so.23()(64bit) librte_mempool.so.23()(64bit) librte_mempool_ring.so.23()(64bit) librte_net.so.23()(64bit) librte_pci.so.23()(64bit) librte_rcu.so.23()(64bit) librte_ring.so.23()(64bit) librte_telemetry.so.23()(64bit) librte_vhost.so.23()(64bit) libspdk_accel.so.15.0()(64bit) 
libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libstdc++.so.6()(64bit) libtinfo.so.6()(64bit) libuuid.so.1()(64bit) rtld(GNU_HASH) 00:51:27.963 Processing files: spdk-devel-v24.09-1.x86_64 00:51:29.865 Provides: libisal_crypto.so.2()(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) 
libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) spdk-devel = v24.09-1 spdk-devel(x86-64) = v24.09-1 00:51:29.865 Requires(interp): /bin/sh 00:51:29.865 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:51:29.865 Requires(post): /bin/sh 00:51:29.865 Requires: ld-linux-x86-64.so.2()(64bit) ld-linux-x86-64.so.2(GLIBC_2.3)(64bit) libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.10)(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.16)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.27)(64bit) libc.so.6(GLIBC_2.28)(64bit) libc.so.6(GLIBC_2.3)(64bit) libc.so.6(GLIBC_2.3.2)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.32)(64bit) libc.so.6(GLIBC_2.33)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.7)(64bit) libc.so.6(GLIBC_2.8)(64bit) libc.so.6(GLIBC_2.9)(64bit) libcrypto.so.3()(64bit) libcrypto.so.3(OPENSSL_3.0.0)(64bit) libfuse3.so.3()(64bit) libfuse3.so.3(FUSE_3.0)(64bit) libfuse3.so.3(FUSE_3.7)(64bit) libgcc_s.so.1()(64bit) libgcc_s.so.1(GCC_3.0)(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libkeyutils.so.1(KEYUTILS_0.3)(64bit) libm.so.6()(64bit) libm.so.6(GLIBC_2.29)(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) 
libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libssl.so.3(OPENSSL_3.0.0)(64bit) libstdc++.so.6()(64bit) libstdc++.so.6(CXXABI_1.3)(64bit) libstdc++.so.6(GLIBCXX_3.4)(64bit) libuuid.so.1()(64bit) libuuid.so.1(UUID_1.0)(64bit) libuuid.so.1(UUID_2.31)(64bit) rtld(GNU_HASH) 00:51:29.865 Processing files: spdk-scripts-v24.09-1.x86_64 00:51:29.865 warning: absolute symlink: /etc/bash_completion.d/spdk -> /usr/libexec/spdk/scripts/bash-completion/spdk 00:51:29.865 warning: absolute symlink: /usr/libexec/spdk/include -> /usr/local/include 00:51:31.239 Provides: spdk-scripts = v24.09-1 spdk-scripts(x86-64) = v24.09-1 00:51:31.239 Requires(interp): /bin/sh 00:51:31.239 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:51:31.239 Requires(post): /bin/sh 00:51:31.239 Requires: /bin/bash /usr/bin/env 00:51:31.239 Checking for unpackaged file(s): /usr/lib/rpm/check-files /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 00:51:31.239 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/srcrpm/spdk-v24.09-1.src.rpm 00:51:31.497 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-scripts-v24.09-1.x86_64.rpm 00:51:32.872 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-v24.09-1.x86_64.rpm 00:51:39.503 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-devel-v24.09-1.x86_64.rpm 00:51:39.503 Executing(%clean): /bin/sh -e /var/tmp/rpm-tmp.7CePgq 00:51:39.503 + umask 022 00:51:39.503 + cd /home/vagrant/spdk_repo/spdk 00:51:39.503 + /usr/bin/rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 00:51:39.503 + RPM_EC=0 00:51:39.503 ++ jobs -p 00:51:39.503 + exit 0 00:51:39.503 Executing(--clean): /bin/sh -e /var/tmp/rpm-tmp.ZCn7Ki 00:51:39.503 + umask 022 00:51:39.503 + cd /home/vagrant/spdk_repo/spdk 00:51:39.503 + RPM_EC=0 00:51:39.503 ++ jobs -p 00:51:39.503 + exit 0 00:51:39.503 12:54:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@120 -- $ [[ -n '' ]] 00:51:39.503 12:54:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@123 -- $ install_uninstall_rpms 00:51:39.503 12:54:03 
packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@98 -- $ local rpms 00:51:39.503 12:54:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@100 -- $ rpms=("${1:-$builddir/rpm/}/$arch/"*.rpm) 00:51:39.503 12:54:03 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@103 -- $ make -C /home/vagrant/spdk_repo/spdk clean -j10 00:51:39.503 make: Entering directory '/home/vagrant/spdk_repo/spdk' 00:51:40.071 make[1]: Nothing to be done for 'clean'. 00:51:46.648 make[1]: Nothing to be done for 'clean'. 00:51:47.247 make: Leaving directory '/home/vagrant/spdk_repo/spdk' 00:51:47.247 12:54:10 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@105 -- $ sudo rpm -i /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-devel-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-scripts-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-v24.09-1.x86_64.rpm 00:51:47.247 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:51:48.182 12:54:11 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@108 -- $ LIST_LIBS=yes 00:51:48.182 12:54:11 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@108 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm-deps.sh spdk_tgt 00:51:48.182 /usr/local/bin/spdk_tgt 00:51:50.084 /usr/lib64/libacl.so.1:libacl-2.3.1-3.el9.x86_64 00:51:50.084 /usr/lib64/libaio.so.1:libaio-0.3.111-13.el9.x86_64 00:51:50.085 /usr/lib64/libarchive.so.13:libarchive-3.5.3-4.el9.x86_64 00:51:50.085 /usr/lib64/libattr.so.1:libattr-2.5.1-3.el9.x86_64 00:51:50.085 /usr/lib64/libbz2.so.1:bzip2-libs-1.0.8-8.el9.x86_64 00:51:50.085 /usr/lib64/libc.so.6:glibc-2.34-83.el9.12.x86_64 00:51:50.085 /usr/lib64/libcrypto.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:51:50.085 /usr/lib64/libfuse3.so.3:fuse3-libs-3.10.2-6.el9.x86_64 00:51:50.085 /usr/lib64/libgcc_s.so.1:libgcc-11.4.1-2.1.el9.x86_64 00:51:50.085 /usr/lib64/libkeyutils.so.1:keyutils-libs-1.6.3-1.el9.x86_64 00:51:50.085 /usr/lib64/liblz4.so.1:lz4-libs-1.9.3-5.el9.x86_64 00:51:50.085 /usr/lib64/liblzma.so.5:xz-libs-5.2.5-8.el9_0.x86_64 00:51:50.085 /usr/lib64/libm.so.6:glibc-2.34-83.el9.12.x86_64 00:51:50.085 /usr/lib64/libnuma.so.1:numactl-libs-2.0.16-1.el9.x86_64 00:51:50.085 /usr/lib64/librte_bus_pci.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_cryptodev.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_dmadev.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_eal.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_hash.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_kvargs.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_mbuf.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_mempool.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_mempool_ring.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_net.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_pci.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_rcu.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_ring.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_telemetry.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/librte_vhost.so.23:dpdk-22.11-4.el9.x86_64 00:51:50.085 /usr/lib64/libssl.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:51:50.085 /usr/lib64/libuuid.so.1:libuuid-2.37.4-15.el9.x86_64 00:51:50.085 
/usr/lib64/libxml2.so.2:libxml2-2.9.13-5.el9_3.x86_64 00:51:50.085 /usr/lib64/libz.so.1:zlib-1.2.11-40.el9.x86_64 00:51:50.085 /usr/lib64/libzstd.so.1:libzstd-1.5.1-2.el9.x86_64 00:51:50.085 /usr/local/lib/libisal_crypto.so.2:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_accel.so.15.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_accel_error.so.2.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_accel_ioat.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev.so.15.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_aio.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_delay.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_error.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_ftl.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_gpt.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_lvol.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_malloc.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_null.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_nvme.so.7.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_passthru.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_raid.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_split.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_virtio.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_bdev_zone_block.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_blob.so.11.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_blob_bdev.so.11.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_blobfs.so.10.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_blobfs_bdev.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_conf.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_dma.so.4.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_env_dpdk.so.14.1:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_env_dpdk_rpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event.so.13.1:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_accel.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_bdev.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_iobuf.so.3.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_iscsi.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_keyring.so.1.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_nbd.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_nvmf.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_scheduler.so.4.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_scsi.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_sock.so.5.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_vhost_blk.so.3.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_vhost_scsi.so.3.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_event_vmd.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 
/usr/local/lib/libspdk_ftl.so.9.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_init.so.5.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_ioat.so.7.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_iscsi.so.8.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_json.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_jsonrpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_keyring.so.1.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_keyring_file.so.1.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_keyring_linux.so.1.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_log.so.7.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_lvol.so.10.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_nbd.so.7.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_notify.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_nvme.so.13.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_nvmf.so.18.1:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_rpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_scheduler_dynamic.so.4.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_scsi.so.9.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_sock.so.9.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_sock_posix.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_thread.so.10.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_trace.so.10.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_util.so.9.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_vfio_user.so.5.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_vhost.so.8.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_virtio.so.7.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 /usr/local/lib/libspdk_vmd.so.6.0:spdk-devel-v24.09-1.x86_64 00:51:50.085 12:54:13 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@109 -- $ rm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-devel-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-scripts-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-v24.09-1.x86_64.rpm 00:51:50.085 12:54:13 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]##*/}") 00:51:50.085 12:54:13 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]%.rpm}") 00:51:50.085 12:54:13 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@111 -- $ sudo rpm -e spdk-devel-v24.09-1.x86_64 spdk-scripts-v24.09-1.x86_64 spdk-v24.09-1.x86_64 00:51:50.085 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:51:50.343 12:54:13 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@124 -- $ [[ -n '' ]] 00:51:50.343 12:54:13 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@146 -- $ (( es == 11 )) 00:51:50.343 12:54:13 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- rpm/rpm.sh@150 -- $ sudo rpm -e dpdk dpdk-devel 00:51:50.343 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- 
rpm/rpm.sh@152 -- $ return 0 00:51:50.912 00:51:50.912 real 2m10.580s 00:51:50.912 user 6m17.200s 00:51:50.912 sys 3m15.406s 00:51:50.912 ************************************ 00:51:50.912 END TEST build_shared_rpm_with_rpmed_dpdk 00:51:50.912 ************************************ 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_rpm_with_rpmed_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:51:50.912 12:54:14 packaging.rpm_packaging -- rpm/rpm.sh@202 -- $ [[ -n v22.11.4 ]] 00:51:50.912 12:54:14 packaging.rpm_packaging -- rpm/rpm.sh@203 -- $ run_test build_shared_native_dpdk_rpm build_shared_native_dpdk_rpm 00:51:50.912 12:54:14 packaging.rpm_packaging -- common/autotest_common.sh@1100 -- $ '[' 2 -le 1 ']' 00:51:50.912 12:54:14 packaging.rpm_packaging -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:51:50.912 12:54:14 packaging.rpm_packaging -- common/autotest_common.sh@10 -- $ set +x 00:51:50.912 ************************************ 00:51:50.912 START TEST build_shared_native_dpdk_rpm 00:51:50.912 ************************************ 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- common/autotest_common.sh@1124 -- $ build_shared_native_dpdk_rpm 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@185 -- $ [[ -e /tmp/spdk-ld-path ]] 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@186 -- $ source /tmp/spdk-ld-path 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- tmp/spdk-ld-path@1 -- $ export LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- tmp/spdk-ld-path@1 -- $ LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- tmp/spdk-ld-path@2 -- $ export PKG_CONFIG_PATH= 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- tmp/spdk-ld-path@2 -- $ PKG_CONFIG_PATH= 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@189 -- $ build_dpdk_rpm /home/vagrant/spdk_repo/dpdk/build 22.11.4 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@50 -- $ local dpdkdir=/home/vagrant/spdk_repo/dpdk/build version=22.11 spec=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/dpdk.spec 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@51 -- $ local dpdkbuildroot=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/dpdk/usr/local/lib dep 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@52 -- $ local srcdir=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/source 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@53 -- $ local rpmdir=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpms 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@54 -- $ local release=1 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@56 -- $ mkdir -p 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/source /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpms 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@59 -- $ : 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@61 -- $ gen_dpdk_spec 22.11 1 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@31 -- $ cat 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@67 -- $ mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/dpdk/usr/local/lib 00:51:50.912 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@69 -- $ [[ -e /home/vagrant/spdk_repo/dpdk/build/lib ]] 00:51:50.913 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@70 -- $ cp -a /home/vagrant/spdk_repo/dpdk/build/lib/dpdk /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_pci.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_pci.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_pci.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_pci.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_vdev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_vdev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_vdev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_bus_vdev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.so.23.0 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.so 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool_ring.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool_ring.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool_ring.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool_ring.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_net_i40e.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_net_i40e.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_net_i40e.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_net_i40e.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_port.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_port.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_port.so.23 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_port.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_table.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_table.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_table.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_table.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.a /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.so /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.so.23 /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.so.23.0 /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/dpdk/usr/local/lib/ 00:51:50.913 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@73 -- $ for dep in isa-l/build/lib intel-ipsec-mb 00:51:50.913 
12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@74 -- $ [[ -e /home/vagrant/spdk_repo/dpdk/build/../isa-l/build/lib ]] 00:51:50.913 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@74 -- $ continue 00:51:50.913 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@73 -- $ for dep in isa-l/build/lib intel-ipsec-mb 00:51:50.913 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@74 -- $ [[ -e /home/vagrant/spdk_repo/dpdk/build/../intel-ipsec-mb ]] 00:51:50.913 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@74 -- $ continue 00:51:50.913 12:54:14 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@81 -- $ rpmbuild --buildroot=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/dpdk -D '_rpmdir /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpms' -D '_sourcedir /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/source' --noclean --nodebuginfo -ba /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/dpdk.spec 00:51:50.913 Processing files: dpdk-devel-22.11-1.x86_64 00:51:53.444 Provides: dpdk-devel = 22.11-1 dpdk-devel(x86-64) = 22.11-1 librte_acl.so.23()(64bit) librte_acl.so.23(DPDK_23)(64bit) librte_bbdev.so.23()(64bit) librte_bbdev.so.23(DPDK_23)(64bit) librte_bbdev.so.23(EXPERIMENTAL)(64bit) librte_bitratestats.so.23()(64bit) librte_bitratestats.so.23(DPDK_23)(64bit) librte_bpf.so.23()(64bit) librte_bpf.so.23(DPDK_23)(64bit) librte_bpf.so.23(EXPERIMENTAL)(64bit) librte_bus_pci.so.23()(64bit) librte_bus_pci.so.23(DPDK_23)(64bit) librte_bus_pci.so.23(EXPERIMENTAL)(64bit) librte_bus_pci.so.23(INTERNAL)(64bit) librte_bus_vdev.so.23()(64bit) librte_bus_vdev.so.23(DPDK_23)(64bit) librte_bus_vdev.so.23(INTERNAL)(64bit) librte_cfgfile.so.23()(64bit) librte_cfgfile.so.23(DPDK_23)(64bit) librte_cmdline.so.23()(64bit) librte_cmdline.so.23(DPDK_23)(64bit) librte_cmdline.so.23(EXPERIMENTAL)(64bit) librte_compressdev.so.23()(64bit) librte_compressdev.so.23(EXPERIMENTAL)(64bit) librte_cryptodev.so.23()(64bit) librte_cryptodev.so.23(DPDK_23)(64bit) librte_cryptodev.so.23(EXPERIMENTAL)(64bit) librte_cryptodev.so.23(INTERNAL)(64bit) librte_distributor.so.23()(64bit) librte_distributor.so.23(DPDK_23)(64bit) librte_dmadev.so.23()(64bit) librte_dmadev.so.23(EXPERIMENTAL)(64bit) librte_dmadev.so.23(INTERNAL)(64bit) librte_eal.so.23()(64bit) librte_eal.so.23(DPDK_23)(64bit) librte_eal.so.23(EXPERIMENTAL)(64bit) librte_eal.so.23(INTERNAL)(64bit) librte_efd.so.23()(64bit) librte_efd.so.23(DPDK_23)(64bit) librte_ethdev.so.23()(64bit) librte_ethdev.so.23(DPDK_23)(64bit) librte_ethdev.so.23(EXPERIMENTAL)(64bit) librte_ethdev.so.23(INTERNAL)(64bit) librte_eventdev.so.23()(64bit) librte_eventdev.so.23(DPDK_23)(64bit) librte_eventdev.so.23(EXPERIMENTAL)(64bit) librte_eventdev.so.23(INTERNAL)(64bit) librte_fib.so.23()(64bit) librte_fib.so.23(DPDK_23)(64bit) librte_gpudev.so.23()(64bit) librte_gpudev.so.23(EXPERIMENTAL)(64bit) librte_gpudev.so.23(INTERNAL)(64bit) librte_graph.so.23()(64bit) librte_graph.so.23(EXPERIMENTAL)(64bit) librte_gro.so.23()(64bit) librte_gro.so.23(DPDK_23)(64bit) librte_gso.so.23()(64bit) librte_gso.so.23(DPDK_23)(64bit) librte_hash.so.23()(64bit) librte_hash.so.23(DPDK_23)(64bit) librte_hash.so.23(EXPERIMENTAL)(64bit) librte_ip_frag.so.23()(64bit) librte_ip_frag.so.23(DPDK_23)(64bit) librte_ip_frag.so.23(EXPERIMENTAL)(64bit) librte_ipsec.so.23()(64bit) librte_ipsec.so.23(DPDK_23)(64bit) librte_ipsec.so.23(EXPERIMENTAL)(64bit) 
librte_jobstats.so.23()(64bit) librte_jobstats.so.23(DPDK_23)(64bit) librte_kvargs.so.23()(64bit) librte_kvargs.so.23(DPDK_23)(64bit) librte_kvargs.so.23(EXPERIMENTAL)(64bit) librte_latencystats.so.23()(64bit) librte_latencystats.so.23(DPDK_23)(64bit) librte_lpm.so.23()(64bit) librte_lpm.so.23(DPDK_23)(64bit) librte_lpm.so.23(EXPERIMENTAL)(64bit) librte_mbuf.so.23()(64bit) librte_mbuf.so.23(DPDK_23)(64bit) librte_mbuf.so.23(EXPERIMENTAL)(64bit) librte_member.so.23()(64bit) librte_member.so.23(DPDK_23)(64bit) librte_member.so.23(EXPERIMENTAL)(64bit) librte_mempool.so.23()(64bit) librte_mempool.so.23(DPDK_23)(64bit) librte_mempool.so.23(EXPERIMENTAL)(64bit) librte_mempool.so.23(INTERNAL)(64bit) librte_mempool_ring.so.23()(64bit) librte_mempool_ring.so.23(DPDK_23)(64bit) librte_meter.so.23()(64bit) librte_meter.so.23(DPDK_23)(64bit) librte_metrics.so.23()(64bit) librte_metrics.so.23(DPDK_23)(64bit) librte_metrics.so.23(EXPERIMENTAL)(64bit) librte_net.so.23()(64bit) librte_net.so.23(DPDK_23)(64bit) librte_net_i40e.so.23()(64bit) librte_net_i40e.so.23(DPDK_23)(64bit) librte_net_i40e.so.23(EXPERIMENTAL)(64bit) librte_node.so.23()(64bit) librte_node.so.23(EXPERIMENTAL)(64bit) librte_pcapng.so.23()(64bit) librte_pcapng.so.23(EXPERIMENTAL)(64bit) librte_pci.so.23()(64bit) librte_pci.so.23(DPDK_23)(64bit) librte_pdump.so.23()(64bit) librte_pdump.so.23(DPDK_23)(64bit) librte_pdump.so.23(EXPERIMENTAL)(64bit) librte_pipeline.so.23()(64bit) librte_pipeline.so.23(DPDK_23)(64bit) librte_pipeline.so.23(EXPERIMENTAL)(64bit) librte_port.so.23()(64bit) librte_port.so.23(DPDK_23)(64bit) librte_port.so.23(EXPERIMENTAL)(64bit) librte_power.so.23()(64bit) librte_power.so.23(DPDK_23)(64bit) librte_power.so.23(EXPERIMENTAL)(64bit) librte_rawdev.so.23()(64bit) librte_rawdev.so.23(DPDK_23)(64bit) librte_rcu.so.23()(64bit) librte_rcu.so.23(DPDK_23)(64bit) librte_rcu.so.23(EXPERIMENTAL)(64bit) librte_regexdev.so.23()(64bit) librte_regexdev.so.23(EXPERIMENTAL)(64bit) librte_regexdev.so.23(INTERNAL)(64bit) librte_reorder.so.23()(64bit) librte_reorder.so.23(DPDK_23)(64bit) librte_reorder.so.23(EXPERIMENTAL)(64bit) librte_rib.so.23()(64bit) librte_rib.so.23(DPDK_23)(64bit) librte_ring.so.23()(64bit) librte_ring.so.23(DPDK_23)(64bit) librte_sched.so.23()(64bit) librte_sched.so.23(DPDK_23)(64bit) librte_sched.so.23(EXPERIMENTAL)(64bit) librte_security.so.23()(64bit) librte_security.so.23(DPDK_23)(64bit) librte_security.so.23(EXPERIMENTAL)(64bit) librte_security.so.23(INTERNAL)(64bit) librte_stack.so.23()(64bit) librte_stack.so.23(DPDK_23)(64bit) librte_table.so.23()(64bit) librte_table.so.23(DPDK_23)(64bit) librte_table.so.23(EXPERIMENTAL)(64bit) librte_telemetry.so.23()(64bit) librte_telemetry.so.23(DPDK_23)(64bit) librte_telemetry.so.23(INTERNAL)(64bit) librte_timer.so.23()(64bit) librte_timer.so.23(DPDK_23)(64bit) librte_timer.so.23(EXPERIMENTAL)(64bit) librte_vhost.so.23()(64bit) librte_vhost.so.23(DPDK_23)(64bit) librte_vhost.so.23(EXPERIMENTAL)(64bit) librte_vhost.so.23(INTERNAL)(64bit) 00:51:53.444 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:51:53.444 Requires: ld-linux-x86-64.so.2()(64bit) ld-linux-x86-64.so.2(GLIBC_2.3)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.10)(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.16)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.22)(64bit) libc.so.6(GLIBC_2.25)(64bit) libc.so.6(GLIBC_2.27)(64bit) libc.so.6(GLIBC_2.28)(64bit) libc.so.6(GLIBC_2.3)(64bit) 
libc.so.6(GLIBC_2.3.2)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.32)(64bit) libc.so.6(GLIBC_2.33)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.6)(64bit) libc.so.6(GLIBC_2.7)(64bit) libc.so.6(GLIBC_2.8)(64bit) libc.so.6(GLIBC_2.9)(64bit) libelf.so.1()(64bit) libelf.so.1(ELFUTILS_1.0)(64bit) libm.so.6()(64bit) libm.so.6(GLIBC_2.2.5)(64bit) libm.so.6(GLIBC_2.29)(64bit) libnuma.so.1()(64bit) libnuma.so.1(libnuma_1.1)(64bit) libnuma.so.1(libnuma_1.2)(64bit) librte_acl.so.23()(64bit) librte_acl.so.23(DPDK_23)(64bit) librte_bbdev.so.23()(64bit) librte_bitratestats.so.23()(64bit) librte_bpf.so.23()(64bit) librte_bpf.so.23(DPDK_23)(64bit) librte_bus_pci.so.23()(64bit) librte_bus_pci.so.23(DPDK_23)(64bit) librte_bus_pci.so.23(INTERNAL)(64bit) librte_bus_vdev.so.23()(64bit) librte_cfgfile.so.23()(64bit) librte_cmdline.so.23()(64bit) librte_compressdev.so.23()(64bit) librte_cryptodev.so.23()(64bit) librte_cryptodev.so.23(DPDK_23)(64bit) librte_cryptodev.so.23(EXPERIMENTAL)(64bit) librte_cryptodev.so.23(INTERNAL)(64bit) librte_distributor.so.23()(64bit) librte_dmadev.so.23()(64bit) librte_dmadev.so.23(EXPERIMENTAL)(64bit) librte_dmadev.so.23(INTERNAL)(64bit) librte_eal.so.23()(64bit) librte_eal.so.23(DPDK_23)(64bit) librte_eal.so.23(EXPERIMENTAL)(64bit) librte_eal.so.23(INTERNAL)(64bit) librte_efd.so.23()(64bit) librte_ethdev.so.23()(64bit) librte_ethdev.so.23(DPDK_23)(64bit) librte_ethdev.so.23(EXPERIMENTAL)(64bit) librte_ethdev.so.23(INTERNAL)(64bit) librte_eventdev.so.23()(64bit) librte_eventdev.so.23(DPDK_23)(64bit) librte_fib.so.23()(64bit) librte_gpudev.so.23()(64bit) librte_graph.so.23()(64bit) librte_graph.so.23(EXPERIMENTAL)(64bit) librte_gro.so.23()(64bit) librte_gso.so.23()(64bit) librte_hash.so.23()(64bit) librte_hash.so.23(DPDK_23)(64bit) librte_ip_frag.so.23()(64bit) librte_ip_frag.so.23(DPDK_23)(64bit) librte_ipsec.so.23()(64bit) librte_jobstats.so.23()(64bit) librte_kvargs.so.23()(64bit) librte_kvargs.so.23(DPDK_23)(64bit) librte_latencystats.so.23()(64bit) librte_lpm.so.23()(64bit) librte_lpm.so.23(DPDK_23)(64bit) librte_mbuf.so.23()(64bit) librte_mbuf.so.23(DPDK_23)(64bit) librte_member.so.23()(64bit) librte_mempool.so.23()(64bit) librte_mempool.so.23(DPDK_23)(64bit) librte_mempool_ring.so.23()(64bit) librte_meter.so.23()(64bit) librte_meter.so.23(DPDK_23)(64bit) librte_metrics.so.23()(64bit) librte_metrics.so.23(DPDK_23)(64bit) librte_net.so.23()(64bit) librte_net.so.23(DPDK_23)(64bit) librte_net_i40e.so.23()(64bit) librte_node.so.23()(64bit) librte_pcapng.so.23()(64bit) librte_pcapng.so.23(EXPERIMENTAL)(64bit) librte_pci.so.23()(64bit) librte_pci.so.23(DPDK_23)(64bit) librte_pdump.so.23()(64bit) librte_pipeline.so.23()(64bit) librte_port.so.23()(64bit) librte_port.so.23(EXPERIMENTAL)(64bit) librte_power.so.23()(64bit) librte_rawdev.so.23()(64bit) librte_rcu.so.23()(64bit) librte_rcu.so.23(DPDK_23)(64bit) librte_rcu.so.23(EXPERIMENTAL)(64bit) librte_regexdev.so.23()(64bit) librte_reorder.so.23()(64bit) librte_rib.so.23()(64bit) librte_rib.so.23(DPDK_23)(64bit) librte_ring.so.23()(64bit) librte_ring.so.23(DPDK_23)(64bit) librte_sched.so.23()(64bit) librte_sched.so.23(DPDK_23)(64bit) librte_security.so.23()(64bit) librte_security.so.23(EXPERIMENTAL)(64bit) librte_stack.so.23()(64bit) librte_table.so.23()(64bit) librte_table.so.23(EXPERIMENTAL)(64bit) librte_telemetry.so.23()(64bit) librte_telemetry.so.23(DPDK_23)(64bit) librte_telemetry.so.23(INTERNAL)(64bit) librte_timer.so.23()(64bit) librte_timer.so.23(DPDK_23)(64bit) 
librte_vhost.so.23()(64bit) rtld(GNU_HASH) 00:51:53.444 Checking for unpackaged file(s): /usr/lib/rpm/check-files /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/dpdk 00:51:53.444 Wrote: /home/vagrant/rpmbuild/SRPMS/dpdk-devel-22.11-1.src.rpm 00:52:25.517 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpms/x86_64/dpdk-devel-22.11-1.x86_64.rpm 00:52:25.517 12:54:45 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@89 -- $ sudo rpm -i /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpms/x86_64/dpdk-devel-22.11-1.x86_64.rpm 00:52:25.517 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:52:25.517 12:54:46 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@92 -- $ MV_RUNPATH=/home/vagrant/spdk_repo/dpdk/build 00:52:25.517 12:54:46 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@92 -- $ build_rpm --with-shared --with-dpdk=/home/vagrant/spdk_repo/dpdk/build 00:52:25.517 12:54:46 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@116 -- $ GEN_SPEC=yes 00:52:25.517 12:54:46 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@116 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm.sh --with-shared --with-dpdk=/home/vagrant/spdk_repo/dpdk/build 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 Name: spdk 00:52:25.517 Version: v24.09 00:52:25.517 Release: 1 00:52:25.517 Summary: Storage Performance Development Kit 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 00:52:25.517 Requires: glibc 00:52:25.517 Requires: libaio 00:52:25.517 Requires: libgcc 00:52:25.517 Requires: libstdc++ 00:52:25.517 Requires: libuuid 00:52:25.517 Requires: ncurses-libs 00:52:25.517 Requires: numactl-libs 00:52:25.517 Requires: openssl-libs 00:52:25.517 Requires: zlib 00:52:25.517 00:52:25.517 00:52:25.517 Requires: dpdk-devel >= 22.11 00:52:25.517 00:52:25.517 00:52:25.517 BuildRequires: python3-devel 00:52:25.517 00:52:25.517 00:52:25.517 BuildRequires: dpdk-devel >= 22.11 00:52:25.517 00:52:25.517 00:52:25.517 License: BSD 00:52:25.517 URL: https://spdk.io 00:52:25.517 Source: spdk-v24.09.tar.gz 00:52:25.517 00:52:25.517 %description 00:52:25.517 00:52:25.517 The Storage Performance Development Kit (SPDK) provides a set of tools and libraries for 00:52:25.517 writing high performance, scalable, user-mode storage applications. It achieves high 00:52:25.517 performance by moving all of the necessary drivers into userspace and operating in a 00:52:25.517 polled mode instead of relying on interrupts, which avoids kernel context switches and 00:52:25.517 eliminates interrupt handling overhead. 
00:52:25.517 00:52:25.517 %prep 00:52:25.517 make clean -j10 &>/dev/null || : 00:52:25.517 %setup 00:52:25.517 00:52:25.517 %build 00:52:25.517 set +x 00:52:25.517 00:52:25.517 cfs() { 00:52:25.517 (($# > 1)) || return 0 00:52:25.517 00:52:25.517 local dst=$1 f 00:52:25.517 00:52:25.517 mkdir -p "$dst" 00:52:25.517 shift; for f; do [[ -e $f ]] && cp -a "$f" "$dst"; done 00:52:25.517 } 00:52:25.517 00:52:25.517 cl() { 00:52:25.517 [[ -e $2 ]] || return 0 00:52:25.517 00:52:25.517 cfs "$1" $(find "$2" -name '*.so*' -type f -o -type l | grep -v .symbols) 00:52:25.517 } 00:52:25.517 00:52:25.517 00:52:25.517 # Rely mainly on CONFIG 00:52:25.517 git submodule update --init 00:52:25.517 ./configure --disable-unit-tests --disable-tests --with-shared --with-dpdk=/home/vagrant/spdk_repo/dpdk/build 00:52:25.517 make -j10 00:52:25.517 make DESTDIR=/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 install -j10 00:52:25.517 # DPDK always builds both static and shared, so we need to remove one or the other 00:52:25.517 # SPDK always builds static, so remove it if we want shared. 00:52:25.517 rm -f /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/lib*.a 00:52:25.517 00:52:25.517 # The ISA-L install may have installed some binaries that we do not want to package 00:52:25.517 rm -f /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/igzip 00:52:25.517 rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/share/man 00:52:25.517 00:52:25.517 # Include libvfio-user libs in case --with-vfio-user is in use together with --with-shared 00:52:25.517 00:52:25.517 # And some useful setup scripts SPDK uses 00:52:25.517 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk 00:52:25.517 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/bash_completion.d 00:52:25.517 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/profile.d 00:52:25.517 mkdir -p /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/ld.so.conf.d 00:52:25.517 00:52:25.517 cat <<-EOF > /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/ld.so.conf.d/spdk.conf 00:52:25.517 /usr/local/lib 00:52:25.517 /usr/local/lib/dpdk 00:52:25.517 /usr/local/lib/libvfio-user 00:52:25.517 EOF 00:52:25.517 00:52:25.517 cat <<-'EOF' > /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/profile.d/spdk_path.sh 00:52:25.517 PATH=$PATH:/usr/libexec/spdk/scripts 00:52:25.517 PATH=$PATH:/usr/libexec/spdk/scripts/vagrant 00:52:25.517 PATH=$PATH:/usr/libexec/spdk/test/common/config 00:52:25.517 export PATH 00:52:25.517 EOF 00:52:25.517 00:52:25.518 cfs /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk scripts 00:52:25.518 ln -s /usr/libexec/spdk/scripts/bash-completion/spdk /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/etc/bash_completion.d/ 00:52:25.518 00:52:25.518 # We need to take into the account the fact that most of the scripts depend on being 00:52:25.518 # run directly from the repo. 
To workaround it, create common root space under dir 00:52:25.518 # like /usr/libexec/spdk and link all potential relative paths the script may try 00:52:25.518 # to reference. 00:52:25.518 00:52:25.518 # setup.sh uses pci_ids.h 00:52:25.518 ln -s /usr/local/include /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/libexec/spdk 00:52:25.518 00:52:25.518 %files 00:52:25.518 /usr/local/bin/* 00:52:25.518 /usr/local/lib/python3.9/site-packages/spdk*/* 00:52:25.518 00:52:25.518 %package devel 00:52:25.518 00:52:25.518 Summary: SPDK development libraries and headers 00:52:25.518 00:52:25.518 00:52:25.518 00:52:25.518 00:52:25.518 00:52:25.518 %description devel 00:52:25.518 00:52:25.518 SPDK development libraries and header 00:52:25.518 00:52:25.518 00:52:25.518 00:52:25.518 00:52:25.518 00:52:25.518 %files devel 00:52:25.518 /usr/local/include/* 00:52:25.518 /usr/local/lib/pkgconfig/*.pc 00:52:25.518 /usr/local/lib/*.la 00:52:25.518 /usr/local/lib/*.so* 00:52:25.518 /etc/ld.so.conf.d/spdk.conf 00:52:25.518 00:52:25.518 %post devel 00:52:25.518 ldconfig 00:52:25.518 00:52:25.518 %package scripts 00:52:25.518 Summary: SPDK scripts and utilities 00:52:25.518 00:52:25.518 %description scripts 00:52:25.518 SPDK scripts and utilities 00:52:25.518 00:52:25.518 %files scripts 00:52:25.518 /usr/libexec/spdk/* 00:52:25.518 /etc/profile.d/* 00:52:25.518 /etc/bash_completion.d/* 00:52:25.518 00:52:25.518 %post scripts 00:52:25.518 ldconfig 00:52:25.518 00:52:25.518 %changelog 00:52:25.518 * Tue Feb 16 2021 Michal Berger 00:52:25.518 - Initial RPM .spec for the SPDK 00:52:25.518 12:54:46 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@118 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm.sh --with-shared --with-dpdk=/home/vagrant/spdk_repo/dpdk/build 00:52:25.518 * Starting rpmbuild... 00:52:25.518 setting SOURCE_DATE_EPOCH=1613433600 00:52:25.518 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.3LvhOz 00:52:25.518 + umask 022 00:52:25.518 + cd /home/vagrant/spdk_repo/spdk 00:52:25.518 + make clean -j10 00:52:32.071 + RPM_EC=0 00:52:32.071 ++ jobs -p 00:52:32.071 + exit 0 00:52:32.071 Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.UmenEY 00:52:32.071 + umask 022 00:52:32.071 + cd /home/vagrant/spdk_repo/spdk 00:52:32.071 + set +x 00:52:32.071 Using /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig for additional libs... 00:52:32.071 DPDK libraries: /home/vagrant/spdk_repo/dpdk/build/lib 00:52:32.071 DPDK includes: //home/vagrant/spdk_repo/dpdk/build/include 00:52:32.071 Using default SPDK env in /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:52:47.509 Configuring ISA-L (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal.log)...done. 00:53:02.377 Configuring ISA-L-crypto (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal-crypto.log)...done. 00:53:02.378 Creating mk/config.mk...done. 00:53:02.378 Creating mk/cc.flags.mk...done. 00:53:02.378 Type 'make' to build. 00:53:02.378 make[1]: Nothing to be done for 'all'. 
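The spec printed above is emitted by rpm.sh in GEN_SPEC=yes mode (rpm/rpm.sh@116) and then a second rpm.sh invocation (rpm/rpm.sh@118) drives rpmbuild against it. A condensed sketch of that two-step flow, not the exact harness logic: the redirect to a file is an assumption (the harness captures stdout internally), the paths are illustrative stand-ins for the CI workspace, and the rpmbuild flags mirror the ones used for dpdk.spec earlier in this log (rpm/rpm.sh@81):

    # Sketch only: emit the generated spec, then build it against a
    # private buildroot with relocated rpmdir/sourcedir macros.
    GEN_SPEC=yes ./rpmbuild/rpm.sh --with-shared --with-dpdk="$DPDK_BUILD" > /tmp/spdk.spec
    rpmbuild --buildroot=/tmp/test-rpm/buildroot \
        -D "_rpmdir /tmp/test-rpm/rpms" \
        -D "_sourcedir /tmp/test-rpm/source" \
        --noclean --nodebuginfo -ba /tmp/spdk.spec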
00:53:41.085 CC lib/ut_mock/mock.o 00:53:41.085 CC lib/log/log.o 00:53:41.085 CC lib/log/log_flags.o 00:53:41.085 CC lib/log/log_deprecated.o 00:53:41.085 LIB libspdk_log.a 00:53:41.085 LIB libspdk_ut_mock.a 00:53:41.085 SO libspdk_log.so.7.0 00:53:41.085 SO libspdk_ut_mock.so.6.0 00:53:41.085 SYMLINK libspdk_log.so 00:53:41.085 SYMLINK libspdk_ut_mock.so 00:53:41.085 CC lib/ioat/ioat.o 00:53:41.085 CXX lib/trace_parser/trace.o 00:53:41.085 CC lib/util/base64.o 00:53:41.085 CC lib/dma/dma.o 00:53:41.085 CC lib/util/bit_array.o 00:53:41.085 CC lib/util/cpuset.o 00:53:41.085 CC lib/util/crc16.o 00:53:41.085 CC lib/util/crc32.o 00:53:41.085 CC lib/util/crc32c.o 00:53:41.085 CC lib/util/crc32_ieee.o 00:53:41.085 CC lib/vfio_user/host/vfio_user_pci.o 00:53:41.085 CC lib/vfio_user/host/vfio_user.o 00:53:41.085 LIB libspdk_dma.a 00:53:41.085 CC lib/util/crc64.o 00:53:41.085 SO libspdk_dma.so.4.0 00:53:41.085 LIB libspdk_ioat.a 00:53:41.085 CC lib/util/dif.o 00:53:41.085 CC lib/util/fd.o 00:53:41.085 SO libspdk_ioat.so.7.0 00:53:41.085 SYMLINK libspdk_dma.so 00:53:41.085 CC lib/util/file.o 00:53:41.085 CC lib/util/hexlify.o 00:53:41.085 CC lib/util/iov.o 00:53:41.085 SYMLINK libspdk_ioat.so 00:53:41.085 CC lib/util/math.o 00:53:41.085 CC lib/util/pipe.o 00:53:41.085 CC lib/util/strerror_tls.o 00:53:41.085 CC lib/util/string.o 00:53:41.085 CC lib/util/uuid.o 00:53:41.085 CC lib/util/fd_group.o 00:53:41.085 LIB libspdk_vfio_user.a 00:53:41.085 SO libspdk_vfio_user.so.5.0 00:53:41.085 CC lib/util/xor.o 00:53:41.085 CC lib/util/zipf.o 00:53:41.085 SYMLINK libspdk_vfio_user.so 00:53:41.085 LIB libspdk_trace_parser.a 00:53:41.085 SO libspdk_trace_parser.so.5.0 00:53:41.085 SYMLINK libspdk_trace_parser.so 00:53:41.085 LIB libspdk_util.a 00:53:41.085 SO libspdk_util.so.9.0 00:53:41.085 SYMLINK libspdk_util.so 00:53:41.085 CC lib/conf/conf.o 00:53:41.085 CC lib/vmd/vmd.o 00:53:41.085 CC lib/vmd/led.o 00:53:41.085 CC lib/json/json_parse.o 00:53:41.085 CC lib/json/json_util.o 00:53:41.085 CC lib/json/json_write.o 00:53:41.085 CC lib/env_dpdk/env.o 00:53:41.085 CC lib/env_dpdk/pci.o 00:53:41.085 CC lib/env_dpdk/memory.o 00:53:41.085 CC lib/env_dpdk/init.o 00:53:41.085 CC lib/env_dpdk/threads.o 00:53:41.085 LIB libspdk_conf.a 00:53:41.085 SO libspdk_conf.so.6.0 00:53:41.085 CC lib/env_dpdk/pci_ioat.o 00:53:41.085 SYMLINK libspdk_conf.so 00:53:41.085 CC lib/env_dpdk/pci_virtio.o 00:53:41.085 CC lib/env_dpdk/pci_vmd.o 00:53:41.085 CC lib/env_dpdk/pci_idxd.o 00:53:41.085 CC lib/env_dpdk/pci_event.o 00:53:41.085 CC lib/env_dpdk/sigbus_handler.o 00:53:41.085 CC lib/env_dpdk/pci_dpdk.o 00:53:41.085 CC lib/env_dpdk/pci_dpdk_2207.o 00:53:41.085 CC lib/env_dpdk/pci_dpdk_2211.o 00:53:41.085 LIB libspdk_vmd.a 00:53:41.085 LIB libspdk_json.a 00:53:41.085 SO libspdk_vmd.so.6.0 00:53:41.085 SO libspdk_json.so.6.0 00:53:41.085 SYMLINK libspdk_vmd.so 00:53:41.085 SYMLINK libspdk_json.so 00:53:41.085 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:53:41.085 CC lib/jsonrpc/jsonrpc_server.o 00:53:41.085 CC lib/jsonrpc/jsonrpc_client.o 00:53:41.085 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:53:41.085 LIB libspdk_jsonrpc.a 00:53:41.085 SO libspdk_jsonrpc.so.6.0 00:53:41.085 LIB libspdk_env_dpdk.a 00:53:41.085 SYMLINK libspdk_jsonrpc.so 00:53:41.085 SO libspdk_env_dpdk.so.14.1 00:53:41.085 SYMLINK libspdk_env_dpdk.so 00:53:41.345 CC lib/rpc/rpc.o 00:53:41.603 LIB libspdk_rpc.a 00:53:41.603 SO libspdk_rpc.so.6.0 00:53:41.862 SYMLINK libspdk_rpc.so 00:53:42.120 CC lib/keyring/keyring.o 00:53:42.120 CC lib/keyring/keyring_rpc.o 00:53:42.120 CC 
lib/notify/notify.o 00:53:42.120 CC lib/notify/notify_rpc.o 00:53:42.120 CC lib/trace/trace_flags.o 00:53:42.120 CC lib/trace/trace_rpc.o 00:53:42.121 CC lib/trace/trace.o 00:53:42.121 LIB libspdk_notify.a 00:53:42.380 SO libspdk_notify.so.6.0 00:53:42.380 LIB libspdk_keyring.a 00:53:42.380 SYMLINK libspdk_notify.so 00:53:42.380 SO libspdk_keyring.so.1.0 00:53:42.380 LIB libspdk_trace.a 00:53:42.380 SYMLINK libspdk_keyring.so 00:53:42.380 SO libspdk_trace.so.10.0 00:53:42.638 SYMLINK libspdk_trace.so 00:53:42.897 CC lib/thread/thread.o 00:53:42.897 CC lib/thread/iobuf.o 00:53:42.897 CC lib/sock/sock.o 00:53:42.897 CC lib/sock/sock_rpc.o 00:53:43.463 LIB libspdk_sock.a 00:53:43.463 SO libspdk_sock.so.9.0 00:53:43.463 SYMLINK libspdk_sock.so 00:53:43.722 CC lib/nvme/nvme_ctrlr_cmd.o 00:53:43.722 CC lib/nvme/nvme_fabric.o 00:53:43.722 CC lib/nvme/nvme_ns_cmd.o 00:53:43.722 CC lib/nvme/nvme_ctrlr.o 00:53:43.722 CC lib/nvme/nvme_pcie.o 00:53:43.722 CC lib/nvme/nvme_qpair.o 00:53:43.722 CC lib/nvme/nvme_ns.o 00:53:43.722 CC lib/nvme/nvme.o 00:53:43.722 CC lib/nvme/nvme_pcie_common.o 00:53:43.981 LIB libspdk_thread.a 00:53:43.981 SO libspdk_thread.so.10.0 00:53:43.981 SYMLINK libspdk_thread.so 00:53:43.981 CC lib/nvme/nvme_quirks.o 00:53:44.572 CC lib/nvme/nvme_transport.o 00:53:44.572 CC lib/nvme/nvme_discovery.o 00:53:44.572 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:53:44.572 CC lib/accel/accel.o 00:53:44.572 CC lib/blob/blobstore.o 00:53:44.572 CC lib/virtio/virtio.o 00:53:44.572 CC lib/init/json_config.o 00:53:44.858 CC lib/virtio/virtio_vhost_user.o 00:53:44.858 CC lib/virtio/virtio_vfio_user.o 00:53:44.858 CC lib/init/subsystem.o 00:53:44.858 CC lib/init/subsystem_rpc.o 00:53:44.858 CC lib/init/rpc.o 00:53:44.858 CC lib/virtio/virtio_pci.o 00:53:44.858 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:53:45.116 CC lib/nvme/nvme_tcp.o 00:53:45.116 CC lib/blob/request.o 00:53:45.116 CC lib/blob/zeroes.o 00:53:45.116 CC lib/nvme/nvme_opal.o 00:53:45.116 LIB libspdk_init.a 00:53:45.116 SO libspdk_init.so.5.0 00:53:45.116 CC lib/nvme/nvme_io_msg.o 00:53:45.116 SYMLINK libspdk_init.so 00:53:45.116 CC lib/nvme/nvme_poll_group.o 00:53:45.116 LIB libspdk_virtio.a 00:53:45.375 SO libspdk_virtio.so.7.0 00:53:45.375 SYMLINK libspdk_virtio.so 00:53:45.375 CC lib/nvme/nvme_zns.o 00:53:45.375 CC lib/accel/accel_rpc.o 00:53:45.375 CC lib/accel/accel_sw.o 00:53:45.375 CC lib/event/app.o 00:53:45.375 CC lib/blob/blob_bs_dev.o 00:53:45.634 CC lib/nvme/nvme_stubs.o 00:53:45.634 CC lib/nvme/nvme_auth.o 00:53:45.634 CC lib/nvme/nvme_cuse.o 00:53:45.634 LIB libspdk_accel.a 00:53:45.634 CC lib/event/reactor.o 00:53:45.634 SO libspdk_accel.so.15.0 00:53:45.634 CC lib/event/log_rpc.o 00:53:45.634 SYMLINK libspdk_accel.so 00:53:45.634 CC lib/event/app_rpc.o 00:53:45.893 CC lib/event/scheduler_static.o 00:53:46.152 LIB libspdk_event.a 00:53:46.152 SO libspdk_event.so.13.1 00:53:46.152 CC lib/bdev/bdev.o 00:53:46.152 CC lib/bdev/bdev_rpc.o 00:53:46.152 CC lib/bdev/bdev_zone.o 00:53:46.152 CC lib/bdev/part.o 00:53:46.152 CC lib/bdev/scsi_nvme.o 00:53:46.152 SYMLINK libspdk_event.so 00:53:46.411 LIB libspdk_nvme.a 00:53:46.411 SO libspdk_nvme.so.13.0 00:53:46.411 SYMLINK libspdk_nvme.so 00:53:46.978 LIB libspdk_blob.a 00:53:46.978 SO libspdk_blob.so.11.0 00:53:46.979 SYMLINK libspdk_blob.so 00:53:47.238 CC lib/lvol/lvol.o 00:53:47.238 CC lib/blobfs/blobfs.o 00:53:47.238 CC lib/blobfs/tree.o 00:53:48.175 LIB libspdk_blobfs.a 00:53:48.175 SO libspdk_blobfs.so.10.0 00:53:48.175 SYMLINK libspdk_blobfs.so 00:53:48.175 LIB libspdk_lvol.a 
00:53:48.175 SO libspdk_lvol.so.10.0 00:53:48.175 SYMLINK libspdk_lvol.so 00:53:48.434 LIB libspdk_bdev.a 00:53:48.434 SO libspdk_bdev.so.15.0 00:53:48.434 SYMLINK libspdk_bdev.so 00:53:48.692 CC lib/scsi/dev.o 00:53:48.692 CC lib/scsi/port.o 00:53:48.692 CC lib/scsi/lun.o 00:53:48.692 CC lib/scsi/scsi_bdev.o 00:53:48.692 CC lib/scsi/scsi.o 00:53:48.692 CC lib/scsi/scsi_pr.o 00:53:48.692 CC lib/scsi/scsi_rpc.o 00:53:48.692 CC lib/ftl/ftl_core.o 00:53:48.692 CC lib/nvmf/ctrlr.o 00:53:48.692 CC lib/nbd/nbd.o 00:53:48.950 CC lib/scsi/task.o 00:53:48.950 CC lib/nbd/nbd_rpc.o 00:53:48.950 CC lib/nvmf/ctrlr_discovery.o 00:53:48.950 CC lib/nvmf/ctrlr_bdev.o 00:53:48.950 CC lib/ftl/ftl_init.o 00:53:48.950 CC lib/ftl/ftl_layout.o 00:53:49.209 CC lib/ftl/ftl_debug.o 00:53:49.209 CC lib/nvmf/subsystem.o 00:53:49.209 CC lib/nvmf/nvmf.o 00:53:49.209 CC lib/nvmf/nvmf_rpc.o 00:53:49.209 LIB libspdk_nbd.a 00:53:49.209 SO libspdk_nbd.so.7.0 00:53:49.209 CC lib/nvmf/transport.o 00:53:49.209 LIB libspdk_scsi.a 00:53:49.209 CC lib/nvmf/tcp.o 00:53:49.209 SYMLINK libspdk_nbd.so 00:53:49.209 CC lib/nvmf/stubs.o 00:53:49.209 SO libspdk_scsi.so.9.0 00:53:49.209 CC lib/ftl/ftl_io.o 00:53:49.468 CC lib/ftl/ftl_sb.o 00:53:49.468 SYMLINK libspdk_scsi.so 00:53:49.468 CC lib/ftl/ftl_l2p.o 00:53:49.468 CC lib/nvmf/mdns_server.o 00:53:49.468 CC lib/nvmf/auth.o 00:53:49.468 CC lib/ftl/ftl_l2p_flat.o 00:53:49.468 CC lib/ftl/ftl_nv_cache.o 00:53:49.727 CC lib/ftl/ftl_band.o 00:53:49.727 CC lib/ftl/ftl_band_ops.o 00:53:49.727 CC lib/ftl/ftl_writer.o 00:53:49.727 CC lib/iscsi/conn.o 00:53:49.727 CC lib/iscsi/init_grp.o 00:53:49.985 CC lib/iscsi/iscsi.o 00:53:49.985 CC lib/iscsi/md5.o 00:53:49.985 CC lib/vhost/vhost.o 00:53:49.985 CC lib/vhost/vhost_rpc.o 00:53:49.985 CC lib/vhost/vhost_scsi.o 00:53:49.985 CC lib/vhost/vhost_blk.o 00:53:49.985 CC lib/vhost/rte_vhost_user.o 00:53:50.243 CC lib/ftl/ftl_rq.o 00:53:50.243 CC lib/ftl/ftl_reloc.o 00:53:50.243 CC lib/iscsi/param.o 00:53:50.243 LIB libspdk_nvmf.a 00:53:50.243 SO libspdk_nvmf.so.18.1 00:53:50.243 CC lib/ftl/ftl_l2p_cache.o 00:53:50.506 SYMLINK libspdk_nvmf.so 00:53:50.506 CC lib/iscsi/portal_grp.o 00:53:50.506 CC lib/iscsi/tgt_node.o 00:53:50.506 CC lib/iscsi/iscsi_subsystem.o 00:53:50.506 CC lib/iscsi/iscsi_rpc.o 00:53:50.506 CC lib/ftl/ftl_p2l.o 00:53:50.804 CC lib/iscsi/task.o 00:53:50.804 CC lib/ftl/mngt/ftl_mngt.o 00:53:50.804 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:53:50.804 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:53:50.804 CC lib/ftl/mngt/ftl_mngt_startup.o 00:53:50.804 LIB libspdk_vhost.a 00:53:50.804 CC lib/ftl/mngt/ftl_mngt_md.o 00:53:50.804 CC lib/ftl/mngt/ftl_mngt_misc.o 00:53:50.804 SO libspdk_vhost.so.8.0 00:53:50.804 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:53:51.062 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:53:51.062 SYMLINK libspdk_vhost.so 00:53:51.062 CC lib/ftl/mngt/ftl_mngt_band.o 00:53:51.062 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:53:51.062 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:53:51.062 LIB libspdk_iscsi.a 00:53:51.062 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:53:51.062 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:53:51.062 SO libspdk_iscsi.so.8.0 00:53:51.062 CC lib/ftl/utils/ftl_conf.o 00:53:51.062 CC lib/ftl/utils/ftl_md.o 00:53:51.062 SYMLINK libspdk_iscsi.so 00:53:51.062 CC lib/ftl/utils/ftl_mempool.o 00:53:51.062 CC lib/ftl/utils/ftl_bitmap.o 00:53:51.062 CC lib/ftl/utils/ftl_property.o 00:53:51.062 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:53:51.062 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:53:51.062 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:53:51.321 CC 
lib/ftl/upgrade/ftl_p2l_upgrade.o 00:53:51.321 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:53:51.321 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:53:51.321 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:53:51.321 CC lib/ftl/upgrade/ftl_sb_v3.o 00:53:51.321 CC lib/ftl/upgrade/ftl_sb_v5.o 00:53:51.321 CC lib/ftl/nvc/ftl_nvc_dev.o 00:53:51.321 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:53:51.321 CC lib/ftl/base/ftl_base_dev.o 00:53:51.579 CC lib/ftl/base/ftl_base_bdev.o 00:53:51.579 LIB libspdk_ftl.a 00:53:51.838 SO libspdk_ftl.so.9.0 00:53:51.838 SYMLINK libspdk_ftl.so 00:53:52.096 CC module/env_dpdk/env_dpdk_rpc.o 00:53:52.354 CC module/accel/error/accel_error.o 00:53:52.354 CC module/accel/ioat/accel_ioat.o 00:53:52.354 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:53:52.354 CC module/blob/bdev/blob_bdev.o 00:53:52.354 CC module/keyring/file/keyring.o 00:53:52.354 CC module/scheduler/gscheduler/gscheduler.o 00:53:52.354 CC module/sock/posix/posix.o 00:53:52.354 CC module/keyring/linux/keyring.o 00:53:52.354 LIB libspdk_env_dpdk_rpc.a 00:53:52.354 CC module/scheduler/dynamic/scheduler_dynamic.o 00:53:52.354 SO libspdk_env_dpdk_rpc.so.6.0 00:53:52.354 LIB libspdk_scheduler_gscheduler.a 00:53:52.354 CC module/keyring/file/keyring_rpc.o 00:53:52.354 LIB libspdk_scheduler_dpdk_governor.a 00:53:52.354 CC module/accel/error/accel_error_rpc.o 00:53:52.613 SO libspdk_scheduler_dpdk_governor.so.4.0 00:53:52.613 SO libspdk_scheduler_gscheduler.so.4.0 00:53:52.613 CC module/keyring/linux/keyring_rpc.o 00:53:52.613 SYMLINK libspdk_env_dpdk_rpc.so 00:53:52.613 CC module/accel/ioat/accel_ioat_rpc.o 00:53:52.613 SYMLINK libspdk_scheduler_gscheduler.so 00:53:52.613 SYMLINK libspdk_scheduler_dpdk_governor.so 00:53:52.613 LIB libspdk_scheduler_dynamic.a 00:53:52.613 LIB libspdk_keyring_file.a 00:53:52.613 LIB libspdk_blob_bdev.a 00:53:52.613 SO libspdk_scheduler_dynamic.so.4.0 00:53:52.613 SO libspdk_keyring_file.so.1.0 00:53:52.613 LIB libspdk_accel_ioat.a 00:53:52.613 LIB libspdk_keyring_linux.a 00:53:52.613 LIB libspdk_accel_error.a 00:53:52.613 SO libspdk_blob_bdev.so.11.0 00:53:52.613 SYMLINK libspdk_scheduler_dynamic.so 00:53:52.613 SO libspdk_keyring_linux.so.1.0 00:53:52.613 SO libspdk_accel_ioat.so.6.0 00:53:52.613 SO libspdk_accel_error.so.2.0 00:53:52.613 SYMLINK libspdk_keyring_file.so 00:53:52.613 SYMLINK libspdk_blob_bdev.so 00:53:52.613 SYMLINK libspdk_keyring_linux.so 00:53:52.613 SYMLINK libspdk_accel_ioat.so 00:53:52.613 SYMLINK libspdk_accel_error.so 00:53:52.871 CC module/bdev/passthru/vbdev_passthru.o 00:53:52.871 CC module/bdev/error/vbdev_error.o 00:53:52.871 CC module/bdev/malloc/bdev_malloc.o 00:53:52.871 CC module/bdev/nvme/bdev_nvme.o 00:53:52.871 CC module/bdev/null/bdev_null.o 00:53:52.871 CC module/bdev/delay/vbdev_delay.o 00:53:52.871 CC module/blobfs/bdev/blobfs_bdev.o 00:53:52.871 CC module/bdev/gpt/gpt.o 00:53:52.871 CC module/bdev/lvol/vbdev_lvol.o 00:53:53.129 LIB libspdk_sock_posix.a 00:53:53.129 SO libspdk_sock_posix.so.6.0 00:53:53.129 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:53:53.129 SYMLINK libspdk_sock_posix.so 00:53:53.129 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:53:53.129 CC module/bdev/gpt/vbdev_gpt.o 00:53:53.129 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:53:53.129 CC module/bdev/error/vbdev_error_rpc.o 00:53:53.129 CC module/bdev/null/bdev_null_rpc.o 00:53:53.387 CC module/bdev/delay/vbdev_delay_rpc.o 00:53:53.387 CC module/bdev/malloc/bdev_malloc_rpc.o 00:53:53.387 LIB libspdk_blobfs_bdev.a 00:53:53.387 SO libspdk_blobfs_bdev.so.6.0 00:53:53.387 LIB 
libspdk_bdev_passthru.a 00:53:53.387 LIB libspdk_bdev_error.a 00:53:53.387 LIB libspdk_bdev_null.a 00:53:53.387 SO libspdk_bdev_error.so.6.0 00:53:53.387 SO libspdk_bdev_passthru.so.6.0 00:53:53.387 LIB libspdk_bdev_gpt.a 00:53:53.387 SO libspdk_bdev_null.so.6.0 00:53:53.387 SO libspdk_bdev_gpt.so.6.0 00:53:53.387 SYMLINK libspdk_blobfs_bdev.so 00:53:53.387 LIB libspdk_bdev_delay.a 00:53:53.387 CC module/bdev/nvme/bdev_nvme_rpc.o 00:53:53.387 LIB libspdk_bdev_malloc.a 00:53:53.387 SYMLINK libspdk_bdev_error.so 00:53:53.387 CC module/bdev/nvme/nvme_rpc.o 00:53:53.387 SYMLINK libspdk_bdev_passthru.so 00:53:53.387 SO libspdk_bdev_delay.so.6.0 00:53:53.387 SO libspdk_bdev_malloc.so.6.0 00:53:53.387 SYMLINK libspdk_bdev_gpt.so 00:53:53.387 SYMLINK libspdk_bdev_null.so 00:53:53.645 SYMLINK libspdk_bdev_delay.so 00:53:53.645 SYMLINK libspdk_bdev_malloc.so 00:53:53.645 CC module/bdev/nvme/bdev_mdns_client.o 00:53:53.645 LIB libspdk_bdev_lvol.a 00:53:53.645 SO libspdk_bdev_lvol.so.6.0 00:53:53.645 CC module/bdev/raid/bdev_raid.o 00:53:53.645 CC module/bdev/raid/bdev_raid_rpc.o 00:53:53.645 CC module/bdev/raid/bdev_raid_sb.o 00:53:53.645 SYMLINK libspdk_bdev_lvol.so 00:53:53.645 CC module/bdev/raid/raid0.o 00:53:53.645 CC module/bdev/zone_block/vbdev_zone_block.o 00:53:53.645 CC module/bdev/aio/bdev_aio.o 00:53:53.645 CC module/bdev/split/vbdev_split.o 00:53:53.645 CC module/bdev/ftl/bdev_ftl.o 00:53:53.903 CC module/bdev/split/vbdev_split_rpc.o 00:53:53.903 CC module/bdev/aio/bdev_aio_rpc.o 00:53:53.903 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:53:53.903 CC module/bdev/nvme/vbdev_opal.o 00:53:53.903 CC module/bdev/ftl/bdev_ftl_rpc.o 00:53:54.161 CC module/bdev/raid/raid1.o 00:53:54.161 LIB libspdk_bdev_aio.a 00:53:54.161 LIB libspdk_bdev_split.a 00:53:54.161 CC module/bdev/raid/concat.o 00:53:54.161 SO libspdk_bdev_split.so.6.0 00:53:54.161 SO libspdk_bdev_aio.so.6.0 00:53:54.161 LIB libspdk_bdev_zone_block.a 00:53:54.161 SYMLINK libspdk_bdev_split.so 00:53:54.161 SO libspdk_bdev_zone_block.so.6.0 00:53:54.161 CC module/bdev/nvme/vbdev_opal_rpc.o 00:53:54.161 SYMLINK libspdk_bdev_aio.so 00:53:54.161 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:53:54.161 SYMLINK libspdk_bdev_zone_block.so 00:53:54.161 LIB libspdk_bdev_ftl.a 00:53:54.161 CC module/bdev/virtio/bdev_virtio_blk.o 00:53:54.161 CC module/bdev/virtio/bdev_virtio_scsi.o 00:53:54.161 SO libspdk_bdev_ftl.so.6.0 00:53:54.161 CC module/bdev/virtio/bdev_virtio_rpc.o 00:53:54.419 SYMLINK libspdk_bdev_ftl.so 00:53:54.678 LIB libspdk_bdev_raid.a 00:53:54.678 SO libspdk_bdev_raid.so.6.0 00:53:54.678 SYMLINK libspdk_bdev_raid.so 00:53:54.678 LIB libspdk_bdev_virtio.a 00:53:54.678 SO libspdk_bdev_virtio.so.6.0 00:53:54.678 SYMLINK libspdk_bdev_virtio.so 00:53:54.937 LIB libspdk_bdev_nvme.a 00:53:54.937 SO libspdk_bdev_nvme.so.7.0 00:53:54.937 SYMLINK libspdk_bdev_nvme.so 00:53:55.503 CC module/event/subsystems/scheduler/scheduler.o 00:53:55.503 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:53:55.503 CC module/event/subsystems/iobuf/iobuf.o 00:53:55.503 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:53:55.503 CC module/event/subsystems/vmd/vmd.o 00:53:55.503 CC module/event/subsystems/sock/sock.o 00:53:55.503 CC module/event/subsystems/vmd/vmd_rpc.o 00:53:55.503 CC module/event/subsystems/keyring/keyring.o 00:53:55.503 LIB libspdk_event_vhost_blk.a 00:53:55.503 LIB libspdk_event_iobuf.a 00:53:55.503 SO libspdk_event_vhost_blk.so.3.0 00:53:55.503 LIB libspdk_event_vmd.a 00:53:55.761 SO libspdk_event_iobuf.so.3.0 00:53:55.761 LIB 
libspdk_event_keyring.a 00:53:55.761 SO libspdk_event_vmd.so.6.0 00:53:55.761 LIB libspdk_event_scheduler.a 00:53:55.761 LIB libspdk_event_sock.a 00:53:55.761 SYMLINK libspdk_event_vhost_blk.so 00:53:55.761 SO libspdk_event_scheduler.so.4.0 00:53:55.761 SYMLINK libspdk_event_iobuf.so 00:53:55.761 SO libspdk_event_keyring.so.1.0 00:53:55.761 SO libspdk_event_sock.so.5.0 00:53:55.761 SYMLINK libspdk_event_vmd.so 00:53:55.761 SYMLINK libspdk_event_sock.so 00:53:55.761 SYMLINK libspdk_event_scheduler.so 00:53:55.761 SYMLINK libspdk_event_keyring.so 00:53:56.020 CC module/event/subsystems/accel/accel.o 00:53:56.020 LIB libspdk_event_accel.a 00:53:56.278 SO libspdk_event_accel.so.6.0 00:53:56.278 SYMLINK libspdk_event_accel.so 00:53:56.537 CC module/event/subsystems/bdev/bdev.o 00:53:56.855 LIB libspdk_event_bdev.a 00:53:56.855 SO libspdk_event_bdev.so.6.0 00:53:56.855 SYMLINK libspdk_event_bdev.so 00:53:57.114 CC module/event/subsystems/scsi/scsi.o 00:53:57.114 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:53:57.114 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:53:57.114 CC module/event/subsystems/nbd/nbd.o 00:53:57.372 LIB libspdk_event_nbd.a 00:53:57.372 LIB libspdk_event_scsi.a 00:53:57.372 SO libspdk_event_scsi.so.6.0 00:53:57.372 SO libspdk_event_nbd.so.6.0 00:53:57.372 SYMLINK libspdk_event_scsi.so 00:53:57.372 SYMLINK libspdk_event_nbd.so 00:53:57.372 LIB libspdk_event_nvmf.a 00:53:57.372 SO libspdk_event_nvmf.so.6.0 00:53:57.632 SYMLINK libspdk_event_nvmf.so 00:53:57.632 CC module/event/subsystems/iscsi/iscsi.o 00:53:57.632 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:53:57.891 LIB libspdk_event_vhost_scsi.a 00:53:57.891 SO libspdk_event_vhost_scsi.so.3.0 00:53:57.891 LIB libspdk_event_iscsi.a 00:53:57.891 SO libspdk_event_iscsi.so.6.0 00:53:57.891 SYMLINK libspdk_event_vhost_scsi.so 00:53:57.891 SYMLINK libspdk_event_iscsi.so 00:53:58.149 SO libspdk.so.6.0 00:53:58.149 SYMLINK libspdk.so 00:53:58.149 make[1]: Nothing to be done for 'all'. 
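With the core libspdk_* libraries compiled and symlinked, the build moves on to the example applications below; each LINK step resolves against the shared objects just produced. Once the matching spdk_*.pc files are installed (see the INSTALL lines further down), an out-of-tree consumer could link the same way. A hypothetical one-liner, assuming pkg-config can see /usr/local/lib/pkgconfig; my_app.c and the chosen modules are illustrative, not part of this log:

    # Hypothetical consumer build against the installed shared libraries.
    cc -o my_app my_app.c $(pkg-config --cflags --libs spdk_nvme spdk_env_dpdk)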
00:53:58.407 CC app/trace_record/trace_record.o 00:53:58.407 CXX app/trace/trace.o 00:53:58.407 CC examples/ioat/perf/perf.o 00:53:58.407 CC examples/sock/hello_world/hello_sock.o 00:53:58.407 CC examples/vmd/lsvmd/lsvmd.o 00:53:58.407 CC examples/accel/perf/accel_perf.o 00:53:58.407 CC examples/nvme/hello_world/hello_world.o 00:53:58.407 CC examples/blob/hello_world/hello_blob.o 00:53:58.665 CC examples/nvmf/nvmf/nvmf.o 00:53:58.665 CC examples/bdev/hello_world/hello_bdev.o 00:53:58.665 LINK lsvmd 00:53:58.665 LINK spdk_trace_record 00:53:58.665 LINK hello_sock 00:53:58.665 LINK ioat_perf 00:53:58.665 LINK hello_world 00:53:58.665 LINK hello_blob 00:53:58.924 LINK hello_bdev 00:53:58.924 LINK spdk_trace 00:53:58.924 LINK nvmf 00:53:58.924 CC examples/vmd/led/led.o 00:53:58.924 CC examples/ioat/verify/verify.o 00:53:58.924 CC examples/blob/cli/blobcli.o 00:53:58.924 CC examples/nvme/reconnect/reconnect.o 00:53:58.924 CC examples/bdev/bdevperf/bdevperf.o 00:53:58.924 CC examples/nvme/nvme_manage/nvme_manage.o 00:53:58.924 LINK accel_perf 00:53:59.182 LINK led 00:53:59.182 CC app/iscsi_tgt/iscsi_tgt.o 00:53:59.182 CC app/nvmf_tgt/nvmf_main.o 00:53:59.182 LINK verify 00:53:59.182 CC app/spdk_tgt/spdk_tgt.o 00:53:59.182 CC app/spdk_lspci/spdk_lspci.o 00:53:59.182 CC app/spdk_nvme_perf/perf.o 00:53:59.182 LINK iscsi_tgt 00:53:59.182 LINK nvmf_tgt 00:53:59.182 LINK reconnect 00:53:59.439 CC app/spdk_nvme_identify/identify.o 00:53:59.439 LINK spdk_lspci 00:53:59.439 LINK spdk_tgt 00:53:59.439 LINK blobcli 00:53:59.439 CC examples/nvme/arbitration/arbitration.o 00:53:59.439 CC examples/nvme/hotplug/hotplug.o 00:53:59.439 CC app/spdk_nvme_discover/discovery_aer.o 00:53:59.697 LINK nvme_manage 00:53:59.697 CC examples/nvme/cmb_copy/cmb_copy.o 00:53:59.697 CC app/spdk_top/spdk_top.o 00:53:59.697 LINK spdk_nvme_discover 00:53:59.697 LINK hotplug 00:53:59.697 CC examples/util/zipf/zipf.o 00:53:59.697 LINK cmb_copy 00:53:59.697 LINK bdevperf 00:53:59.697 LINK arbitration 00:53:59.954 LINK zipf 00:53:59.954 CC examples/nvme/abort/abort.o 00:53:59.954 CC examples/thread/thread/thread_ex.o 00:53:59.954 CC app/vhost/vhost.o 00:53:59.954 CC app/spdk_dd/spdk_dd.o 00:53:59.954 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:54:00.213 CC examples/interrupt_tgt/interrupt_tgt.o 00:54:00.213 LINK spdk_nvme_perf 00:54:00.213 LINK vhost 00:54:00.213 LINK thread 00:54:00.213 LINK pmr_persistence 00:54:00.213 LINK abort 00:54:00.213 LINK interrupt_tgt 00:54:00.473 LINK spdk_dd 00:54:00.731 LINK spdk_top 00:54:00.731 LINK spdk_nvme_identify 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/accel.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/accel_module.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/barrier.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/base64.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/assert.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev.h 00:54:01.298 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev_zone.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bit_array.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bit_pool.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blob.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/bdev_module.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blobfs.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blob_bdev.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/blobfs_bdev.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/conf.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/cpuset.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/config.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc16.h 00:54:01.298 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc32.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/crc64.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/dif.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/dma.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/endian.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/env.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/event.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/fd.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/env_dpdk.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/fd_group.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/file.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/hexlify.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/gpt_spec.h 00:54:01.299 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ftl.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/histogram_data.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/idxd.h 00:54:01.299 cp /home/vagrant/spdk_repo/spdk/scripts/rpc.py /home/vagrant/spdk_repo/spdk/build/bin/spdk_rpc 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/idxd_spec.h 00:54:01.299 cp /home/vagrant/spdk_repo/spdk/scripts/spdkcli.py /home/vagrant/spdk_repo/spdk/build/bin/spdk_cli 00:54:01.299 chmod +x /home/vagrant/spdk_repo/spdk/build/bin/spdk_rpc 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/init.h 00:54:01.299 chmod +x /home/vagrant/spdk_repo/spdk/build/bin/spdk_cli 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ioat_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ioat.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/json.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/iscsi_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_rpc 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/jsonrpc.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/keyring.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_cli 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/keyring_module.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/log.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/lvol.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/likely.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/memory.h 00:54:01.299 patchelf: not an ELF executable 00:54:01.299 patchelf: not an ELF executable 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/mmio.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nbd.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/notify.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme.h 00:54:01.299 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_ocssd.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_ocssd_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_intel.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvme_zns.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_fc_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_cmd.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/nvmf_transport.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/opal.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/opal_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/pci_ids.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/pipe.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/queue.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/queue_extras.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/reduce.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/rpc.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scheduler.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scsi.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/scsi_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/stdinc.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/string.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/sock.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/thread.h 
00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/trace.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/trace_parser.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/tree.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/ublk.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/util.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/uuid.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/version.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfio_user_pci.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfio_user_spec.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vhost.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vmd.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/vfu_target.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/xor.h 00:54:01.299 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/include/spdk/zipf.h 00:54:01.299 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 00:54:01.559 libtool: warning: remember to run 'libtool --finish /usr/local/lib' 00:54:01.559 Processing /home/vagrant/spdk_repo/spdk/python 00:54:01.559 DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default. 00:54:01.559 pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555. 00:54:01.817 Using legacy 'setup.py install' for spdk, since package 'wheel' is not installed. 
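The two libtool reminders above ("remember to run 'libtool --finish /usr/local/lib'") are addressed at the packaging level instead: the generated spec ships an ld.so.conf.d entry (the heredoc in its %build section) and runs ldconfig from its %post scriptlets, so the freshly installed /usr/local/lib objects become visible to the dynamic linker without a manual libtool step. Restated as a standalone sketch of what the installed package effectively does (run as root):

    # Mirrors the ld.so.conf.d heredoc and %post ldconfig from the spec above.
    cat > /etc/ld.so.conf.d/spdk.conf <<-EOF
    /usr/local/lib
    /usr/local/lib/dpdk
    /usr/local/lib/libvfio-user
    EOF
    ldconfig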
00:54:01.817 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_log.a 00:54:02.074 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_log.pc 00:54:02.074 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_log.so 00:54:02.074 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ut_mock.a 00:54:02.074 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ut_mock.pc 00:54:02.074 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ut_mock.so 00:54:02.074 Installing collected packages: spdk 00:54:02.074 Running setup.py install for spdk: started 00:54:02.074 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_util.a 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_util.pc 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_util.so 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ioat.a 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace_parser.a 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ioat.pc 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_trace_parser.pc 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_dma.a 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ioat.so 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_dma.pc 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace_parser.so 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_dma.so 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vfio_user.a 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vfio_user.pc 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vfio_user.so 00:54:02.331 Running setup.py install for spdk: finished with status 'done' 00:54:02.331 Successfully installed spdk-24.9rc0 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vmd.a 00:54:02.331 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vmd.pc 00:54:02.331 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_json.a 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_conf.a 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_json.pc 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_conf.pc 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vmd.so 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_dpdklibs.pc 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_json.so 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_conf.so 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk.a 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk.pc 00:54:02.589 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk.so 00:54:02.589 rm -rf /home/vagrant/spdk_repo/spdk/python/spdk.egg-info 00:54:02.848 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_jsonrpc.a 00:54:02.848 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_jsonrpc.pc 00:54:02.848 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_jsonrpc.so 00:54:03.106 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_rpc.a 00:54:03.106 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_rpc.pc 00:54:03.106 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_rpc.so 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace.a 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_notify.a 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring.a 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_trace.pc 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_notify.pc 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring.pc 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_trace.so 00:54:03.675 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_notify.so 00:54:03.675 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring.so 00:54:03.934 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_thread.a 00:54:03.934 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_thread.pc 00:54:03.934 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock.a 00:54:03.934 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock.pc 00:54:03.934 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_thread.so 00:54:03.934 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock.so 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_init.a 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_init.pc 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob.a 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel.a 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blob.pc 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_virtio.a 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel.pc 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_init.so 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_virtio.pc 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob.so 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel.so 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvme.a 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_virtio.so 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nvme.pc 00:54:04.193 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvme.so 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs.a 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev.a 00:54:04.452 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs.pc 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event.a 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev.pc 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_lvol.a 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event.pc 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_lvol.pc 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev.so 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs.so 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event.so 00:54:04.452 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_lvol.so 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nbd.a 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvmf.a 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scsi.a 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nbd.pc 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_nvmf.pc 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ftl.a 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scsi.pc 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nbd.so 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_ftl.pc 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_nvmf.so 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scsi.so 00:54:04.711 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_ftl.so 00:54:05.279 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vhost.a 00:54:05.279 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_vhost.pc 00:54:05.279 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_iscsi.a 00:54:05.279 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_iscsi.pc 00:54:05.279 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_vhost.so 00:54:05.279 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_iscsi.so 00:54:05.539 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.a 00:54:05.539 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_env_dpdk_rpc.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_env_dpdk_rpc.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dpdk_governor.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_dpdk_governor.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_ioat.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_error.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_file.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_error.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_ioat.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_file.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_gscheduler.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock_posix.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dpdk_governor.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_dynamic.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_gscheduler.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob_bdev.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_linux.a 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_posix.pc 00:54:05.799 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_error.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_accel_ioat.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_file.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_dynamic.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blob_bdev.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_linux.pc 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_scheduler_gscheduler.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_sock_posix.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blob_bdev.so 00:54:05.799 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_keyring_linux.so 00:54:06.059 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_delay.a 00:54:06.059 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_malloc.a 00:54:06.059 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_lvol.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_raid.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_delay.pc 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_malloc.pc 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_blobfs_bdev.pc 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_raid.pc 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_lvol.pc 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_error.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_nvme.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_gpt.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_delay.so 00:54:06.319 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_malloc.so 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_passthru.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_null.a 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_raid.so 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_error.pc 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_blobfs_bdev.so 00:54:06.319 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_nvme.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_lvol.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_passthru.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_gpt.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_null.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_passthru.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_error.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_nvme.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_null.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_gpt.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.a 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_split.a 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_zone_block.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_split.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_aio.a 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_virtio.a 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_split.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_ftl.a 00:54:06.320 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_zone_block.so 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_aio.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_virtio.pc 00:54:06.320 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_ftl.pc 00:54:06.578 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_virtio.so 00:54:06.578 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_ftl.so 00:54:06.578 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_bdev_aio.so 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vmd.a 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iobuf.a 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vmd.pc 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iobuf.pc 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_keyring.a 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.a 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scheduler.a 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vmd.so 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_keyring.pc 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_blk.pc 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iobuf.so 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_sock.a 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scheduler.pc 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_sock.pc 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_blk.so 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_keyring.so 00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scheduler.so 
00:54:07.146 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_sock.so 00:54:07.405 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_accel.a 00:54:07.405 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_accel.pc 00:54:07.405 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_accel.so 00:54:07.664 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_bdev.a 00:54:07.664 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_bdev.pc 00:54:07.664 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_bdev.so 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nvmf.a 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nbd.a 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scsi.a 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nvmf.pc 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_nbd.pc 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_scsi.pc 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nbd.so 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_scsi.so 00:54:08.233 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_nvmf.so 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iscsi.a 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.a 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_vhost_scsi.pc 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_event_iscsi.pc 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_vhost_scsi.so 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk_event_iscsi.so 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_bdev_modules.pc 00:54:08.492 INSTALL 
/home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_accel_modules.pc 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_sock_modules.pc 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_syslibs.pc 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_scheduler_modules.pc 00:54:08.492 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/pkgconfig/spdk_keyring_modules.pc 00:54:08.751 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/lib/libspdk.so 00:54:09.010 make[1]: Nothing to be done for 'install'. 00:54:09.010 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_trace 00:54:09.010 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_trace_record 00:54:09.010 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/nvmf_tgt 00:54:09.269 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/iscsi_tgt 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_lspci 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_tgt 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_discover 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_perf 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_top 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_nvme_identify 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/vhost 00:54:09.527 INSTALL /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local/bin/spdk_dd 00:54:09.527 Installed to /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64/usr/local 00:54:09.785 Processing files: spdk-v24.09-1.x86_64 00:54:10.043 Provides: spdk = v24.09-1 spdk(x86-64) = v24.09-1 00:54:10.043 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:54:10.043 Requires: /usr/bin/env libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.3.3)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.7)(64bit) libcrypto.so.3()(64bit) libfuse3.so.3()(64bit) libgcc_s.so.1()(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libm.so.6()(64bit) libmenu.so.6()(64bit) libncurses.so.6()(64bit) libpanel.so.6()(64bit) 
librte_bus_pci.so.23()(64bit) librte_cryptodev.so.23()(64bit) librte_dmadev.so.23()(64bit) librte_eal.so.23()(64bit) librte_ethdev.so.23()(64bit) librte_hash.so.23()(64bit) librte_kvargs.so.23()(64bit) librte_mbuf.so.23()(64bit) librte_mempool.so.23()(64bit) librte_mempool_ring.so.23()(64bit) librte_net.so.23()(64bit) librte_pci.so.23()(64bit) librte_power.so.23()(64bit) librte_rcu.so.23()(64bit) librte_ring.so.23()(64bit) librte_telemetry.so.23()(64bit) librte_vhost.so.23()(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libstdc++.so.6()(64bit) libtinfo.so.6()(64bit) libuuid.so.1()(64bit) rtld(GNU_HASH) 00:54:10.043 Processing files: spdk-devel-v24.09-1.x86_64 00:54:12.572 Provides: libisal_crypto.so.2()(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) 
libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) spdk-devel = v24.09-1 spdk-devel(x86-64) = v24.09-1 00:54:12.572 Requires(interp): /bin/sh 00:54:12.572 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:54:12.572 Requires(post): /bin/sh 00:54:12.573 Requires: ld-linux-x86-64.so.2()(64bit) ld-linux-x86-64.so.2(GLIBC_2.3)(64bit) libaio.so.1()(64bit) libaio.so.1(LIBAIO_0.1)(64bit) libaio.so.1(LIBAIO_0.4)(64bit) libc.so.6()(64bit) libc.so.6(GLIBC_2.10)(64bit) libc.so.6(GLIBC_2.14)(64bit) libc.so.6(GLIBC_2.16)(64bit) libc.so.6(GLIBC_2.17)(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.27)(64bit) libc.so.6(GLIBC_2.28)(64bit) libc.so.6(GLIBC_2.3)(64bit) libc.so.6(GLIBC_2.3.2)(64bit) libc.so.6(GLIBC_2.3.4)(64bit) libc.so.6(GLIBC_2.32)(64bit) libc.so.6(GLIBC_2.33)(64bit) libc.so.6(GLIBC_2.34)(64bit) libc.so.6(GLIBC_2.4)(64bit) libc.so.6(GLIBC_2.7)(64bit) libc.so.6(GLIBC_2.8)(64bit) libc.so.6(GLIBC_2.9)(64bit) libcrypto.so.3()(64bit) libcrypto.so.3(OPENSSL_3.0.0)(64bit) libfuse3.so.3()(64bit) libfuse3.so.3(FUSE_3.0)(64bit) libfuse3.so.3(FUSE_3.7)(64bit) libgcc_s.so.1()(64bit) libgcc_s.so.1(GCC_3.0)(64bit) libisal_crypto.so.2()(64bit) libkeyutils.so.1()(64bit) libkeyutils.so.1(KEYUTILS_0.3)(64bit) libm.so.6()(64bit) libm.so.6(GLIBC_2.29)(64bit) libspdk_accel.so.15.0()(64bit) libspdk_accel_error.so.2.0()(64bit) libspdk_accel_ioat.so.6.0()(64bit) libspdk_bdev.so.15.0()(64bit) libspdk_bdev_aio.so.6.0()(64bit) libspdk_bdev_delay.so.6.0()(64bit) libspdk_bdev_error.so.6.0()(64bit) libspdk_bdev_ftl.so.6.0()(64bit) libspdk_bdev_gpt.so.6.0()(64bit) libspdk_bdev_lvol.so.6.0()(64bit) 
libspdk_bdev_malloc.so.6.0()(64bit) libspdk_bdev_null.so.6.0()(64bit) libspdk_bdev_nvme.so.7.0()(64bit) libspdk_bdev_passthru.so.6.0()(64bit) libspdk_bdev_raid.so.6.0()(64bit) libspdk_bdev_split.so.6.0()(64bit) libspdk_bdev_virtio.so.6.0()(64bit) libspdk_bdev_zone_block.so.6.0()(64bit) libspdk_blob.so.11.0()(64bit) libspdk_blob_bdev.so.11.0()(64bit) libspdk_blobfs.so.10.0()(64bit) libspdk_blobfs_bdev.so.6.0()(64bit) libspdk_conf.so.6.0()(64bit) libspdk_dma.so.4.0()(64bit) libspdk_env_dpdk.so.14.1()(64bit) libspdk_env_dpdk_rpc.so.6.0()(64bit) libspdk_event.so.13.1()(64bit) libspdk_event_accel.so.6.0()(64bit) libspdk_event_bdev.so.6.0()(64bit) libspdk_event_iobuf.so.3.0()(64bit) libspdk_event_iscsi.so.6.0()(64bit) libspdk_event_keyring.so.1.0()(64bit) libspdk_event_nbd.so.6.0()(64bit) libspdk_event_nvmf.so.6.0()(64bit) libspdk_event_scheduler.so.4.0()(64bit) libspdk_event_scsi.so.6.0()(64bit) libspdk_event_sock.so.5.0()(64bit) libspdk_event_vhost_blk.so.3.0()(64bit) libspdk_event_vhost_scsi.so.3.0()(64bit) libspdk_event_vmd.so.6.0()(64bit) libspdk_ftl.so.9.0()(64bit) libspdk_init.so.5.0()(64bit) libspdk_ioat.so.7.0()(64bit) libspdk_iscsi.so.8.0()(64bit) libspdk_json.so.6.0()(64bit) libspdk_jsonrpc.so.6.0()(64bit) libspdk_keyring.so.1.0()(64bit) libspdk_keyring_file.so.1.0()(64bit) libspdk_keyring_linux.so.1.0()(64bit) libspdk_log.so.7.0()(64bit) libspdk_lvol.so.10.0()(64bit) libspdk_nbd.so.7.0()(64bit) libspdk_notify.so.6.0()(64bit) libspdk_nvme.so.13.0()(64bit) libspdk_nvmf.so.18.1()(64bit) libspdk_rpc.so.6.0()(64bit) libspdk_scheduler_dpdk_governor.so.4.0()(64bit) libspdk_scheduler_dynamic.so.4.0()(64bit) libspdk_scheduler_gscheduler.so.4.0()(64bit) libspdk_scsi.so.9.0()(64bit) libspdk_sock.so.9.0()(64bit) libspdk_sock_posix.so.6.0()(64bit) libspdk_thread.so.10.0()(64bit) libspdk_trace.so.10.0()(64bit) libspdk_trace_parser.so.5.0()(64bit) libspdk_ut_mock.so.6.0()(64bit) libspdk_util.so.9.0()(64bit) libspdk_vfio_user.so.5.0()(64bit) libspdk_vhost.so.8.0()(64bit) libspdk_virtio.so.7.0()(64bit) libspdk_vmd.so.6.0()(64bit) libssl.so.3()(64bit) libssl.so.3(OPENSSL_3.0.0)(64bit) libstdc++.so.6()(64bit) libstdc++.so.6(CXXABI_1.3)(64bit) libstdc++.so.6(GLIBCXX_3.4)(64bit) libuuid.so.1()(64bit) libuuid.so.1(UUID_1.0)(64bit) libuuid.so.1(UUID_2.31)(64bit) rtld(GNU_HASH) 00:54:12.573 Processing files: spdk-scripts-v24.09-1.x86_64 00:54:12.573 warning: absolute symlink: /etc/bash_completion.d/spdk -> /usr/libexec/spdk/scripts/bash-completion/spdk 00:54:12.573 warning: absolute symlink: /usr/libexec/spdk/include -> /usr/local/include 00:54:13.947 Provides: spdk-scripts = v24.09-1 spdk-scripts(x86-64) = v24.09-1 00:54:13.947 Requires(interp): /bin/sh 00:54:13.947 Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(FileDigests) <= 4.6.0-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1 00:54:13.947 Requires(post): /bin/sh 00:54:13.947 Requires: /bin/bash /usr/bin/env 00:54:13.947 Checking for unpackaged file(s): /usr/lib/rpm/check-files /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 00:54:13.947 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/srcrpm/spdk-v24.09-1.src.rpm 00:54:14.515 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-scripts-v24.09-1.x86_64.rpm 00:54:15.889 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-v24.09-1.x86_64.rpm 00:54:22.450 Wrote: /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-devel-v24.09-1.x86_64.rpm 00:54:22.450 
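Aside: the Provides:/Requires: lists above are produced automatically by rpmbuild's dependency generator from the packaged shared objects rather than maintained by hand. A minimal sketch of how the same metadata can be inspected on the freshly written packages (the .rpm paths are taken from the Wrote: lines above):

# Query auto-generated dependency metadata of a built (not yet installed) RPM;
# rpm -qp operates directly on the package file.
rpm -qp --provides /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-v24.09-1.x86_64.rpm
rpm -qp --requires /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-v24.09-1.x86_64.rpm
# List the payload to confirm which libraries and binaries were packaged.
rpm -qpl /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm/x86_64/spdk-devel-v24.09-1.x86_64.rpm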
Executing(%clean): /bin/sh -e /var/tmp/rpm-tmp.dn30CL 00:54:22.450 + umask 022 00:54:22.450 + cd /home/vagrant/spdk_repo/spdk 00:54:22.450 + /usr/bin/rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/buildroot/spdk-v24.09-1.x86_64 00:54:22.450 + RPM_EC=0 00:54:22.450 ++ jobs -p 00:54:22.450 + exit 0 00:54:22.450 Executing(--clean): /bin/sh -e /var/tmp/rpm-tmp.yyWLQ3 00:54:22.450 + umask 022 00:54:22.450 + cd /home/vagrant/spdk_repo/spdk 00:54:22.450 + RPM_EC=0 00:54:22.450 ++ jobs -p 00:54:22.450 + exit 0 00:54:22.450 12:56:45 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@120 -- $ [[ -n /home/vagrant/spdk_repo/dpdk/build ]] 00:54:22.450 12:56:45 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@121 -- $ mv /home/vagrant/spdk_repo/dpdk/build /home/vagrant/spdk_repo/dpdk/build.hidden 00:54:22.450 12:56:45 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@123 -- $ install_uninstall_rpms 00:54:22.450 12:56:45 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@98 -- $ local rpms 00:54:22.450 12:56:45 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@100 -- $ rpms=("${1:-$builddir/rpm/}/$arch/"*.rpm) 00:54:22.450 12:56:45 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@103 -- $ make -C /home/vagrant/spdk_repo/spdk clean -j10 00:54:22.450 make: Entering directory '/home/vagrant/spdk_repo/spdk' 00:54:22.450 make[1]: Nothing to be done for 'clean'. 00:54:29.018 make[1]: Nothing to be done for 'clean'. 00:54:29.277 make: Leaving directory '/home/vagrant/spdk_repo/spdk' 00:54:29.277 12:56:52 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@105 -- $ sudo rpm -i /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-devel-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-scripts-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-v24.09-1.x86_64.rpm 00:54:29.277 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:54:30.213 12:56:53 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@108 -- $ LIST_LIBS=yes 00:54:30.213 12:56:53 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@108 -- $ /home/vagrant/spdk_repo/spdk/rpmbuild/rpm-deps.sh spdk_tgt 00:54:30.213 /usr/local/bin/spdk_tgt 00:54:32.748 /usr/lib64/libaio.so.1:libaio-0.3.111-13.el9.x86_64 00:54:32.748 /usr/lib64/libc.so.6:glibc-2.34-83.el9.12.x86_64 00:54:32.748 /usr/lib64/libcrypto.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:54:32.748 /usr/lib64/libfuse3.so.3:fuse3-libs-3.10.2-6.el9.x86_64 00:54:32.748 /usr/lib64/libgcc_s.so.1:libgcc-11.4.1-2.1.el9.x86_64 00:54:32.748 /usr/lib64/libkeyutils.so.1:keyutils-libs-1.6.3-1.el9.x86_64 00:54:32.748 /usr/lib64/libm.so.6:glibc-2.34-83.el9.12.x86_64 00:54:32.748 /usr/lib64/libnuma.so.1:numactl-libs-2.0.16-1.el9.x86_64 00:54:32.748 /usr/lib64/libssl.so.3:openssl-libs-3.0.7-25.el9_3.x86_64 00:54:32.748 /usr/lib64/libuuid.so.1:libuuid-2.37.4-15.el9.x86_64 00:54:32.748 /usr/lib64/libz.so.1:zlib-1.2.11-40.el9.x86_64 00:54:32.748 /usr/local/lib/libisal_crypto.so.2:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/librte_bus_pci.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_cryptodev.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_dmadev.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_eal.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 
/usr/local/lib/librte_ethdev.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_hash.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_kvargs.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_mbuf.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_mempool.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_mempool_ring.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_meter.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_net.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_pci.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_power.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_rcu.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_ring.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_telemetry.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_timer.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/librte_vhost.so.23:dpdk-devel-22.11-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_accel.so.15.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_accel_error.so.2.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_accel_ioat.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev.so.15.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_aio.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_delay.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_error.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_ftl.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_gpt.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_lvol.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_malloc.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_null.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_nvme.so.7.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_passthru.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_raid.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_split.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_virtio.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_bdev_zone_block.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_blob.so.11.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_blob_bdev.so.11.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_blobfs.so.10.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_blobfs_bdev.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_conf.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_dma.so.4.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_env_dpdk.so.14.1:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_env_dpdk_rpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event.so.13.1:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_accel.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_bdev.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_iobuf.so.3.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 
/usr/local/lib/libspdk_event_iscsi.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_keyring.so.1.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_nbd.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_nvmf.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_scheduler.so.4.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_scsi.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_sock.so.5.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_vhost_blk.so.3.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_vhost_scsi.so.3.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_event_vmd.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_ftl.so.9.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_init.so.5.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_ioat.so.7.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_iscsi.so.8.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_json.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_jsonrpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_keyring.so.1.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_keyring_file.so.1.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_keyring_linux.so.1.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_log.so.7.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_lvol.so.10.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_nbd.so.7.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_notify.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.748 /usr/local/lib/libspdk_nvme.so.13.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_nvmf.so.18.1:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_rpc.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_scheduler_dpdk_governor.so.4.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_scheduler_dynamic.so.4.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_scheduler_gscheduler.so.4.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_scsi.so.9.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_sock.so.9.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_sock_posix.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_thread.so.10.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_trace.so.10.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_util.so.9.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_vfio_user.so.5.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_vhost.so.8.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_virtio.so.7.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 /usr/local/lib/libspdk_vmd.so.6.0:spdk-devel-v24.09-1.x86_64 00:54:32.749 12:56:55 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@109 -- $ rm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-devel-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-scripts-v24.09-1.x86_64.rpm /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm/rpm//x86_64/spdk-v24.09-1.x86_64.rpm 00:54:32.749 12:56:55 packaging.rpm_packaging.build_shared_native_dpdk_rpm 
-- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]##*/}") 00:54:32.749 12:56:55 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@110 -- $ rpms=("${rpms[@]%.rpm}") 00:54:32.749 12:56:55 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@111 -- $ sudo rpm -e spdk-devel-v24.09-1.x86_64 spdk-scripts-v24.09-1.x86_64 spdk-v24.09-1.x86_64 00:54:32.749 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:54:32.749 12:56:56 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@124 -- $ [[ -n /home/vagrant/spdk_repo/dpdk/build ]] 00:54:32.749 12:56:56 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@125 -- $ mv /home/vagrant/spdk_repo/dpdk/build.hidden /home/vagrant/spdk_repo/dpdk/build 00:54:32.749 12:56:56 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- rpm/rpm.sh@94 -- $ sudo rpm -e dpdk-devel 00:54:32.749 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:54:33.008 00:54:33.008 real 2m42.275s 00:54:33.008 user 6m18.466s 00:54:33.008 sys 3m34.866s 00:54:33.008 12:56:56 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:54:33.008 12:56:56 packaging.rpm_packaging.build_shared_native_dpdk_rpm -- common/autotest_common.sh@10 -- $ set +x 00:54:33.008 ************************************ 00:54:33.008 END TEST build_shared_native_dpdk_rpm 00:54:33.008 ************************************ 00:54:33.268 12:56:56 packaging.rpm_packaging -- rpm/rpm.sh@1 -- $ cleanup 00:54:33.268 12:56:56 packaging.rpm_packaging -- rpm/rpm.sh@24 -- $ rm -rf /home/vagrant/spdk_repo/spdk/test/packaging/rpm/test-rpm 00:54:33.268 12:56:56 packaging.rpm_packaging -- rpm/rpm.sh@25 -- $ rpm --eval '%{_topdir}' 00:54:33.268 12:56:56 packaging.rpm_packaging -- rpm/rpm.sh@25 -- $ rm -rf /home/vagrant/rpmbuild 00:54:33.268 12:56:56 packaging.rpm_packaging -- rpm/rpm.sh@26 -- $ rm -rf /tmp/spdk-test_gen_spec 00:54:33.835 12:56:57 packaging.rpm_packaging -- rpm/rpm.sh@27 -- $ rpm -qa 00:54:33.835 12:56:57 packaging.rpm_packaging -- rpm/rpm.sh@27 -- $ grep -E 'spdk|dpdk' 00:54:34.773 12:56:58 packaging.rpm_packaging -- rpm/rpm.sh@27 -- $ sudo rpm -e 00:54:34.773 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 00:54:34.773 rpm: no packages given for erase 00:54:34.773 12:56:58 packaging.rpm_packaging -- rpm/rpm.sh@27 -- $ true 00:54:34.773 00:54:34.773 real 10m46.538s 00:54:34.773 user 30m6.623s 00:54:34.773 sys 14m58.307s 00:54:34.773 12:56:58 packaging.rpm_packaging -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:54:34.773 12:56:58 packaging.rpm_packaging -- common/autotest_common.sh@10 -- $ set +x 00:54:34.773 ************************************ 00:54:34.773 END TEST rpm_packaging 00:54:34.773 ************************************ 00:54:34.773 00:54:34.773 real 10m46.702s 00:54:34.773 user 30m6.684s 00:54:34.773 sys 14m58.412s 00:54:34.773 12:56:58 packaging -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:54:34.773 12:56:58 packaging -- common/autotest_common.sh@10 -- $ set +x 00:54:34.773 ************************************ 00:54:34.773 END TEST packaging 00:54:34.773 ************************************ 00:54:34.773 12:56:58 -- spdk/autopackage.sh@15 -- $ make clean 00:55:13.485 make[1]: Nothing to be done for 'clean'. 00:55:13.485 make[1]: Nothing to be done for 'clean'. 
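Aside: the cleanup traced above queries installed packages with rpm -qa, filters for spdk/dpdk, and tolerates an empty match (hence "rpm: no packages given for erase" followed by true). A minimal sketch of that guard pattern, written independently of the repo's actual rpm.sh:

# Erase any leftover spdk/dpdk packages; do nothing, and do not fail, if none remain.
pkgs=$(rpm -qa | grep -E 'spdk|dpdk' || true)
if [ -n "$pkgs" ]; then
    sudo rpm -e $pkgs
fi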
00:55:13.485 12:57:36 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:55:13.485 12:57:36 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:55:13.485 12:57:36 -- spdk/autopackage.sh@23 -- $ timing_enter build_release 00:55:13.485 12:57:36 -- common/autotest_common.sh@723 -- $ xtrace_disable 00:55:13.485 12:57:36 -- common/autotest_common.sh@10 -- $ set +x 00:55:13.485 12:57:36 -- spdk/autopackage.sh@26 -- $ [[ '' == *clang* ]] 00:55:13.485 12:57:36 -- spdk/autopackage.sh@36 -- $ [[ -n v22.11.4 ]] 00:55:13.485 12:57:36 -- spdk/autopackage.sh@36 -- $ [[ -e /tmp/spdk-ld-path ]] 00:55:13.485 12:57:36 -- spdk/autopackage.sh@37 -- $ source /tmp/spdk-ld-path 00:55:13.485 12:57:36 -- tmp/spdk-ld-path@1 -- $ export LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:55:13.485 12:57:36 -- tmp/spdk-ld-path@1 -- $ LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:55:13.485 12:57:36 -- tmp/spdk-ld-path@2 -- $ export PKG_CONFIG_PATH= 00:55:13.485 12:57:36 -- tmp/spdk-ld-path@2 -- $ PKG_CONFIG_PATH= 00:55:13.485 12:57:36 -- spdk/autopackage.sh@40 -- $ get_config_params 00:55:13.485 12:57:36 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:55:13.485 12:57:36 -- spdk/autopackage.sh@40 -- $ sed s/--enable-debug//g 00:55:13.485 12:57:36 -- common/autotest_common.sh@10 -- $ set +x 00:55:13.485 12:57:36 -- spdk/autopackage.sh@40 -- $ config_params=' --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --enable-asan --enable-coverage --with-dpdk=/home/vagrant/spdk_repo/dpdk/build' 00:55:13.485 12:57:36 -- spdk/autopackage.sh@41 -- $ /home/vagrant/spdk_repo/spdk/configure --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --enable-asan --enable-coverage --with-dpdk=/home/vagrant/spdk_repo/dpdk/build --enable-lto --disable-unit-tests 00:55:13.485 Using /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig for additional libs... 00:55:13.485 DPDK libraries: /home/vagrant/spdk_repo/dpdk/build/lib 00:55:13.485 DPDK includes: //home/vagrant/spdk_repo/dpdk/build/include 00:55:13.485 Using default SPDK env in /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:55:13.744 Using 'verbs' RDMA provider 00:55:29.636 Configuring ISA-L (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal.log)...done. 00:55:41.851 Configuring ISA-L-crypto (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal-crypto.log)...done. 00:55:41.851 Creating mk/config.mk...done. 00:55:41.851 Creating mk/cc.flags.mk...done. 00:55:41.851 Type 'make' to build. 00:55:41.851 12:58:05 -- spdk/autopackage.sh@43 -- $ make -j10 00:55:42.109 make[1]: Nothing to be done for 'all'. 
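Aside: because configure is invoked with --with-dpdk=/home/vagrant/spdk_repo/dpdk/build, the resulting binaries resolve their librte_* dependencies from that external DPDK tree via the LD_LIBRARY_PATH exported from /tmp/spdk-ld-path above. A minimal sketch for double-checking that linkage once the build finishes; the spdk_tgt output path follows SPDK's usual build layout and is an assumption here, not taken from this log:

# Re-export the library search path recorded by the earlier DPDK build.
source /tmp/spdk-ld-path
# Assumed binary location (standard SPDK build layout): confirm librte_* resolve
# from /home/vagrant/spdk_repo/dpdk/build/lib rather than a system copy.
ldd /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt | grep librte_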
00:56:14.222 CC lib/ut_mock/mock.o 00:56:14.222 CC lib/log/log.o 00:56:14.222 CC lib/log/log_flags.o 00:56:14.222 CC lib/log/log_deprecated.o 00:56:14.222 CC lib/ut/ut.o 00:56:14.222 LIB libspdk_ut_mock.a 00:56:14.222 LIB libspdk_ut.a 00:56:14.222 LIB libspdk_log.a 00:56:14.222 CC lib/dma/dma.o 00:56:14.222 CC lib/util/base64.o 00:56:14.222 CC lib/util/bit_array.o 00:56:14.222 CC lib/util/cpuset.o 00:56:14.222 CC lib/util/crc16.o 00:56:14.222 CC lib/util/crc32.o 00:56:14.222 CC lib/util/crc32c.o 00:56:14.222 CC lib/ioat/ioat.o 00:56:14.222 CXX lib/trace_parser/trace.o 00:56:14.222 CC lib/util/crc32_ieee.o 00:56:14.222 LIB libspdk_dma.a 00:56:14.222 CC lib/vfio_user/host/vfio_user_pci.o 00:56:14.222 CC lib/util/crc64.o 00:56:14.222 CC lib/vfio_user/host/vfio_user.o 00:56:14.222 CC lib/util/dif.o 00:56:14.222 CC lib/util/fd.o 00:56:14.222 CC lib/util/file.o 00:56:14.222 CC lib/util/hexlify.o 00:56:14.222 LIB libspdk_ioat.a 00:56:14.222 CC lib/util/iov.o 00:56:14.222 CC lib/util/math.o 00:56:14.222 CC lib/util/pipe.o 00:56:14.222 CC lib/util/strerror_tls.o 00:56:14.222 CC lib/util/string.o 00:56:14.222 CC lib/util/uuid.o 00:56:14.222 LIB libspdk_vfio_user.a 00:56:14.222 CC lib/util/fd_group.o 00:56:14.222 CC lib/util/xor.o 00:56:14.222 CC lib/util/zipf.o 00:56:14.222 LIB libspdk_trace_parser.a 00:56:14.222 LIB libspdk_util.a 00:56:14.222 CC lib/idxd/idxd_user.o 00:56:14.222 CC lib/idxd/idxd.o 00:56:14.222 CC lib/rdma/common.o 00:56:14.222 CC lib/rdma/rdma_verbs.o 00:56:14.222 CC lib/json/json_parse.o 00:56:14.222 CC lib/json/json_util.o 00:56:14.222 CC lib/env_dpdk/env.o 00:56:14.222 CC lib/json/json_write.o 00:56:14.222 CC lib/vmd/vmd.o 00:56:14.222 CC lib/conf/conf.o 00:56:14.222 CC lib/vmd/led.o 00:56:14.222 CC lib/env_dpdk/memory.o 00:56:14.222 CC lib/env_dpdk/pci.o 00:56:14.222 CC lib/env_dpdk/init.o 00:56:14.222 LIB libspdk_json.a 00:56:14.222 LIB libspdk_rdma.a 00:56:14.222 LIB libspdk_idxd.a 00:56:14.222 CC lib/env_dpdk/threads.o 00:56:14.222 CC lib/env_dpdk/pci_ioat.o 00:56:14.222 LIB libspdk_conf.a 00:56:14.222 CC lib/env_dpdk/pci_virtio.o 00:56:14.222 CC lib/env_dpdk/pci_vmd.o 00:56:14.222 LIB libspdk_vmd.a 00:56:14.222 CC lib/env_dpdk/pci_idxd.o 00:56:14.222 CC lib/env_dpdk/pci_event.o 00:56:14.222 CC lib/env_dpdk/sigbus_handler.o 00:56:14.222 CC lib/env_dpdk/pci_dpdk.o 00:56:14.222 CC lib/env_dpdk/pci_dpdk_2207.o 00:56:14.222 CC lib/env_dpdk/pci_dpdk_2211.o 00:56:14.222 CC lib/jsonrpc/jsonrpc_server.o 00:56:14.222 CC lib/jsonrpc/jsonrpc_client.o 00:56:14.222 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:56:14.222 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:56:14.222 LIB libspdk_jsonrpc.a 00:56:14.222 LIB libspdk_env_dpdk.a 00:56:14.222 CC lib/rpc/rpc.o 00:56:14.222 LIB libspdk_rpc.a 00:56:14.222 CC lib/trace/trace_flags.o 00:56:14.222 CC lib/trace/trace.o 00:56:14.222 CC lib/trace/trace_rpc.o 00:56:14.222 CC lib/keyring/keyring_rpc.o 00:56:14.222 CC lib/keyring/keyring.o 00:56:14.222 CC lib/notify/notify.o 00:56:14.222 CC lib/notify/notify_rpc.o 00:56:14.222 LIB libspdk_notify.a 00:56:14.222 LIB libspdk_trace.a 00:56:14.222 LIB libspdk_keyring.a 00:56:14.222 CC lib/thread/thread.o 00:56:14.222 CC lib/thread/iobuf.o 00:56:14.222 CC lib/sock/sock.o 00:56:14.222 CC lib/sock/sock_rpc.o 00:56:14.222 LIB libspdk_sock.a 00:56:14.222 LIB libspdk_thread.a 00:56:14.222 CC lib/nvme/nvme_ctrlr_cmd.o 00:56:14.222 CC lib/nvme/nvme_ctrlr.o 00:56:14.222 CC lib/nvme/nvme_fabric.o 00:56:14.222 CC lib/nvme/nvme_ns_cmd.o 00:56:14.222 CC lib/nvme/nvme_ns.o 00:56:14.222 CC lib/nvme/nvme_pcie_common.o
00:56:14.222 CC lib/blob/blobstore.o 00:56:14.222 CC lib/accel/accel.o 00:56:14.222 CC lib/virtio/virtio.o 00:56:14.222 CC lib/init/json_config.o 00:56:14.222 CC lib/init/subsystem.o 00:56:14.222 CC lib/virtio/virtio_vhost_user.o 00:56:14.222 CC lib/accel/accel_rpc.o 00:56:14.222 CC lib/accel/accel_sw.o 00:56:14.222 CC lib/blob/request.o 00:56:14.222 CC lib/init/subsystem_rpc.o 00:56:14.222 CC lib/virtio/virtio_vfio_user.o 00:56:14.222 CC lib/virtio/virtio_pci.o 00:56:14.223 CC lib/blob/zeroes.o 00:56:14.223 CC lib/blob/blob_bs_dev.o 00:56:14.223 CC lib/nvme/nvme_pcie.o 00:56:14.223 CC lib/init/rpc.o 00:56:14.223 CC lib/nvme/nvme_qpair.o 00:56:14.223 LIB libspdk_accel.a 00:56:14.223 CC lib/nvme/nvme.o 00:56:14.223 CC lib/nvme/nvme_quirks.o 00:56:14.223 LIB libspdk_virtio.a 00:56:14.223 CC lib/nvme/nvme_transport.o 00:56:14.223 CC lib/nvme/nvme_discovery.o 00:56:14.223 LIB libspdk_blob.a 00:56:14.223 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:56:14.223 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:56:14.223 LIB libspdk_init.a 00:56:14.223 CC lib/bdev/bdev.o 00:56:14.223 CC lib/blobfs/blobfs.o 00:56:14.480 CC lib/bdev/bdev_rpc.o 00:56:14.480 CC lib/bdev/bdev_zone.o 00:56:14.480 CC lib/bdev/part.o 00:56:14.480 CC lib/nvme/nvme_tcp.o 00:56:14.480 CC lib/nvme/nvme_opal.o 00:56:14.737 CC lib/blobfs/tree.o 00:56:14.737 CC lib/bdev/scsi_nvme.o 00:56:14.737 CC lib/nvme/nvme_io_msg.o 00:56:14.737 CC lib/nvme/nvme_poll_group.o 00:56:14.737 CC lib/nvme/nvme_zns.o 00:56:14.737 CC lib/event/app.o 00:56:14.737 LIB libspdk_blobfs.a 00:56:14.737 CC lib/lvol/lvol.o 00:56:14.737 CC lib/event/reactor.o 00:56:14.737 CC lib/event/log_rpc.o 00:56:14.737 CC lib/event/app_rpc.o 00:56:14.995 CC lib/nvme/nvme_stubs.o 00:56:14.995 CC lib/nvme/nvme_auth.o 00:56:14.995 LIB libspdk_bdev.a 00:56:14.995 CC lib/nvme/nvme_cuse.o 00:56:14.995 CC lib/nvme/nvme_rdma.o 00:56:14.995 CC lib/event/scheduler_static.o 00:56:14.995 LIB libspdk_lvol.a 00:56:15.253 LIB libspdk_event.a 00:56:15.253 CC lib/scsi/dev.o 00:56:15.253 CC lib/scsi/lun.o 00:56:15.253 CC lib/scsi/scsi.o 00:56:15.253 CC lib/scsi/port.o 00:56:15.253 CC lib/nbd/nbd.o 00:56:15.253 CC lib/ftl/ftl_core.o 00:56:15.510 CC lib/ftl/ftl_init.o 00:56:15.510 CC lib/ftl/ftl_layout.o 00:56:15.510 CC lib/ftl/ftl_debug.o 00:56:15.510 CC lib/ftl/ftl_io.o 00:56:15.510 CC lib/scsi/scsi_bdev.o 00:56:15.510 CC lib/nbd/nbd_rpc.o 00:56:15.510 CC lib/ftl/ftl_sb.o 00:56:15.510 CC lib/ftl/ftl_l2p.o 00:56:15.510 CC lib/ftl/ftl_l2p_flat.o 00:56:15.510 CC lib/scsi/scsi_pr.o 00:56:15.510 CC lib/ftl/ftl_nv_cache.o 00:56:15.510 CC lib/scsi/scsi_rpc.o 00:56:15.510 CC lib/ftl/ftl_band.o 00:56:15.768 LIB libspdk_nbd.a 00:56:15.768 LIB libspdk_nvme.a 00:56:15.768 CC lib/ftl/ftl_band_ops.o 00:56:15.768 CC lib/ftl/ftl_writer.o 00:56:15.768 CC lib/ftl/ftl_rq.o 00:56:15.768 CC lib/ftl/ftl_reloc.o 00:56:15.768 CC lib/ftl/ftl_l2p_cache.o 00:56:15.768 CC lib/ftl/ftl_p2l.o 00:56:15.768 CC lib/ftl/mngt/ftl_mngt.o 00:56:15.768 CC lib/scsi/task.o 00:56:15.768 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:56:15.769 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:56:15.769 CC lib/ftl/mngt/ftl_mngt_startup.o 00:56:16.027 CC lib/ftl/mngt/ftl_mngt_md.o 00:56:16.027 CC lib/ftl/mngt/ftl_mngt_misc.o 00:56:16.027 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:56:16.027 LIB libspdk_scsi.a 00:56:16.027 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:56:16.027 CC lib/ftl/mngt/ftl_mngt_band.o 00:56:16.027 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:56:16.027 CC lib/nvmf/ctrlr.o 00:56:16.027 CC lib/nvmf/ctrlr_discovery.o 00:56:16.027 CC lib/ftl/mngt/ftl_mngt_p2l.o
00:56:16.027 CC lib/nvmf/ctrlr_bdev.o 00:56:16.027 CC lib/nvmf/subsystem.o 00:56:16.027 CC lib/nvmf/nvmf.o 00:56:16.027 CC lib/nvmf/nvmf_rpc.o 00:56:16.027 CC lib/nvmf/transport.o 00:56:16.285 CC lib/iscsi/conn.o 00:56:16.285 CC lib/vhost/vhost.o 00:56:16.285 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:56:16.285 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:56:16.285 CC lib/ftl/utils/ftl_conf.o 00:56:16.285 CC lib/ftl/utils/ftl_md.o 00:56:16.285 CC lib/ftl/utils/ftl_mempool.o 00:56:16.543 CC lib/ftl/utils/ftl_bitmap.o 00:56:16.543 CC lib/ftl/utils/ftl_property.o 00:56:16.543 CC lib/nvmf/tcp.o 00:56:16.543 CC lib/vhost/vhost_rpc.o 00:56:16.543 CC lib/iscsi/init_grp.o 00:56:16.543 CC lib/iscsi/iscsi.o 00:56:16.543 CC lib/iscsi/md5.o 00:56:16.543 CC lib/iscsi/param.o 00:56:16.543 CC lib/iscsi/portal_grp.o 00:56:16.543 CC lib/iscsi/tgt_node.o 00:56:16.543 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:56:16.543 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:56:16.801 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:56:16.801 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:56:16.801 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:56:16.801 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:56:16.801 CC lib/vhost/vhost_scsi.o 00:56:16.801 CC lib/vhost/vhost_blk.o 00:56:16.801 CC lib/vhost/rte_vhost_user.o 00:56:16.801 CC lib/iscsi/iscsi_subsystem.o 00:56:16.801 CC lib/iscsi/iscsi_rpc.o 00:56:16.801 CC lib/iscsi/task.o 00:56:16.801 CC lib/nvmf/stubs.o 00:56:16.801 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:56:16.801 CC lib/ftl/upgrade/ftl_sb_v3.o 00:56:17.059 CC lib/ftl/upgrade/ftl_sb_v5.o 00:56:17.059 CC lib/ftl/nvc/ftl_nvc_dev.o 00:56:17.059 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:56:17.059 CC lib/nvmf/mdns_server.o 00:56:17.059 LIB libspdk_iscsi.a 00:56:17.059 CC lib/nvmf/rdma.o 00:56:17.059 CC lib/nvmf/auth.o 00:56:17.059 CC lib/ftl/base/ftl_base_dev.o 00:56:17.059 CC lib/ftl/base/ftl_base_bdev.o 00:56:17.318 LIB libspdk_ftl.a 00:56:17.318 LIB libspdk_vhost.a 00:56:17.629 LIB libspdk_nvmf.a 00:56:17.888 CC module/env_dpdk/env_dpdk_rpc.o 00:56:18.146 CC module/accel/error/accel_error.o 00:56:18.146 CC module/accel/iaa/accel_iaa.o 00:56:18.146 CC module/keyring/linux/keyring.o 00:56:18.146 CC module/accel/ioat/accel_ioat.o 00:56:18.146 CC module/sock/posix/posix.o 00:56:18.146 CC module/blob/bdev/blob_bdev.o 00:56:18.146 CC module/keyring/file/keyring.o 00:56:18.146 CC module/scheduler/dynamic/scheduler_dynamic.o 00:56:18.146 CC module/accel/dsa/accel_dsa.o 00:56:18.146 LIB libspdk_env_dpdk_rpc.a 00:56:18.146 CC module/keyring/file/keyring_rpc.o 00:56:18.146 CC module/accel/error/accel_error_rpc.o 00:56:18.146 CC module/keyring/linux/keyring_rpc.o 00:56:18.146 CC module/accel/iaa/accel_iaa_rpc.o 00:56:18.146 LIB libspdk_scheduler_dynamic.a 00:56:18.146 CC module/accel/dsa/accel_dsa_rpc.o 00:56:18.146 CC module/accel/ioat/accel_ioat_rpc.o 00:56:18.146 LIB libspdk_blob_bdev.a 00:56:18.405 LIB libspdk_keyring_file.a 00:56:18.405 LIB libspdk_accel_error.a 00:56:18.405 LIB libspdk_keyring_linux.a 00:56:18.405 LIB libspdk_accel_dsa.a 00:56:18.405 LIB libspdk_accel_ioat.a 00:56:18.405 LIB libspdk_accel_iaa.a 00:56:18.405 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:56:18.405 LIB libspdk_sock_posix.a 00:56:18.405 CC module/scheduler/gscheduler/gscheduler.o 00:56:18.663 LIB libspdk_scheduler_dpdk_governor.a 00:56:18.663 CC module/bdev/delay/vbdev_delay.o 00:56:18.663 CC module/bdev/gpt/gpt.o 00:56:18.663 CC module/bdev/malloc/bdev_malloc.o 00:56:18.663 CC module/bdev/lvol/vbdev_lvol.o 00:56:18.663 CC module/bdev/null/bdev_null.o
00:56:18.663 CC module/bdev/error/vbdev_error.o 00:56:18.663 LIB libspdk_scheduler_gscheduler.a 00:56:18.663 CC module/blobfs/bdev/blobfs_bdev.o 00:56:18.663 CC module/bdev/nvme/bdev_nvme.o 00:56:18.663 CC module/bdev/gpt/vbdev_gpt.o 00:56:18.663 CC module/bdev/error/vbdev_error_rpc.o 00:56:18.663 CC module/bdev/null/bdev_null_rpc.o 00:56:18.663 CC module/bdev/malloc/bdev_malloc_rpc.o 00:56:18.663 CC module/bdev/delay/vbdev_delay_rpc.o 00:56:18.922 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:56:18.922 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:56:18.922 CC module/bdev/passthru/vbdev_passthru.o 00:56:18.922 LIB libspdk_bdev_gpt.a 00:56:18.922 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:56:18.922 LIB libspdk_bdev_error.a 00:56:18.922 CC module/bdev/nvme/bdev_nvme_rpc.o 00:56:18.922 LIB libspdk_bdev_null.a 00:56:18.922 LIB libspdk_bdev_delay.a 00:56:18.922 LIB libspdk_bdev_malloc.a 00:56:18.922 CC module/bdev/nvme/nvme_rpc.o 00:56:18.922 LIB libspdk_blobfs_bdev.a 00:56:18.922 LIB libspdk_bdev_passthru.a 00:56:18.922 CC module/bdev/nvme/bdev_mdns_client.o 00:56:18.922 LIB libspdk_bdev_lvol.a 00:56:19.182 CC module/bdev/raid/bdev_raid.o 00:56:19.182 CC module/bdev/raid/bdev_raid_rpc.o 00:56:19.182 CC module/bdev/split/vbdev_split.o 00:56:19.182 CC module/bdev/split/vbdev_split_rpc.o 00:56:19.182 CC module/bdev/zone_block/vbdev_zone_block.o 00:56:19.182 CC module/bdev/ftl/bdev_ftl.o 00:56:19.182 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:56:19.182 CC module/bdev/aio/bdev_aio.o 00:56:19.182 CC module/bdev/nvme/vbdev_opal.o 00:56:19.182 CC module/bdev/aio/bdev_aio_rpc.o 00:56:19.182 LIB libspdk_bdev_split.a 00:56:19.441 CC module/bdev/nvme/vbdev_opal_rpc.o 00:56:19.441 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:56:19.441 CC module/bdev/iscsi/bdev_iscsi.o 00:56:19.441 LIB libspdk_bdev_zone_block.a 00:56:19.441 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:56:19.441 CC module/bdev/raid/bdev_raid_sb.o 00:56:19.441 CC module/bdev/raid/raid0.o 00:56:19.441 CC module/bdev/ftl/bdev_ftl_rpc.o 00:56:19.441 CC module/bdev/raid/raid1.o 00:56:19.441 LIB libspdk_bdev_aio.a 00:56:19.441 CC module/bdev/raid/concat.o 00:56:19.441 LIB libspdk_bdev_nvme.a 00:56:19.441 LIB libspdk_bdev_ftl.a 00:56:19.441 LIB libspdk_bdev_iscsi.a 00:56:19.441 CC module/bdev/virtio/bdev_virtio_scsi.o 00:56:19.441 CC module/bdev/virtio/bdev_virtio_blk.o 00:56:19.441 CC module/bdev/virtio/bdev_virtio_rpc.o 00:56:19.700 LIB libspdk_bdev_raid.a 00:56:19.700 LIB libspdk_bdev_virtio.a 00:56:20.268 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:56:20.268 CC module/event/subsystems/iobuf/iobuf.o 00:56:20.268 CC module/event/subsystems/scheduler/scheduler.o 00:56:20.268 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:56:20.268 CC module/event/subsystems/vmd/vmd.o 00:56:20.268 CC module/event/subsystems/sock/sock.o 00:56:20.268 CC module/event/subsystems/keyring/keyring.o 00:56:20.268 CC module/event/subsystems/vmd/vmd_rpc.o 00:56:20.527 LIB libspdk_event_sock.a 00:56:20.527 LIB libspdk_event_scheduler.a 00:56:20.527 LIB libspdk_event_vhost_blk.a 00:56:20.527 LIB libspdk_event_keyring.a 00:56:20.527 LIB libspdk_event_vmd.a 00:56:20.527 LIB libspdk_event_iobuf.a 00:56:20.786 CC module/event/subsystems/accel/accel.o 00:56:21.045 LIB libspdk_event_accel.a 00:56:21.304 CC module/event/subsystems/bdev/bdev.o 00:56:21.563 LIB libspdk_event_bdev.a 00:56:21.822 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:56:21.822 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:56:21.822 CC module/event/subsystems/nbd/nbd.o 00:56:21.822 CC module/event/subsystems/scsi/scsi.o
00:56:22.081 LIB libspdk_event_nbd.a 00:56:22.081 LIB libspdk_event_scsi.a 00:56:22.081 LIB libspdk_event_nvmf.a 00:56:22.383 CC module/event/subsystems/iscsi/iscsi.o 00:56:22.383 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:56:22.646 LIB libspdk_event_vhost_scsi.a 00:56:22.646 LIB libspdk_event_iscsi.a 00:56:22.906 CXX app/trace/trace.o 00:56:22.906 TEST_HEADER include/spdk/accel.h 00:56:22.906 TEST_HEADER include/spdk/accel_module.h 00:56:22.906 TEST_HEADER include/spdk/assert.h 00:56:22.906 TEST_HEADER include/spdk/barrier.h 00:56:22.906 TEST_HEADER include/spdk/base64.h 00:56:22.906 TEST_HEADER include/spdk/bdev.h 00:56:22.906 TEST_HEADER include/spdk/bdev_module.h 00:56:22.906 TEST_HEADER include/spdk/bdev_zone.h 00:56:22.906 TEST_HEADER include/spdk/bit_array.h 00:56:22.906 TEST_HEADER include/spdk/bit_pool.h 00:56:22.906 TEST_HEADER include/spdk/blob.h 00:56:22.906 TEST_HEADER include/spdk/blob_bdev.h 00:56:22.906 TEST_HEADER include/spdk/blobfs.h 00:56:22.906 TEST_HEADER include/spdk/blobfs_bdev.h 00:56:22.906 TEST_HEADER include/spdk/conf.h 00:56:22.906 TEST_HEADER include/spdk/config.h 00:56:22.906 TEST_HEADER include/spdk/cpuset.h 00:56:22.906 TEST_HEADER include/spdk/crc16.h 00:56:22.906 TEST_HEADER include/spdk/crc32.h 00:56:22.906 TEST_HEADER include/spdk/crc64.h 00:56:22.906 TEST_HEADER include/spdk/dif.h 00:56:22.906 TEST_HEADER include/spdk/dma.h 00:56:22.906 TEST_HEADER include/spdk/endian.h 00:56:22.906 TEST_HEADER include/spdk/env.h 00:56:22.906 TEST_HEADER include/spdk/env_dpdk.h 00:56:22.906 TEST_HEADER include/spdk/event.h 00:56:22.906 TEST_HEADER include/spdk/fd.h 00:56:22.906 TEST_HEADER include/spdk/fd_group.h 00:56:22.906 TEST_HEADER include/spdk/file.h 00:56:22.906 TEST_HEADER include/spdk/ftl.h 00:56:22.906 TEST_HEADER include/spdk/gpt_spec.h 00:56:22.906 TEST_HEADER include/spdk/hexlify.h 00:56:22.906 TEST_HEADER include/spdk/histogram_data.h 00:56:22.906 TEST_HEADER include/spdk/idxd.h 00:56:22.906 TEST_HEADER include/spdk/idxd_spec.h 00:56:22.906 CC test/event/event_perf/event_perf.o 00:56:22.906 TEST_HEADER include/spdk/init.h 00:56:22.906 CC examples/accel/perf/accel_perf.o 00:56:22.906 TEST_HEADER include/spdk/ioat.h 00:56:22.906 TEST_HEADER include/spdk/ioat_spec.h 00:56:23.165 TEST_HEADER include/spdk/iscsi_spec.h 00:56:23.165 TEST_HEADER include/spdk/json.h 00:56:23.165 TEST_HEADER include/spdk/jsonrpc.h 00:56:23.165 CC test/accel/dif/dif.o 00:56:23.165 TEST_HEADER include/spdk/keyring.h 00:56:23.165 TEST_HEADER include/spdk/keyring_module.h 00:56:23.165 CC test/dma/test_dma/test_dma.o 00:56:23.165 TEST_HEADER include/spdk/likely.h 00:56:23.165 TEST_HEADER include/spdk/log.h 00:56:23.165 TEST_HEADER include/spdk/lvol.h 00:56:23.165 TEST_HEADER include/spdk/memory.h 00:56:23.165 CC test/app/bdev_svc/bdev_svc.o 00:56:23.165 CC test/blobfs/mkfs/mkfs.o 00:56:23.165 TEST_HEADER include/spdk/mmio.h 00:56:23.165 TEST_HEADER include/spdk/nbd.h 00:56:23.165 TEST_HEADER include/spdk/notify.h 00:56:23.165 CC test/bdev/bdevio/bdevio.o 00:56:23.165 TEST_HEADER include/spdk/nvme.h 00:56:23.165 TEST_HEADER include/spdk/nvme_intel.h 00:56:23.165 TEST_HEADER include/spdk/nvme_ocssd.h 00:56:23.165 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:56:23.165 TEST_HEADER include/spdk/nvme_spec.h 00:56:23.165 TEST_HEADER include/spdk/nvme_zns.h 00:56:23.165 TEST_HEADER include/spdk/nvmf.h 00:56:23.165 TEST_HEADER include/spdk/nvmf_cmd.h 00:56:23.165 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:56:23.165 TEST_HEADER include/spdk/nvmf_spec.h
00:56:23.165 TEST_HEADER include/spdk/nvmf_transport.h 00:56:23.165 TEST_HEADER include/spdk/opal.h 00:56:23.165 TEST_HEADER include/spdk/opal_spec.h 00:56:23.165 TEST_HEADER include/spdk/pci_ids.h 00:56:23.165 TEST_HEADER include/spdk/pipe.h 00:56:23.165 TEST_HEADER include/spdk/queue.h 00:56:23.165 TEST_HEADER include/spdk/reduce.h 00:56:23.165 TEST_HEADER include/spdk/rpc.h 00:56:23.165 TEST_HEADER include/spdk/scheduler.h 00:56:23.165 TEST_HEADER include/spdk/scsi.h 00:56:23.165 TEST_HEADER include/spdk/scsi_spec.h 00:56:23.165 TEST_HEADER include/spdk/sock.h 00:56:23.165 TEST_HEADER include/spdk/stdinc.h 00:56:23.165 TEST_HEADER include/spdk/string.h 00:56:23.165 TEST_HEADER include/spdk/thread.h 00:56:23.165 TEST_HEADER include/spdk/trace.h 00:56:23.165 TEST_HEADER include/spdk/trace_parser.h 00:56:23.165 TEST_HEADER include/spdk/tree.h 00:56:23.165 LINK event_perf 00:56:23.165 TEST_HEADER include/spdk/ublk.h 00:56:23.165 CC test/env/mem_callbacks/mem_callbacks.o 00:56:23.165 TEST_HEADER include/spdk/util.h 00:56:23.165 TEST_HEADER include/spdk/uuid.h 00:56:23.165 TEST_HEADER include/spdk/version.h 00:56:23.165 TEST_HEADER include/spdk/vfio_user_pci.h 00:56:23.165 TEST_HEADER include/spdk/vfio_user_spec.h 00:56:23.165 TEST_HEADER include/spdk/vhost.h 00:56:23.165 TEST_HEADER include/spdk/vmd.h 00:56:23.165 TEST_HEADER include/spdk/xor.h 00:56:23.165 TEST_HEADER include/spdk/zipf.h 00:56:23.165 CXX test/cpp_headers/accel.o 00:56:23.165 LINK accel_perf 00:56:23.165 LINK bdev_svc 00:56:23.165 LINK spdk_trace 00:56:23.165 LINK mkfs 00:56:23.165 LINK dif 00:56:23.425 LINK mem_callbacks 00:56:23.425 LINK bdevio 00:56:23.425 LINK test_dma 00:56:23.425 CXX test/cpp_headers/accel_module.o 00:56:23.683 CXX test/cpp_headers/assert.o 00:56:23.941 CXX test/cpp_headers/barrier.o 00:56:24.508 CXX test/cpp_headers/base64.o 00:56:24.766 CXX test/cpp_headers/bdev.o 00:56:25.025 CC test/env/vtophys/vtophys.o 00:56:25.025 CXX test/cpp_headers/bdev_module.o 00:56:25.591 LINK vtophys 00:56:25.591 CXX test/cpp_headers/bdev_zone.o 00:56:25.850 CXX test/cpp_headers/bit_array.o 00:56:26.416 CXX test/cpp_headers/bit_pool.o 00:56:26.675 CXX test/cpp_headers/blob.o 00:56:27.259 CXX test/cpp_headers/blob_bdev.o 00:56:27.527 CC app/trace_record/trace_record.o 00:56:27.786 CXX test/cpp_headers/blobfs.o 00:56:28.045 LINK spdk_trace_record 00:56:28.614 CXX test/cpp_headers/blobfs_bdev.o 00:56:29.990 CXX test/cpp_headers/conf.o 00:56:29.990 CC test/event/reactor/reactor.o 00:56:30.557 CXX test/cpp_headers/config.o 00:56:30.557 LINK reactor 00:56:30.814 CXX test/cpp_headers/cpuset.o 00:56:31.751 CC examples/bdev/hello_world/hello_bdev.o 00:56:31.751 CXX test/cpp_headers/crc16.o 00:56:32.688 LINK hello_bdev 00:56:32.946 CXX test/cpp_headers/crc32.o 00:56:33.880 CXX test/cpp_headers/crc64.o 00:56:34.818 CXX test/cpp_headers/dif.o 00:56:35.387 CC app/nvmf_tgt/nvmf_main.o 00:56:35.646 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:56:36.215 CXX test/cpp_headers/dma.o 00:56:36.474 LINK env_dpdk_post_init 00:56:36.474 LINK nvmf_tgt 00:56:37.042 CXX test/cpp_headers/endian.o 00:56:38.017 CXX test/cpp_headers/env.o 00:56:39.396 CXX test/cpp_headers/env_dpdk.o 00:56:40.334 CXX test/cpp_headers/event.o 00:56:41.713 CXX test/cpp_headers/fd.o 00:56:42.650 CXX test/cpp_headers/fd_group.o 00:56:44.026 CXX test/cpp_headers/file.o 00:56:45.403 CXX test/cpp_headers/ftl.o 00:56:46.339 CC test/event/reactor_perf/reactor_perf.o 00:56:46.597 CXX test/cpp_headers/gpt_spec.o 00:56:47.163 LINK reactor_perf 00:56:48.110 CXX test/cpp_headers/hexlify.o
00:56:49.488 CXX test/cpp_headers/histogram_data.o 00:56:50.866 CXX test/cpp_headers/idxd.o 00:56:52.246 CXX test/cpp_headers/idxd_spec.o 00:56:53.622 CXX test/cpp_headers/init.o 00:56:54.559 CXX test/cpp_headers/ioat.o 00:56:56.462 CXX test/cpp_headers/ioat_spec.o 00:56:57.864 CXX test/cpp_headers/iscsi_spec.o 00:56:59.241 CXX test/cpp_headers/json.o 00:57:00.618 CXX test/cpp_headers/jsonrpc.o 00:57:01.995 CXX test/cpp_headers/keyring.o 00:57:03.423 CXX test/cpp_headers/keyring_module.o 00:57:04.800 CXX test/cpp_headers/likely.o 00:57:06.176 CXX test/cpp_headers/log.o 00:57:06.742 CC test/event/app_repeat/app_repeat.o 00:57:07.758 CXX test/cpp_headers/lvol.o 00:57:07.758 LINK app_repeat 00:57:09.133 CXX test/cpp_headers/memory.o 00:57:09.133 CC test/env/memory/memory_ut.o 00:57:10.510 CXX test/cpp_headers/mmio.o 00:57:11.447 LINK memory_ut 00:57:11.706 CXX test/cpp_headers/nbd.o 00:57:11.964 CXX test/cpp_headers/notify.o 00:57:13.340 CXX test/cpp_headers/nvme.o 00:57:15.245 CXX test/cpp_headers/nvme_intel.o 00:57:16.182 CXX test/cpp_headers/nvme_ocssd.o 00:57:17.561 CXX test/cpp_headers/nvme_ocssd_spec.o 00:57:18.500 CXX test/cpp_headers/nvme_spec.o 00:57:19.437 CC test/env/pci/pci_ut.o 00:57:19.437 CXX test/cpp_headers/nvme_zns.o 00:57:20.002 LINK pci_ut 00:57:20.261 CXX test/cpp_headers/nvmf.o 00:57:20.827 CXX test/cpp_headers/nvmf_cmd.o 00:57:21.763 CXX test/cpp_headers/nvmf_fc_spec.o 00:57:22.698 CXX test/cpp_headers/nvmf_spec.o 00:57:22.698 CXX test/cpp_headers/nvmf_transport.o 00:57:23.266 CC test/event/scheduler/scheduler.o 00:57:23.266 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:57:23.266 CXX test/cpp_headers/opal.o 00:57:23.525 CXX test/cpp_headers/opal_spec.o 00:57:23.783 LINK nvme_fuzz 00:57:23.783 LINK scheduler 00:57:24.042 CXX test/cpp_headers/pci_ids.o 00:57:24.609 CXX test/cpp_headers/pipe.o 00:57:24.609 CC test/lvol/esnap/esnap.o 00:57:24.867 CC app/iscsi_tgt/iscsi_tgt.o 00:57:24.867 CC app/spdk_tgt/spdk_tgt.o 00:57:24.867 CC test/nvme/aer/aer.o 00:57:24.867 CXX test/cpp_headers/queue.o 00:57:25.127 CXX test/cpp_headers/reduce.o 00:57:25.127 LINK iscsi_tgt 00:57:25.385 LINK spdk_tgt 00:57:25.385 CXX test/cpp_headers/rpc.o 00:57:25.643 LINK aer 00:57:25.901 CXX test/cpp_headers/scheduler.o 00:57:26.468 CXX test/cpp_headers/scsi.o 00:57:26.726 CC app/spdk_lspci/spdk_lspci.o 00:57:26.985 CXX test/cpp_headers/scsi_spec.o 00:57:27.243 LINK spdk_lspci 00:57:27.501 CXX test/cpp_headers/sock.o 00:57:28.128 CXX test/cpp_headers/stdinc.o 00:57:29.064 CXX test/cpp_headers/string.o 00:57:29.064 LINK esnap 00:57:29.630 CXX test/cpp_headers/thread.o 00:57:30.568 CXX test/cpp_headers/trace.o 00:57:31.505 CXX test/cpp_headers/trace_parser.o 00:57:32.914 CXX test/cpp_headers/tree.o 00:57:32.914 CXX test/cpp_headers/ublk.o 00:57:34.303 CXX test/cpp_headers/util.o 00:57:35.240 CXX test/cpp_headers/uuid.o 00:57:35.807 CC examples/bdev/bdevperf/bdevperf.o 00:57:36.065 CXX test/cpp_headers/version.o 00:57:36.330 CXX test/cpp_headers/vfio_user_pci.o 00:57:37.706 CXX test/cpp_headers/vfio_user_spec.o 00:57:37.963 LINK bdevperf 00:57:38.526 CXX test/cpp_headers/vhost.o 00:57:39.899 CXX test/cpp_headers/vmd.o 00:57:40.465 CXX test/cpp_headers/xor.o 00:57:41.399 CXX test/cpp_headers/zipf.o 00:57:42.343 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:57:42.607 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:57:42.607 CC test/nvme/reset/reset.o 00:57:43.173 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:57:43.173 LINK reset 00:57:43.738 LINK vhost_fuzz 00:57:44.673 LINK iscsi_fuzz
00:57:46.044 CC app/spdk_nvme_perf/perf.o 00:57:46.044 CC test/rpc_client/rpc_client_test.o 00:57:46.978 LINK rpc_client_test 00:57:48.349 LINK spdk_nvme_perf 00:57:50.889 CC test/thread/poller_perf/poller_perf.o 00:57:52.269 LINK poller_perf 00:58:04.473 CC test/thread/lock/spdk_lock.o 00:58:04.473 CC app/spdk_nvme_identify/identify.o 00:58:05.855 LINK spdk_lock 00:58:06.421 LINK spdk_nvme_identify 00:58:06.680 CC test/nvme/sgl/sgl.o 00:58:07.253 CC test/app/histogram_perf/histogram_perf.o 00:58:07.513 LINK sgl 00:58:07.771 LINK histogram_perf 00:58:09.820 CC test/nvme/e2edp/nvme_dp.o 00:58:10.755 LINK nvme_dp 00:58:12.656 CC test/nvme/overhead/overhead.o 00:58:13.593 LINK overhead 00:58:21.731 CC test/app/jsoncat/jsoncat.o 00:58:21.731 CC app/spdk_nvme_discover/discovery_aer.o 00:58:21.989 LINK jsoncat 00:58:22.925 LINK spdk_nvme_discover 00:58:23.492 CC app/spdk_top/spdk_top.o 00:58:24.865 LINK spdk_top 00:58:29.053 CC test/nvme/err_injection/err_injection.o 00:58:29.053 LINK err_injection 00:58:29.985 CC app/vhost/vhost.o 00:58:30.243 CC test/app/stub/stub.o 00:58:30.243 LINK vhost 00:58:30.501 CC test/nvme/startup/startup.o 00:58:30.757 LINK stub 00:58:30.757 CC test/nvme/reserve/reserve.o 00:58:31.014 LINK startup 00:58:31.272 LINK reserve 00:58:31.272 CC test/nvme/simple_copy/simple_copy.o 00:58:31.836 LINK simple_copy 00:58:33.210 CC test/nvme/connect_stress/connect_stress.o 00:58:33.467 CC examples/blob/hello_world/hello_blob.o 00:58:33.467 CC examples/blob/cli/blobcli.o 00:58:33.724 LINK connect_stress 00:58:34.289 LINK hello_blob 00:58:34.547 LINK blobcli 00:58:35.918 CC test/nvme/boot_partition/boot_partition.o 00:58:36.484 LINK boot_partition 00:58:46.483 CC test/nvme/compliance/nvme_compliance.o 00:58:46.483 LINK nvme_compliance 00:58:49.770 CC app/spdk_dd/spdk_dd.o 00:58:50.028 CC app/fio/nvme/fio_plugin.o 00:58:50.287 CC app/fio/bdev/fio_plugin.o 00:58:50.287 LINK spdk_dd 00:58:50.879 CC test/nvme/fused_ordering/fused_ordering.o 00:58:51.448 LINK spdk_nvme 00:58:51.448 LINK spdk_bdev 00:58:51.707 LINK fused_ordering 00:58:51.965 CC test/nvme/doorbell_aers/doorbell_aers.o 00:58:52.224 CC test/nvme/fdp/fdp.o 00:58:52.793 LINK doorbell_aers 00:58:53.052 LINK fdp 00:59:11.142 CC test/nvme/cuse/cuse.o 00:59:14.430 LINK cuse 00:59:16.967 CC examples/ioat/perf/perf.o 00:59:17.226 CC examples/nvme/hello_world/hello_world.o 00:59:17.794 LINK ioat_perf 00:59:18.053 LINK hello_world 00:59:18.989 CC examples/nvme/reconnect/reconnect.o 00:59:20.363 LINK reconnect 00:59:20.622 CC examples/nvme/nvme_manage/nvme_manage.o 00:59:22.011 LINK nvme_manage 00:59:23.917 CC examples/nvme/arbitration/arbitration.o 00:59:25.294 LINK arbitration 00:59:30.563 CC examples/ioat/verify/verify.o 00:59:30.563 LINK verify 00:59:30.563 CC examples/sock/hello_world/hello_sock.o 00:59:30.821 CC examples/nvme/hotplug/hotplug.o 00:59:31.755 LINK hotplug 00:59:31.755 LINK hello_sock 00:59:37.024 CC examples/vmd/lsvmd/lsvmd.o 00:59:37.024 LINK lsvmd 00:59:38.935 CC examples/nvmf/nvmf/nvmf.o 00:59:39.872 LINK nvmf 00:59:40.809 CC examples/util/zipf/zipf.o 00:59:41.377 LINK zipf 00:59:42.312 CC examples/interrupt_tgt/interrupt_tgt.o 00:59:42.312 CC examples/thread/thread/thread_ex.o 00:59:42.312 CC examples/idxd/perf/perf.o 00:59:42.570 LINK interrupt_tgt 00:59:42.842 LINK thread 00:59:42.842 LINK idxd_perf 00:59:43.413 CC examples/nvme/cmb_copy/cmb_copy.o 00:59:43.413 CC examples/nvme/abort/abort.o 00:59:43.672 LINK cmb_copy 00:59:43.931 LINK abort 00:59:44.190 CC examples/vmd/led/led.o 00:59:44.758 LINK led
00:59:46.792 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:59:47.730 LINK pmr_persistence 01:00:14.317 13:02:35 -- spdk/autopackage.sh@44 -- $ make -j10 clean 01:00:14.317 make[1]: Nothing to be done for 'clean'. 01:00:20.881 13:02:43 -- spdk/autopackage.sh@46 -- $ timing_exit build_release 01:00:20.881 13:02:43 -- common/autotest_common.sh@729 -- $ xtrace_disable 01:00:20.881 13:02:43 -- common/autotest_common.sh@10 -- $ set +x 01:00:20.881 13:02:43 -- spdk/autopackage.sh@48 -- $ timing_finish 01:00:20.881 13:02:43 -- common/autotest_common.sh@735 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 01:00:20.881 13:02:43 -- common/autotest_common.sh@736 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 01:00:20.881 13:02:43 -- common/autotest_common.sh@738 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /home/vagrant/spdk_repo/spdk/../output/timing.txt 01:00:20.881 13:02:43 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 01:00:20.881 13:02:43 -- pm/common@29 -- $ signal_monitor_resources TERM 01:00:20.881 13:02:43 -- pm/common@40 -- $ local monitor pid pids signal=TERM 01:00:20.881 13:02:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 01:00:20.881 13:02:43 -- pm/common@43 -- $ [[ -e /home/vagrant/spdk_repo/spdk/../output/power/collect-cpu-load.pid ]] 01:00:20.881 13:02:43 -- pm/common@44 -- $ pid=236757 01:00:20.881 13:02:43 -- pm/common@50 -- $ kill -TERM 236757 01:00:20.881 13:02:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 01:00:20.881 13:02:43 -- pm/common@43 -- $ [[ -e /home/vagrant/spdk_repo/spdk/../output/power/collect-vmstat.pid ]] 01:00:20.881 13:02:43 -- pm/common@44 -- $ pid=236759 01:00:20.881 13:02:43 -- pm/common@50 -- $ kill -TERM 236759 01:00:20.881 + [[ -n 8955 ]] 01:00:20.881 + sudo kill 8955 01:00:20.881 sudo: /etc/sudoers.d/99-spdk-rlimits:1:23: unknown defaults entry "rlimit_core" 01:00:20.901 [Pipeline] } 01:00:20.920 [Pipeline] // timeout 01:00:20.926 [Pipeline] } 01:00:20.944 [Pipeline] // stage 01:00:20.950 [Pipeline] } 01:00:20.967 [Pipeline] // catchError 01:00:20.977 [Pipeline] stage 01:00:20.979 [Pipeline] { (Stop VM) 01:00:20.992 [Pipeline] sh 01:00:21.275 + vagrant halt 01:00:25.466 ==> default: Halting domain... 01:00:40.352 [Pipeline] sh 01:00:40.631 + vagrant destroy -f 01:00:43.916 ==> default: Removing domain... 01:00:43.929 [Pipeline] sh 01:00:44.209 + mv output /var/jenkins/workspace/rocky9-vg-autotest/output 01:00:44.218 [Pipeline] } 01:00:44.237 [Pipeline] // stage 01:00:44.243 [Pipeline] } 01:00:44.261 [Pipeline] // dir 01:00:44.267 [Pipeline] } 01:00:44.284 [Pipeline] // wrap 01:00:44.295 [Pipeline] } 01:00:44.310 [Pipeline] // catchError 01:00:44.320 [Pipeline] stage 01:00:44.322 [Pipeline] { (Epilogue) 01:00:44.337 [Pipeline] sh 01:00:44.619 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 01:01:02.735 [Pipeline] catchError 01:01:02.737 [Pipeline] { 01:01:02.753 [Pipeline] sh 01:01:03.036 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 01:01:03.295 Artifacts sizes are good 01:01:03.305 [Pipeline] } 01:01:03.324 [Pipeline] // catchError 01:01:03.336 [Pipeline] archiveArtifacts 01:01:03.343 Archiving artifacts 01:01:03.732 [Pipeline] cleanWs 01:01:03.743 [WS-CLEANUP] Deleting project workspace... 01:01:03.743 [WS-CLEANUP] Deferred wipeout is used...
01:01:03.749 [WS-CLEANUP] done 01:01:03.751 [Pipeline] } 01:01:03.770 [Pipeline] // stage 01:01:03.775 [Pipeline] } 01:01:03.793 [Pipeline] // node 01:01:03.799 [Pipeline] End of Pipeline 01:01:03.833 Finished: SUCCESS